Nov 29 00:35:39 np0005539504 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 29 00:35:39 np0005539504 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 29 00:35:39 np0005539504 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:39 np0005539504 kernel: BIOS-provided physical RAM map:
Nov 29 00:35:39 np0005539504 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 29 00:35:39 np0005539504 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 29 00:35:39 np0005539504 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 29 00:35:39 np0005539504 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 29 00:35:39 np0005539504 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 29 00:35:39 np0005539504 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 29 00:35:39 np0005539504 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 29 00:35:39 np0005539504 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 29 00:35:39 np0005539504 kernel: NX (Execute Disable) protection: active
Nov 29 00:35:39 np0005539504 kernel: APIC: Static calls initialized
Nov 29 00:35:39 np0005539504 kernel: SMBIOS 2.8 present.
Nov 29 00:35:39 np0005539504 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 29 00:35:39 np0005539504 kernel: Hypervisor detected: KVM
Nov 29 00:35:39 np0005539504 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 29 00:35:39 np0005539504 kernel: kvm-clock: using sched offset of 3543960198 cycles
Nov 29 00:35:39 np0005539504 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 29 00:35:39 np0005539504 kernel: tsc: Detected 2799.998 MHz processor
Nov 29 00:35:39 np0005539504 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 29 00:35:39 np0005539504 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 29 00:35:39 np0005539504 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 29 00:35:39 np0005539504 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 29 00:35:39 np0005539504 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 29 00:35:39 np0005539504 kernel: Using GB pages for direct mapping
Nov 29 00:35:39 np0005539504 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 29 00:35:39 np0005539504 kernel: ACPI: Early table checksum verification disabled
Nov 29 00:35:39 np0005539504 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 29 00:35:39 np0005539504 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:39 np0005539504 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:39 np0005539504 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:39 np0005539504 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 29 00:35:39 np0005539504 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:39 np0005539504 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:39 np0005539504 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 29 00:35:39 np0005539504 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 29 00:35:39 np0005539504 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 29 00:35:39 np0005539504 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 29 00:35:39 np0005539504 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 29 00:35:39 np0005539504 kernel: No NUMA configuration found
Nov 29 00:35:39 np0005539504 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 29 00:35:39 np0005539504 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 29 00:35:39 np0005539504 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 29 00:35:39 np0005539504 kernel: Zone ranges:
Nov 29 00:35:39 np0005539504 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 29 00:35:39 np0005539504 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 29 00:35:39 np0005539504 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 00:35:39 np0005539504 kernel:  Device   empty
Nov 29 00:35:39 np0005539504 kernel: Movable zone start for each node
Nov 29 00:35:39 np0005539504 kernel: Early memory node ranges
Nov 29 00:35:39 np0005539504 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 29 00:35:39 np0005539504 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 29 00:35:39 np0005539504 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 00:35:39 np0005539504 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 29 00:35:39 np0005539504 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 29 00:35:39 np0005539504 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 29 00:35:39 np0005539504 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 29 00:35:39 np0005539504 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 29 00:35:39 np0005539504 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 29 00:35:39 np0005539504 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 29 00:35:39 np0005539504 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 29 00:35:39 np0005539504 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 29 00:35:39 np0005539504 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 29 00:35:39 np0005539504 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 29 00:35:39 np0005539504 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 29 00:35:39 np0005539504 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 29 00:35:39 np0005539504 kernel: TSC deadline timer available
Nov 29 00:35:39 np0005539504 kernel: CPU topo: Max. logical packages:   8
Nov 29 00:35:39 np0005539504 kernel: CPU topo: Max. logical dies:       8
Nov 29 00:35:39 np0005539504 kernel: CPU topo: Max. dies per package:   1
Nov 29 00:35:39 np0005539504 kernel: CPU topo: Max. threads per core:   1
Nov 29 00:35:39 np0005539504 kernel: CPU topo: Num. cores per package:     1
Nov 29 00:35:39 np0005539504 kernel: CPU topo: Num. threads per package:   1
Nov 29 00:35:39 np0005539504 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 29 00:35:39 np0005539504 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 29 00:35:39 np0005539504 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 29 00:35:39 np0005539504 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 29 00:35:39 np0005539504 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 29 00:35:39 np0005539504 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 29 00:35:39 np0005539504 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 29 00:35:39 np0005539504 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 29 00:35:39 np0005539504 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 29 00:35:39 np0005539504 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 29 00:35:39 np0005539504 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 29 00:35:39 np0005539504 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 29 00:35:39 np0005539504 kernel: Booting paravirtualized kernel on KVM
Nov 29 00:35:39 np0005539504 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 29 00:35:39 np0005539504 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 29 00:35:39 np0005539504 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 29 00:35:39 np0005539504 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 29 00:35:39 np0005539504 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:39 np0005539504 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 29 00:35:39 np0005539504 kernel: random: crng init done
Nov 29 00:35:39 np0005539504 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: Fallback order for Node 0: 0 
Nov 29 00:35:39 np0005539504 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 29 00:35:39 np0005539504 kernel: Policy zone: Normal
Nov 29 00:35:39 np0005539504 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 29 00:35:39 np0005539504 kernel: software IO TLB: area num 8.
Nov 29 00:35:39 np0005539504 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 29 00:35:39 np0005539504 kernel: ftrace: allocating 49313 entries in 193 pages
Nov 29 00:35:39 np0005539504 kernel: ftrace: allocated 193 pages with 3 groups
Nov 29 00:35:39 np0005539504 kernel: Dynamic Preempt: voluntary
Nov 29 00:35:39 np0005539504 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 29 00:35:39 np0005539504 kernel: rcu: 	RCU event tracing is enabled.
Nov 29 00:35:39 np0005539504 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 29 00:35:39 np0005539504 kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 29 00:35:39 np0005539504 kernel: 	Rude variant of Tasks RCU enabled.
Nov 29 00:35:39 np0005539504 kernel: 	Tracing variant of Tasks RCU enabled.
Nov 29 00:35:39 np0005539504 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 29 00:35:39 np0005539504 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 29 00:35:39 np0005539504 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:39 np0005539504 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:39 np0005539504 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:39 np0005539504 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 29 00:35:39 np0005539504 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 29 00:35:39 np0005539504 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 29 00:35:39 np0005539504 kernel: Console: colour VGA+ 80x25
Nov 29 00:35:39 np0005539504 kernel: printk: console [ttyS0] enabled
Nov 29 00:35:39 np0005539504 kernel: ACPI: Core revision 20230331
Nov 29 00:35:39 np0005539504 kernel: APIC: Switch to symmetric I/O mode setup
Nov 29 00:35:39 np0005539504 kernel: x2apic enabled
Nov 29 00:35:39 np0005539504 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 29 00:35:39 np0005539504 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 29 00:35:39 np0005539504 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 29 00:35:39 np0005539504 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 29 00:35:39 np0005539504 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 29 00:35:39 np0005539504 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 29 00:35:39 np0005539504 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 29 00:35:39 np0005539504 kernel: Spectre V2 : Mitigation: Retpolines
Nov 29 00:35:39 np0005539504 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 29 00:35:39 np0005539504 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 29 00:35:39 np0005539504 kernel: RETBleed: Mitigation: untrained return thunk
Nov 29 00:35:39 np0005539504 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 29 00:35:39 np0005539504 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 29 00:35:39 np0005539504 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 29 00:35:39 np0005539504 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 29 00:35:39 np0005539504 kernel: x86/bugs: return thunk changed
Nov 29 00:35:39 np0005539504 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 29 00:35:39 np0005539504 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 29 00:35:39 np0005539504 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 29 00:35:39 np0005539504 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 29 00:35:39 np0005539504 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 29 00:35:39 np0005539504 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 29 00:35:39 np0005539504 kernel: Freeing SMP alternatives memory: 40K
Nov 29 00:35:39 np0005539504 kernel: pid_max: default: 32768 minimum: 301
Nov 29 00:35:39 np0005539504 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 29 00:35:39 np0005539504 kernel: landlock: Up and running.
Nov 29 00:35:39 np0005539504 kernel: Yama: becoming mindful.
Nov 29 00:35:39 np0005539504 kernel: SELinux:  Initializing.
Nov 29 00:35:39 np0005539504 kernel: LSM support for eBPF active
Nov 29 00:35:39 np0005539504 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 29 00:35:39 np0005539504 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 29 00:35:39 np0005539504 kernel: ... version:                0
Nov 29 00:35:39 np0005539504 kernel: ... bit width:              48
Nov 29 00:35:39 np0005539504 kernel: ... generic registers:      6
Nov 29 00:35:39 np0005539504 kernel: ... value mask:             0000ffffffffffff
Nov 29 00:35:39 np0005539504 kernel: ... max period:             00007fffffffffff
Nov 29 00:35:39 np0005539504 kernel: ... fixed-purpose events:   0
Nov 29 00:35:39 np0005539504 kernel: ... event mask:             000000000000003f
Nov 29 00:35:39 np0005539504 kernel: signal: max sigframe size: 1776
Nov 29 00:35:39 np0005539504 kernel: rcu: Hierarchical SRCU implementation.
Nov 29 00:35:39 np0005539504 kernel: rcu: 	Max phase no-delay instances is 400.
Nov 29 00:35:39 np0005539504 kernel: smp: Bringing up secondary CPUs ...
Nov 29 00:35:39 np0005539504 kernel: smpboot: x86: Booting SMP configuration:
Nov 29 00:35:39 np0005539504 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 29 00:35:39 np0005539504 kernel: smp: Brought up 1 node, 8 CPUs
Nov 29 00:35:39 np0005539504 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 29 00:35:39 np0005539504 kernel: node 0 deferred pages initialised in 11ms
Nov 29 00:35:39 np0005539504 kernel: Memory: 7765920K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616272K reserved, 0K cma-reserved)
Nov 29 00:35:39 np0005539504 kernel: devtmpfs: initialized
Nov 29 00:35:39 np0005539504 kernel: x86/mm: Memory block size: 128MB
Nov 29 00:35:39 np0005539504 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 29 00:35:39 np0005539504 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: pinctrl core: initialized pinctrl subsystem
Nov 29 00:35:39 np0005539504 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 29 00:35:39 np0005539504 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 29 00:35:39 np0005539504 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 29 00:35:39 np0005539504 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 29 00:35:39 np0005539504 kernel: audit: initializing netlink subsys (disabled)
Nov 29 00:35:39 np0005539504 kernel: audit: type=2000 audit(1764394537.381:1): state=initialized audit_enabled=0 res=1
Nov 29 00:35:39 np0005539504 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 29 00:35:39 np0005539504 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 29 00:35:39 np0005539504 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 29 00:35:39 np0005539504 kernel: cpuidle: using governor menu
Nov 29 00:35:39 np0005539504 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 29 00:35:39 np0005539504 kernel: PCI: Using configuration type 1 for base access
Nov 29 00:35:39 np0005539504 kernel: PCI: Using configuration type 1 for extended access
Nov 29 00:35:39 np0005539504 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 29 00:35:39 np0005539504 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 29 00:35:39 np0005539504 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 29 00:35:39 np0005539504 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 29 00:35:39 np0005539504 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 29 00:35:39 np0005539504 kernel: Demotion targets for Node 0: null
Nov 29 00:35:39 np0005539504 kernel: cryptd: max_cpu_qlen set to 1000
Nov 29 00:35:39 np0005539504 kernel: ACPI: Added _OSI(Module Device)
Nov 29 00:35:39 np0005539504 kernel: ACPI: Added _OSI(Processor Device)
Nov 29 00:35:39 np0005539504 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 29 00:35:39 np0005539504 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 29 00:35:39 np0005539504 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 29 00:35:39 np0005539504 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 29 00:35:39 np0005539504 kernel: ACPI: Interpreter enabled
Nov 29 00:35:39 np0005539504 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 29 00:35:39 np0005539504 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 29 00:35:39 np0005539504 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 29 00:35:39 np0005539504 kernel: PCI: Using E820 reservations for host bridge windows
Nov 29 00:35:39 np0005539504 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 29 00:35:39 np0005539504 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 29 00:35:39 np0005539504 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [3] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [4] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [5] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [6] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [7] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [8] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [9] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [10] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [11] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [12] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [13] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [14] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [15] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [16] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [17] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [18] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [19] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [20] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [21] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [22] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [23] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [24] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [25] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [26] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [27] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [28] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [29] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [30] registered
Nov 29 00:35:39 np0005539504 kernel: acpiphp: Slot [31] registered
Nov 29 00:35:39 np0005539504 kernel: PCI host bridge to bus 0000:00
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 29 00:35:39 np0005539504 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 29 00:35:39 np0005539504 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 29 00:35:39 np0005539504 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 29 00:35:39 np0005539504 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 29 00:35:39 np0005539504 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 29 00:35:39 np0005539504 kernel: iommu: Default domain type: Translated
Nov 29 00:35:39 np0005539504 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 29 00:35:39 np0005539504 kernel: SCSI subsystem initialized
Nov 29 00:35:39 np0005539504 kernel: ACPI: bus type USB registered
Nov 29 00:35:39 np0005539504 kernel: usbcore: registered new interface driver usbfs
Nov 29 00:35:39 np0005539504 kernel: usbcore: registered new interface driver hub
Nov 29 00:35:39 np0005539504 kernel: usbcore: registered new device driver usb
Nov 29 00:35:39 np0005539504 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 29 00:35:39 np0005539504 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 29 00:35:39 np0005539504 kernel: PTP clock support registered
Nov 29 00:35:39 np0005539504 kernel: EDAC MC: Ver: 3.0.0
Nov 29 00:35:39 np0005539504 kernel: NetLabel: Initializing
Nov 29 00:35:39 np0005539504 kernel: NetLabel:  domain hash size = 128
Nov 29 00:35:39 np0005539504 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 29 00:35:39 np0005539504 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 29 00:35:39 np0005539504 kernel: PCI: Using ACPI for IRQ routing
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 29 00:35:39 np0005539504 kernel: vgaarb: loaded
Nov 29 00:35:39 np0005539504 kernel: clocksource: Switched to clocksource kvm-clock
Nov 29 00:35:39 np0005539504 kernel: VFS: Disk quotas dquot_6.6.0
Nov 29 00:35:39 np0005539504 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 29 00:35:39 np0005539504 kernel: pnp: PnP ACPI init
Nov 29 00:35:39 np0005539504 kernel: pnp: PnP ACPI: found 5 devices
Nov 29 00:35:39 np0005539504 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 29 00:35:39 np0005539504 kernel: NET: Registered PF_INET protocol family
Nov 29 00:35:39 np0005539504 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 29 00:35:39 np0005539504 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 00:35:39 np0005539504 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 29 00:35:39 np0005539504 kernel: NET: Registered PF_XDP protocol family
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 29 00:35:39 np0005539504 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 29 00:35:39 np0005539504 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 29 00:35:39 np0005539504 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 74373 usecs
Nov 29 00:35:39 np0005539504 kernel: PCI: CLS 0 bytes, default 64
Nov 29 00:35:39 np0005539504 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 29 00:35:39 np0005539504 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 29 00:35:39 np0005539504 kernel: ACPI: bus type thunderbolt registered
Nov 29 00:35:39 np0005539504 kernel: Trying to unpack rootfs image as initramfs...
Nov 29 00:35:39 np0005539504 kernel: Initialise system trusted keyrings
Nov 29 00:35:39 np0005539504 kernel: Key type blacklist registered
Nov 29 00:35:39 np0005539504 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 29 00:35:39 np0005539504 kernel: zbud: loaded
Nov 29 00:35:39 np0005539504 kernel: integrity: Platform Keyring initialized
Nov 29 00:35:39 np0005539504 kernel: integrity: Machine keyring initialized
Nov 29 00:35:39 np0005539504 kernel: Freeing initrd memory: 85868K
Nov 29 00:35:39 np0005539504 kernel: NET: Registered PF_ALG protocol family
Nov 29 00:35:39 np0005539504 kernel: xor: automatically using best checksumming function   avx       
Nov 29 00:35:39 np0005539504 kernel: Key type asymmetric registered
Nov 29 00:35:39 np0005539504 kernel: Asymmetric key parser 'x509' registered
Nov 29 00:35:39 np0005539504 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 29 00:35:39 np0005539504 kernel: io scheduler mq-deadline registered
Nov 29 00:35:39 np0005539504 kernel: io scheduler kyber registered
Nov 29 00:35:39 np0005539504 kernel: io scheduler bfq registered
Nov 29 00:35:39 np0005539504 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 29 00:35:39 np0005539504 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 29 00:35:39 np0005539504 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 29 00:35:39 np0005539504 kernel: ACPI: button: Power Button [PWRF]
Nov 29 00:35:39 np0005539504 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 29 00:35:39 np0005539504 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 29 00:35:39 np0005539504 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 29 00:35:39 np0005539504 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 29 00:35:39 np0005539504 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 29 00:35:39 np0005539504 kernel: Non-volatile memory driver v1.3
Nov 29 00:35:39 np0005539504 kernel: rdac: device handler registered
Nov 29 00:35:39 np0005539504 kernel: hp_sw: device handler registered
Nov 29 00:35:39 np0005539504 kernel: emc: device handler registered
Nov 29 00:35:39 np0005539504 kernel: alua: device handler registered
Nov 29 00:35:39 np0005539504 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 29 00:35:39 np0005539504 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 29 00:35:39 np0005539504 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 29 00:35:39 np0005539504 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 29 00:35:39 np0005539504 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 29 00:35:39 np0005539504 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 29 00:35:39 np0005539504 kernel: usb usb1: Product: UHCI Host Controller
Nov 29 00:35:39 np0005539504 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 29 00:35:39 np0005539504 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 29 00:35:39 np0005539504 kernel: hub 1-0:1.0: USB hub found
Nov 29 00:35:39 np0005539504 kernel: hub 1-0:1.0: 2 ports detected
Nov 29 00:35:39 np0005539504 kernel: usbcore: registered new interface driver usbserial_generic
Nov 29 00:35:39 np0005539504 kernel: usbserial: USB Serial support registered for generic
Nov 29 00:35:39 np0005539504 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 29 00:35:39 np0005539504 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 29 00:35:39 np0005539504 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 29 00:35:39 np0005539504 kernel: mousedev: PS/2 mouse device common for all mice
Nov 29 00:35:39 np0005539504 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 29 00:35:39 np0005539504 kernel: rtc_cmos 00:04: registered as rtc0
Nov 29 00:35:39 np0005539504 kernel: rtc_cmos 00:04: setting system clock to 2025-11-29T05:35:38 UTC (1764394538)
Nov 29 00:35:39 np0005539504 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 29 00:35:39 np0005539504 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 29 00:35:39 np0005539504 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 29 00:35:39 np0005539504 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 29 00:35:39 np0005539504 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 29 00:35:39 np0005539504 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 29 00:35:39 np0005539504 kernel: usbcore: registered new interface driver usbhid
Nov 29 00:35:39 np0005539504 kernel: usbhid: USB HID core driver
Nov 29 00:35:39 np0005539504 kernel: drop_monitor: Initializing network drop monitor service
Nov 29 00:35:39 np0005539504 kernel: Initializing XFRM netlink socket
Nov 29 00:35:39 np0005539504 kernel: NET: Registered PF_INET6 protocol family
Nov 29 00:35:39 np0005539504 kernel: Segment Routing with IPv6
Nov 29 00:35:39 np0005539504 kernel: NET: Registered PF_PACKET protocol family
Nov 29 00:35:39 np0005539504 kernel: mpls_gso: MPLS GSO support
Nov 29 00:35:39 np0005539504 kernel: IPI shorthand broadcast: enabled
Nov 29 00:35:39 np0005539504 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 29 00:35:39 np0005539504 kernel: AES CTR mode by8 optimization enabled
Nov 29 00:35:39 np0005539504 kernel: sched_clock: Marking stable (1474006358, 148295476)->(1746469347, -124167513)
Nov 29 00:35:39 np0005539504 kernel: registered taskstats version 1
Nov 29 00:35:39 np0005539504 kernel: Loading compiled-in X.509 certificates
Nov 29 00:35:39 np0005539504 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 00:35:39 np0005539504 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 29 00:35:39 np0005539504 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 29 00:35:39 np0005539504 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 29 00:35:39 np0005539504 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 29 00:35:39 np0005539504 kernel: Demotion targets for Node 0: null
Nov 29 00:35:39 np0005539504 kernel: page_owner is disabled
Nov 29 00:35:39 np0005539504 kernel: Key type .fscrypt registered
Nov 29 00:35:39 np0005539504 kernel: Key type fscrypt-provisioning registered
Nov 29 00:35:39 np0005539504 kernel: Key type big_key registered
Nov 29 00:35:39 np0005539504 kernel: Key type encrypted registered
Nov 29 00:35:39 np0005539504 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 29 00:35:39 np0005539504 kernel: Loading compiled-in module X.509 certificates
Nov 29 00:35:39 np0005539504 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 00:35:39 np0005539504 kernel: ima: Allocated hash algorithm: sha256
Nov 29 00:35:39 np0005539504 kernel: ima: No architecture policies found
Nov 29 00:35:39 np0005539504 kernel: evm: Initialising EVM extended attributes:
Nov 29 00:35:39 np0005539504 kernel: evm: security.selinux
Nov 29 00:35:39 np0005539504 kernel: evm: security.SMACK64 (disabled)
Nov 29 00:35:39 np0005539504 kernel: evm: security.SMACK64EXEC (disabled)
Nov 29 00:35:39 np0005539504 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 29 00:35:39 np0005539504 kernel: evm: security.SMACK64MMAP (disabled)
Nov 29 00:35:39 np0005539504 kernel: evm: security.apparmor (disabled)
Nov 29 00:35:39 np0005539504 kernel: evm: security.ima
Nov 29 00:35:39 np0005539504 kernel: evm: security.capability
Nov 29 00:35:39 np0005539504 kernel: evm: HMAC attrs: 0x1
Nov 29 00:35:39 np0005539504 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 29 00:35:39 np0005539504 kernel: Running certificate verification RSA selftest
Nov 29 00:35:39 np0005539504 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 29 00:35:39 np0005539504 kernel: Running certificate verification ECDSA selftest
Nov 29 00:35:39 np0005539504 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 29 00:35:39 np0005539504 kernel: clk: Disabling unused clocks
Nov 29 00:35:39 np0005539504 kernel: Freeing unused decrypted memory: 2028K
Nov 29 00:35:39 np0005539504 kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 29 00:35:39 np0005539504 kernel: Write protecting the kernel read-only data: 30720k
Nov 29 00:35:39 np0005539504 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 29 00:35:39 np0005539504 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 29 00:35:39 np0005539504 kernel: Run /init as init process
Nov 29 00:35:39 np0005539504 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 29 00:35:39 np0005539504 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 29 00:35:39 np0005539504 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 29 00:35:39 np0005539504 kernel: usb 1-1: Manufacturer: QEMU
Nov 29 00:35:39 np0005539504 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 29 00:35:39 np0005539504 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 00:35:39 np0005539504 systemd: Detected virtualization kvm.
Nov 29 00:35:39 np0005539504 systemd: Detected architecture x86-64.
Nov 29 00:35:39 np0005539504 systemd: Running in initrd.
Nov 29 00:35:39 np0005539504 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 29 00:35:39 np0005539504 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 29 00:35:39 np0005539504 systemd: No hostname configured, using default hostname.
Nov 29 00:35:39 np0005539504 systemd: Hostname set to <localhost>.
Nov 29 00:35:39 np0005539504 systemd: Initializing machine ID from VM UUID.
Nov 29 00:35:39 np0005539504 systemd: Queued start job for default target Initrd Default Target.
Nov 29 00:35:39 np0005539504 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:39 np0005539504 systemd: Reached target Local Encrypted Volumes.
Nov 29 00:35:39 np0005539504 systemd: Reached target Initrd /usr File System.
Nov 29 00:35:39 np0005539504 systemd: Reached target Local File Systems.
Nov 29 00:35:39 np0005539504 systemd: Reached target Path Units.
Nov 29 00:35:39 np0005539504 systemd: Reached target Slice Units.
Nov 29 00:35:39 np0005539504 systemd: Reached target Swaps.
Nov 29 00:35:39 np0005539504 systemd: Reached target Timer Units.
Nov 29 00:35:39 np0005539504 systemd: Listening on D-Bus System Message Bus Socket.
Nov 29 00:35:39 np0005539504 systemd: Listening on Journal Socket (/dev/log).
Nov 29 00:35:39 np0005539504 systemd: Listening on Journal Socket.
Nov 29 00:35:39 np0005539504 systemd: Listening on udev Control Socket.
Nov 29 00:35:39 np0005539504 systemd: Listening on udev Kernel Socket.
Nov 29 00:35:39 np0005539504 systemd: Reached target Socket Units.
Nov 29 00:35:39 np0005539504 systemd: Starting Create List of Static Device Nodes...
Nov 29 00:35:39 np0005539504 systemd: Starting Journal Service...
Nov 29 00:35:39 np0005539504 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 00:35:39 np0005539504 systemd: Starting Apply Kernel Variables...
Nov 29 00:35:39 np0005539504 systemd: Starting Create System Users...
Nov 29 00:35:39 np0005539504 systemd: Starting Setup Virtual Console...
Nov 29 00:35:39 np0005539504 systemd: Finished Create List of Static Device Nodes.
Nov 29 00:35:39 np0005539504 systemd: Finished Apply Kernel Variables.
Nov 29 00:35:39 np0005539504 systemd: Finished Create System Users.
Nov 29 00:35:39 np0005539504 systemd: Starting Create Static Device Nodes in /dev...
Nov 29 00:35:39 np0005539504 systemd-journald[307]: Journal started
Nov 29 00:35:39 np0005539504 systemd-journald[307]: Runtime Journal (/run/log/journal/73921493fa2946fa8f9d6eab83a1506e) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:39 np0005539504 systemd-sysusers[311]: Creating group 'users' with GID 100.
Nov 29 00:35:39 np0005539504 systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Nov 29 00:35:39 np0005539504 systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 29 00:35:39 np0005539504 systemd: Started Journal Service.
Nov 29 00:35:39 np0005539504 systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 00:35:39 np0005539504 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 00:35:39 np0005539504 systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 00:35:39 np0005539504 systemd[1]: Finished Setup Virtual Console.
Nov 29 00:35:39 np0005539504 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 29 00:35:39 np0005539504 systemd[1]: Starting dracut cmdline hook...
Nov 29 00:35:39 np0005539504 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Nov 29 00:35:39 np0005539504 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:39 np0005539504 systemd[1]: Finished dracut cmdline hook.
Nov 29 00:35:39 np0005539504 systemd[1]: Starting dracut pre-udev hook...
Nov 29 00:35:39 np0005539504 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 29 00:35:39 np0005539504 kernel: device-mapper: uevent: version 1.0.3
Nov 29 00:35:39 np0005539504 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 29 00:35:39 np0005539504 kernel: RPC: Registered named UNIX socket transport module.
Nov 29 00:35:39 np0005539504 kernel: RPC: Registered udp transport module.
Nov 29 00:35:39 np0005539504 kernel: RPC: Registered tcp transport module.
Nov 29 00:35:39 np0005539504 kernel: RPC: Registered tcp-with-tls transport module.
Nov 29 00:35:39 np0005539504 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 29 00:35:39 np0005539504 rpc.statd[443]: Version 2.5.4 starting
Nov 29 00:35:39 np0005539504 rpc.statd[443]: Initializing NSM state
Nov 29 00:35:39 np0005539504 rpc.idmapd[448]: Setting log level to 0
Nov 29 00:35:39 np0005539504 systemd[1]: Finished dracut pre-udev hook.
Nov 29 00:35:39 np0005539504 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 00:35:40 np0005539504 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 00:35:40 np0005539504 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 00:35:40 np0005539504 systemd[1]: Starting dracut pre-trigger hook...
Nov 29 00:35:40 np0005539504 systemd[1]: Finished dracut pre-trigger hook.
Nov 29 00:35:40 np0005539504 systemd[1]: Starting Coldplug All udev Devices...
Nov 29 00:35:40 np0005539504 systemd[1]: Created slice Slice /system/modprobe.
Nov 29 00:35:40 np0005539504 systemd[1]: Starting Load Kernel Module configfs...
Nov 29 00:35:40 np0005539504 systemd[1]: Finished Coldplug All udev Devices.
Nov 29 00:35:40 np0005539504 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:40 np0005539504 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:40 np0005539504 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 00:35:40 np0005539504 systemd[1]: Reached target Network.
Nov 29 00:35:40 np0005539504 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 00:35:40 np0005539504 systemd[1]: Starting dracut initqueue hook...
Nov 29 00:35:40 np0005539504 systemd[1]: Mounting Kernel Configuration File System...
Nov 29 00:35:40 np0005539504 systemd[1]: Mounted Kernel Configuration File System.
Nov 29 00:35:40 np0005539504 systemd[1]: Reached target System Initialization.
Nov 29 00:35:40 np0005539504 systemd[1]: Reached target Basic System.
Nov 29 00:35:40 np0005539504 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 29 00:35:40 np0005539504 kernel: scsi host0: ata_piix
Nov 29 00:35:40 np0005539504 kernel: scsi host1: ata_piix
Nov 29 00:35:40 np0005539504 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 29 00:35:40 np0005539504 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 29 00:35:40 np0005539504 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 29 00:35:40 np0005539504 kernel: vda: vda1
Nov 29 00:35:40 np0005539504 kernel: ata1: found unknown device (class 0)
Nov 29 00:35:40 np0005539504 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 29 00:35:40 np0005539504 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 29 00:35:40 np0005539504 systemd-udevd[465]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:35:40 np0005539504 systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 00:35:40 np0005539504 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 29 00:35:40 np0005539504 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 29 00:35:40 np0005539504 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 29 00:35:40 np0005539504 systemd[1]: Reached target Initrd Root Device.
Nov 29 00:35:40 np0005539504 systemd[1]: Finished dracut initqueue hook.
Nov 29 00:35:40 np0005539504 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 00:35:40 np0005539504 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 29 00:35:40 np0005539504 systemd[1]: Reached target Remote File Systems.
Nov 29 00:35:40 np0005539504 systemd[1]: Starting dracut pre-mount hook...
Nov 29 00:35:40 np0005539504 systemd[1]: Finished dracut pre-mount hook.
Nov 29 00:35:40 np0005539504 systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 29 00:35:40 np0005539504 systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Nov 29 00:35:40 np0005539504 systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 00:35:40 np0005539504 systemd[1]: Mounting /sysroot...
Nov 29 00:35:41 np0005539504 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 29 00:35:41 np0005539504 kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 29 00:35:41 np0005539504 kernel: XFS (vda1): Ending clean mount
Nov 29 00:35:41 np0005539504 systemd[1]: Mounted /sysroot.
Nov 29 00:35:41 np0005539504 systemd[1]: Reached target Initrd Root File System.
Nov 29 00:35:41 np0005539504 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 29 00:35:41 np0005539504 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 29 00:35:41 np0005539504 systemd[1]: Reached target Initrd File Systems.
Nov 29 00:35:41 np0005539504 systemd[1]: Reached target Initrd Default Target.
Nov 29 00:35:41 np0005539504 systemd[1]: Starting dracut mount hook...
Nov 29 00:35:41 np0005539504 systemd[1]: Finished dracut mount hook.
Nov 29 00:35:41 np0005539504 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 29 00:35:41 np0005539504 rpc.idmapd[448]: exiting on signal 15
Nov 29 00:35:41 np0005539504 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 29 00:35:41 np0005539504 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Network.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Timer Units.
Nov 29 00:35:41 np0005539504 systemd[1]: dbus.socket: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 29 00:35:41 np0005539504 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Initrd Default Target.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Basic System.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Initrd Root Device.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Initrd /usr File System.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Path Units.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Remote File Systems.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Slice Units.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Socket Units.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target System Initialization.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Local File Systems.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Swaps.
Nov 29 00:35:41 np0005539504 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped dracut mount hook.
Nov 29 00:35:41 np0005539504 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped dracut pre-mount hook.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:41 np0005539504 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped dracut initqueue hook.
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped Coldplug All udev Devices.
Nov 29 00:35:41 np0005539504 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped dracut pre-trigger hook.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped Setup Virtual Console.
Nov 29 00:35:41 np0005539504 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Closed udev Control Socket.
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Closed udev Kernel Socket.
Nov 29 00:35:41 np0005539504 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped dracut pre-udev hook.
Nov 29 00:35:41 np0005539504 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped dracut cmdline hook.
Nov 29 00:35:41 np0005539504 systemd[1]: Starting Cleanup udev Database...
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 29 00:35:41 np0005539504 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 29 00:35:41 np0005539504 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Stopped Create System Users.
Nov 29 00:35:41 np0005539504 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 29 00:35:41 np0005539504 systemd[1]: Finished Cleanup udev Database.
Nov 29 00:35:41 np0005539504 systemd[1]: Reached target Switch Root.
Nov 29 00:35:41 np0005539504 systemd[1]: Starting Switch Root...
Nov 29 00:35:41 np0005539504 systemd[1]: Switching root.
Nov 29 00:35:41 np0005539504 systemd-journald[307]: Journal stopped
Nov 29 00:35:42 np0005539504 systemd-journald: Received SIGTERM from PID 1 (systemd).
Nov 29 00:35:42 np0005539504 kernel: audit: type=1404 audit(1764394541.817:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 29 00:35:42 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:35:42 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:35:42 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:35:42 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:35:42 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:35:42 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:35:42 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:35:42 np0005539504 kernel: audit: type=1403 audit(1764394541.948:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 29 00:35:42 np0005539504 systemd: Successfully loaded SELinux policy in 133.977ms.
Nov 29 00:35:42 np0005539504 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 31.574ms.
Nov 29 00:35:42 np0005539504 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 00:35:42 np0005539504 systemd: Detected virtualization kvm.
Nov 29 00:35:42 np0005539504 systemd: Detected architecture x86-64.
Nov 29 00:35:42 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:35:42 np0005539504 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 29 00:35:42 np0005539504 systemd: Stopped Switch Root.
Nov 29 00:35:42 np0005539504 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 29 00:35:42 np0005539504 systemd: Created slice Slice /system/getty.
Nov 29 00:35:42 np0005539504 systemd: Created slice Slice /system/serial-getty.
Nov 29 00:35:42 np0005539504 systemd: Created slice Slice /system/sshd-keygen.
Nov 29 00:35:42 np0005539504 systemd: Created slice User and Session Slice.
Nov 29 00:35:42 np0005539504 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:42 np0005539504 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 29 00:35:42 np0005539504 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 29 00:35:42 np0005539504 systemd: Reached target Local Encrypted Volumes.
Nov 29 00:35:42 np0005539504 systemd: Stopped target Switch Root.
Nov 29 00:35:42 np0005539504 systemd: Stopped target Initrd File Systems.
Nov 29 00:35:42 np0005539504 systemd: Stopped target Initrd Root File System.
Nov 29 00:35:42 np0005539504 systemd: Reached target Local Integrity Protected Volumes.
Nov 29 00:35:42 np0005539504 systemd: Reached target Path Units.
Nov 29 00:35:42 np0005539504 systemd: Reached target rpc_pipefs.target.
Nov 29 00:35:42 np0005539504 systemd: Reached target Slice Units.
Nov 29 00:35:42 np0005539504 systemd: Reached target Swaps.
Nov 29 00:35:42 np0005539504 systemd: Reached target Local Verity Protected Volumes.
Nov 29 00:35:42 np0005539504 systemd: Listening on RPCbind Server Activation Socket.
Nov 29 00:35:42 np0005539504 systemd: Reached target RPC Port Mapper.
Nov 29 00:35:42 np0005539504 systemd: Listening on Process Core Dump Socket.
Nov 29 00:35:42 np0005539504 systemd: Listening on initctl Compatibility Named Pipe.
Nov 29 00:35:42 np0005539504 systemd: Listening on udev Control Socket.
Nov 29 00:35:42 np0005539504 systemd: Listening on udev Kernel Socket.
Nov 29 00:35:42 np0005539504 systemd: Mounting Huge Pages File System...
Nov 29 00:35:42 np0005539504 systemd: Mounting POSIX Message Queue File System...
Nov 29 00:35:42 np0005539504 systemd: Mounting Kernel Debug File System...
Nov 29 00:35:42 np0005539504 systemd: Mounting Kernel Trace File System...
Nov 29 00:35:42 np0005539504 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 00:35:42 np0005539504 systemd: Starting Create List of Static Device Nodes...
Nov 29 00:35:42 np0005539504 systemd: Starting Load Kernel Module configfs...
Nov 29 00:35:42 np0005539504 systemd: Starting Load Kernel Module drm...
Nov 29 00:35:42 np0005539504 systemd: Starting Load Kernel Module efi_pstore...
Nov 29 00:35:42 np0005539504 systemd: Starting Load Kernel Module fuse...
Nov 29 00:35:42 np0005539504 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 29 00:35:42 np0005539504 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 29 00:35:42 np0005539504 systemd: Stopped File System Check on Root Device.
Nov 29 00:35:42 np0005539504 systemd: Stopped Journal Service.
Nov 29 00:35:42 np0005539504 systemd: Starting Journal Service...
Nov 29 00:35:42 np0005539504 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 00:35:42 np0005539504 systemd: Starting Generate network units from Kernel command line...
Nov 29 00:35:42 np0005539504 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:42 np0005539504 systemd: Starting Remount Root and Kernel File Systems...
Nov 29 00:35:42 np0005539504 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 29 00:35:42 np0005539504 systemd: Starting Apply Kernel Variables...
Nov 29 00:35:42 np0005539504 systemd: Starting Coldplug All udev Devices...
Nov 29 00:35:42 np0005539504 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 29 00:35:42 np0005539504 kernel: fuse: init (API version 7.37)
Nov 29 00:35:42 np0005539504 systemd: Mounted Huge Pages File System.
Nov 29 00:35:42 np0005539504 systemd: Mounted POSIX Message Queue File System.
Nov 29 00:35:42 np0005539504 systemd: Mounted Kernel Debug File System.
Nov 29 00:35:42 np0005539504 systemd: Mounted Kernel Trace File System.
Nov 29 00:35:42 np0005539504 systemd-journald[676]: Journal started
Nov 29 00:35:42 np0005539504 systemd-journald[676]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:42 np0005539504 systemd[1]: Queued start job for default target Multi-User System.
Nov 29 00:35:42 np0005539504 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 29 00:35:42 np0005539504 systemd: Started Journal Service.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 00:35:42 np0005539504 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:42 np0005539504 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 29 00:35:42 np0005539504 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Load Kernel Module fuse.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Generate network units from Kernel command line.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Apply Kernel Variables.
Nov 29 00:35:42 np0005539504 kernel: ACPI: bus type drm_connector registered
Nov 29 00:35:42 np0005539504 systemd[1]: Mounting FUSE Control File System...
Nov 29 00:35:42 np0005539504 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 00:35:42 np0005539504 systemd[1]: Starting Rebuild Hardware Database...
Nov 29 00:35:42 np0005539504 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 29 00:35:42 np0005539504 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 29 00:35:42 np0005539504 systemd[1]: Starting Load/Save OS Random Seed...
Nov 29 00:35:42 np0005539504 systemd[1]: Starting Create System Users...
Nov 29 00:35:42 np0005539504 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Load Kernel Module drm.
Nov 29 00:35:42 np0005539504 systemd[1]: Mounted FUSE Control File System.
Nov 29 00:35:42 np0005539504 systemd-journald[676]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:42 np0005539504 systemd-journald[676]: Received client request to flush runtime journal.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Coldplug All udev Devices.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Create System Users.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Load/Save OS Random Seed.
Nov 29 00:35:42 np0005539504 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 00:35:42 np0005539504 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 00:35:42 np0005539504 systemd[1]: Reached target Preparation for Local File Systems.
Nov 29 00:35:42 np0005539504 systemd[1]: Reached target Local File Systems.
Nov 29 00:35:42 np0005539504 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 29 00:35:42 np0005539504 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 29 00:35:42 np0005539504 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 29 00:35:42 np0005539504 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 29 00:35:42 np0005539504 systemd[1]: Starting Automatic Boot Loader Update...
Nov 29 00:35:42 np0005539504 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 29 00:35:42 np0005539504 systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 00:35:42 np0005539504 bootctl[694]: Couldn't find EFI system partition, skipping.
Nov 29 00:35:42 np0005539504 systemd[1]: Finished Automatic Boot Loader Update.
Nov 29 00:35:43 np0005539504 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 29 00:35:43 np0005539504 systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 00:35:43 np0005539504 systemd[1]: Starting Security Auditing Service...
Nov 29 00:35:43 np0005539504 systemd[1]: Starting RPC Bind...
Nov 29 00:35:43 np0005539504 systemd[1]: Starting Rebuild Journal Catalog...
Nov 29 00:35:43 np0005539504 auditd[700]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 29 00:35:43 np0005539504 auditd[700]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 29 00:35:43 np0005539504 systemd[1]: Finished Rebuild Journal Catalog.
Nov 29 00:35:43 np0005539504 systemd[1]: Started RPC Bind.
Nov 29 00:35:43 np0005539504 augenrules[705]: /sbin/augenrules: No change
Nov 29 00:35:43 np0005539504 augenrules[720]: No rules
Nov 29 00:35:43 np0005539504 augenrules[720]: enabled 1
Nov 29 00:35:43 np0005539504 augenrules[720]: failure 1
Nov 29 00:35:43 np0005539504 augenrules[720]: pid 700
Nov 29 00:35:43 np0005539504 augenrules[720]: rate_limit 0
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog_limit 8192
Nov 29 00:35:43 np0005539504 augenrules[720]: lost 0
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog 0
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog_wait_time 60000
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog_wait_time_actual 0
Nov 29 00:35:43 np0005539504 augenrules[720]: enabled 1
Nov 29 00:35:43 np0005539504 augenrules[720]: failure 1
Nov 29 00:35:43 np0005539504 augenrules[720]: pid 700
Nov 29 00:35:43 np0005539504 augenrules[720]: rate_limit 0
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog_limit 8192
Nov 29 00:35:43 np0005539504 augenrules[720]: lost 0
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog 1
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog_wait_time 60000
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog_wait_time_actual 0
Nov 29 00:35:43 np0005539504 augenrules[720]: enabled 1
Nov 29 00:35:43 np0005539504 augenrules[720]: failure 1
Nov 29 00:35:43 np0005539504 augenrules[720]: pid 700
Nov 29 00:35:43 np0005539504 augenrules[720]: rate_limit 0
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog_limit 8192
Nov 29 00:35:43 np0005539504 augenrules[720]: lost 0
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog 3
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog_wait_time 60000
Nov 29 00:35:43 np0005539504 augenrules[720]: backlog_wait_time_actual 0
Nov 29 00:35:43 np0005539504 systemd[1]: Started Security Auditing Service.
Nov 29 00:35:43 np0005539504 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 29 00:35:43 np0005539504 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 29 00:35:43 np0005539504 systemd[1]: Finished Rebuild Hardware Database.
Nov 29 00:35:43 np0005539504 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 00:35:43 np0005539504 systemd[1]: Starting Update is Completed...
Nov 29 00:35:43 np0005539504 systemd[1]: Finished Update is Completed.
Nov 29 00:35:43 np0005539504 systemd-udevd[728]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 00:35:43 np0005539504 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 00:35:43 np0005539504 systemd[1]: Reached target System Initialization.
Nov 29 00:35:43 np0005539504 systemd[1]: Started dnf makecache --timer.
Nov 29 00:35:43 np0005539504 systemd[1]: Started Daily rotation of log files.
Nov 29 00:35:43 np0005539504 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 29 00:35:43 np0005539504 systemd[1]: Reached target Timer Units.
Nov 29 00:35:43 np0005539504 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 00:35:43 np0005539504 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 29 00:35:43 np0005539504 systemd[1]: Reached target Socket Units.
Nov 29 00:35:43 np0005539504 systemd[1]: Starting D-Bus System Message Bus...
Nov 29 00:35:43 np0005539504 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:43 np0005539504 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 29 00:35:43 np0005539504 systemd[1]: Starting Load Kernel Module configfs...
Nov 29 00:35:43 np0005539504 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:43 np0005539504 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:43 np0005539504 systemd-udevd[749]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:35:43 np0005539504 systemd[1]: Started D-Bus System Message Bus.
Nov 29 00:35:43 np0005539504 systemd[1]: Reached target Basic System.
Nov 29 00:35:43 np0005539504 dbus-broker-lau[760]: Ready
Nov 29 00:35:43 np0005539504 systemd[1]: Starting NTP client/server...
Nov 29 00:35:43 np0005539504 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 29 00:35:43 np0005539504 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 29 00:35:43 np0005539504 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 29 00:35:43 np0005539504 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 29 00:35:43 np0005539504 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 29 00:35:43 np0005539504 systemd[1]: Starting IPv4 firewall with iptables...
Nov 29 00:35:43 np0005539504 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 29 00:35:43 np0005539504 systemd[1]: Started irqbalance daemon.
Nov 29 00:35:43 np0005539504 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 29 00:35:43 np0005539504 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:43 np0005539504 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:43 np0005539504 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:43 np0005539504 systemd[1]: Reached target sshd-keygen.target.
Nov 29 00:35:43 np0005539504 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 29 00:35:43 np0005539504 systemd[1]: Reached target User and Group Name Lookups.
Nov 29 00:35:43 np0005539504 systemd[1]: Starting User Login Management...
Nov 29 00:35:43 np0005539504 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 29 00:35:43 np0005539504 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 29 00:35:43 np0005539504 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 29 00:35:43 np0005539504 kernel: Console: switching to colour dummy device 80x25
Nov 29 00:35:43 np0005539504 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 29 00:35:43 np0005539504 kernel: [drm] features: -context_init
Nov 29 00:35:43 np0005539504 kernel: [drm] number of scanouts: 1
Nov 29 00:35:43 np0005539504 kernel: [drm] number of cap sets: 0
Nov 29 00:35:43 np0005539504 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 29 00:35:43 np0005539504 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 29 00:35:43 np0005539504 kernel: Console: switching to colour frame buffer device 128x48
Nov 29 00:35:43 np0005539504 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 29 00:35:43 np0005539504 chronyd[799]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 00:35:43 np0005539504 chronyd[799]: Loaded 0 symmetric keys
Nov 29 00:35:43 np0005539504 chronyd[799]: Using right/UTC timezone to obtain leap second data
Nov 29 00:35:43 np0005539504 chronyd[799]: Loaded seccomp filter (level 2)
Nov 29 00:35:43 np0005539504 systemd[1]: Started NTP client/server.
Nov 29 00:35:43 np0005539504 systemd-logind[783]: New seat seat0.
Nov 29 00:35:43 np0005539504 systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 00:35:43 np0005539504 systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 00:35:43 np0005539504 systemd[1]: Started User Login Management.
Nov 29 00:35:43 np0005539504 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 29 00:35:43 np0005539504 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 29 00:35:43 np0005539504 kernel: kvm_amd: TSC scaling supported
Nov 29 00:35:43 np0005539504 kernel: kvm_amd: Nested Virtualization enabled
Nov 29 00:35:43 np0005539504 kernel: kvm_amd: Nested Paging enabled
Nov 29 00:35:43 np0005539504 kernel: kvm_amd: LBR virtualization supported
Nov 29 00:35:44 np0005539504 iptables.init[777]: iptables: Applying firewall rules: [  OK  ]
Nov 29 00:35:44 np0005539504 systemd[1]: Finished IPv4 firewall with iptables.
Nov 29 00:35:44 np0005539504 cloud-init[837]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 29 Nov 2025 05:35:44 +0000. Up 7.09 seconds.
Nov 29 00:35:44 np0005539504 systemd[1]: run-cloud\x2dinit-tmp-tmpnzn2ene9.mount: Deactivated successfully.
Nov 29 00:35:44 np0005539504 systemd[1]: Starting Hostname Service...
Nov 29 00:35:44 np0005539504 systemd[1]: Started Hostname Service.
Nov 29 00:35:44 np0005539504 systemd-hostnamed[851]: Hostname set to <np0005539504.novalocal> (static)
Nov 29 00:35:44 np0005539504 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 29 00:35:44 np0005539504 systemd[1]: Reached target Preparation for Network.
Nov 29 00:35:44 np0005539504 systemd[1]: Starting Network Manager...
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.7769] NetworkManager (version 1.54.1-1.el9) is starting... (boot:62dd3eed-5b38-4c74-8c8b-95416b2c294d)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.7774] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.7839] manager[0x55fcf44da080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.7885] hostname: hostname: using hostnamed
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.7885] hostname: static hostname changed from (none) to "np0005539504.novalocal"
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.7889] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8045] manager[0x55fcf44da080]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8045] manager[0x55fcf44da080]: rfkill: WWAN hardware radio set enabled
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8082] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8083] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8083] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8084] manager: Networking is enabled by state file
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8086] settings: Loaded settings plugin: keyfile (internal)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8097] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8116] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8129] dhcp: init: Using DHCP client 'internal'
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8131] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8145] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8153] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:35:44 np0005539504 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8160] device (lo): Activation: starting connection 'lo' (90b43410-4648-4f39-847b-37821e0dfc83)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8170] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8174] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8212] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8216] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8218] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8220] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8222] device (eth0): carrier: link connected
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8226] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8233] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8238] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8242] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8243] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8245] manager: NetworkManager state is now CONNECTING
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8247] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8254] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8257] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8292] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8300] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8318] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 00:35:44 np0005539504 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:35:44 np0005539504 systemd[1]: Started Network Manager.
Nov 29 00:35:44 np0005539504 systemd[1]: Reached target Network.
Nov 29 00:35:44 np0005539504 systemd[1]: Starting Network Manager Wait Online...
Nov 29 00:35:44 np0005539504 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 29 00:35:44 np0005539504 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8586] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8589] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8590] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8596] device (lo): Activation: successful, device activated.
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8601] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8605] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8607] device (eth0): Activation: successful, device activated.
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8618] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 00:35:44 np0005539504 NetworkManager[855]: <info>  [1764394544.8620] manager: startup complete
Nov 29 00:35:44 np0005539504 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 29 00:35:44 np0005539504 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 00:35:44 np0005539504 systemd[1]: Reached target NFS client services.
Nov 29 00:35:44 np0005539504 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 00:35:44 np0005539504 systemd[1]: Reached target Remote File Systems.
Nov 29 00:35:44 np0005539504 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:44 np0005539504 systemd[1]: Finished Network Manager Wait Online.
Nov 29 00:35:44 np0005539504 systemd[1]: Starting Cloud-init: Network Stage...
Nov 29 00:35:45 np0005539504 cloud-init[919]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 29 Nov 2025 05:35:45 +0000. Up 8.10 seconds.
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: |  eth0  | True |        38.102.83.241         | 255.255.255.0 | global | fa:16:3e:a7:26:b4 |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: |  eth0  | True | fe80::f816:3eff:fea7:26b4/64 |       .       |  link  | fa:16:3e:a7:26:b4 |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 29 00:35:45 np0005539504 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:49 np0005539504 chronyd[799]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Nov 29 00:35:49 np0005539504 chronyd[799]: System clock TAI offset set to 37 seconds
Nov 29 00:35:54 np0005539504 irqbalance[778]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 29 00:35:54 np0005539504 irqbalance[778]: IRQ 25 affinity is now unmanaged
Nov 29 00:35:54 np0005539504 irqbalance[778]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 29 00:35:54 np0005539504 irqbalance[778]: IRQ 31 affinity is now unmanaged
Nov 29 00:35:54 np0005539504 irqbalance[778]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 29 00:35:54 np0005539504 irqbalance[778]: IRQ 28 affinity is now unmanaged
Nov 29 00:35:54 np0005539504 irqbalance[778]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 29 00:35:54 np0005539504 irqbalance[778]: IRQ 32 affinity is now unmanaged
Nov 29 00:35:54 np0005539504 irqbalance[778]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 29 00:35:54 np0005539504 irqbalance[778]: IRQ 30 affinity is now unmanaged
Nov 29 00:35:54 np0005539504 irqbalance[778]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 29 00:35:54 np0005539504 irqbalance[778]: IRQ 29 affinity is now unmanaged
Nov 29 00:35:55 np0005539504 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:36:14 np0005539504 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:37:25 np0005539504 cloud-init[919]: Generating public/private rsa key pair.
Nov 29 00:37:25 np0005539504 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 29 00:37:25 np0005539504 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 29 00:37:25 np0005539504 cloud-init[919]: The key fingerprint is:
Nov 29 00:37:25 np0005539504 cloud-init[919]: SHA256:42tfMeENx3sYnFIkm58ejl3g9+JYMFV78+FZNvM8abI root@np0005539504.novalocal
Nov 29 00:37:25 np0005539504 cloud-init[919]: The key's randomart image is:
Nov 29 00:37:25 np0005539504 cloud-init[919]: +---[RSA 3072]----+
Nov 29 00:37:25 np0005539504 cloud-init[919]: |            ..o .|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |             B o.|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |            * O*+|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |           . X.O%|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |        S   *.O*B|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |       . .   X+=o|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |        .   oE* .|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |        .. . + . |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |       .... . .  |
Nov 29 00:37:25 np0005539504 cloud-init[919]: +----[SHA256]-----+
Nov 29 00:37:25 np0005539504 cloud-init[919]: Generating public/private ecdsa key pair.
Nov 29 00:37:25 np0005539504 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 29 00:37:25 np0005539504 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 29 00:37:25 np0005539504 cloud-init[919]: The key fingerprint is:
Nov 29 00:37:25 np0005539504 cloud-init[919]: SHA256:g6t5fBnm21qaRjl53yVntWar0HmmXzHiYh2E6wKztAU root@np0005539504.novalocal
Nov 29 00:37:25 np0005539504 cloud-init[919]: The key's randomart image is:
Nov 29 00:37:25 np0005539504 cloud-init[919]: +---[ECDSA 256]---+
Nov 29 00:37:25 np0005539504 cloud-init[919]: |                 |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |            .    |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |       E   . .   |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |       ..   o   .|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |      .+S+ . o oo|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |      ..&.o + =+*|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |     ..* *.* *oBo|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |     oo ==o + =..|
Nov 29 00:37:25 np0005539504 cloud-init[919]: |    o. o=o.  oo. |
Nov 29 00:37:25 np0005539504 cloud-init[919]: +----[SHA256]-----+
Nov 29 00:37:25 np0005539504 cloud-init[919]: Generating public/private ed25519 key pair.
Nov 29 00:37:25 np0005539504 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 29 00:37:25 np0005539504 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 29 00:37:25 np0005539504 cloud-init[919]: The key fingerprint is:
Nov 29 00:37:25 np0005539504 cloud-init[919]: SHA256:Bwi+yMN6typfLHoMstldXkJDkldBiuHEO8U88DJOaU0 root@np0005539504.novalocal
Nov 29 00:37:25 np0005539504 cloud-init[919]: The key's randomart image is:
Nov 29 00:37:25 np0005539504 cloud-init[919]: +--[ED25519 256]--+
Nov 29 00:37:25 np0005539504 cloud-init[919]: |   .==Eo+.       |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |   ++OBo         |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |    X*=..        |
Nov 29 00:37:25 np0005539504 cloud-init[919]: | o =o+o  .       |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |  = oo .S .      |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |o. o  o ..       |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |o*o.+o o         |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |+o++...          |
Nov 29 00:37:25 np0005539504 cloud-init[919]: |.+o..            |
Nov 29 00:37:25 np0005539504 cloud-init[919]: +----[SHA256]-----+
Nov 29 00:37:25 np0005539504 systemd[1]: Finished Cloud-init: Network Stage.
Nov 29 00:37:25 np0005539504 systemd[1]: Reached target Cloud-config availability.
Nov 29 00:37:25 np0005539504 systemd[1]: Reached target Network is Online.
Nov 29 00:37:25 np0005539504 systemd[1]: Starting Cloud-init: Config Stage...
Nov 29 00:37:25 np0005539504 systemd[1]: Starting Crash recovery kernel arming...
Nov 29 00:37:25 np0005539504 systemd[1]: Starting Notify NFS peers of a restart...
Nov 29 00:37:25 np0005539504 systemd[1]: Starting System Logging Service...
Nov 29 00:37:25 np0005539504 systemd[1]: Starting OpenSSH server daemon...
Nov 29 00:37:25 np0005539504 sm-notify[1006]: Version 2.5.4 starting
Nov 29 00:37:25 np0005539504 systemd[1]: Starting Permit User Sessions...
Nov 29 00:37:25 np0005539504 systemd[1]: Started Notify NFS peers of a restart.
Nov 29 00:37:25 np0005539504 systemd[1]: Started OpenSSH server daemon.
Nov 29 00:37:25 np0005539504 systemd[1]: Finished Permit User Sessions.
Nov 29 00:37:25 np0005539504 rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Nov 29 00:37:25 np0005539504 rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 29 00:37:25 np0005539504 systemd[1]: Started Command Scheduler.
Nov 29 00:37:25 np0005539504 systemd[1]: Started Getty on tty1.
Nov 29 00:37:25 np0005539504 systemd[1]: Started Serial Getty on ttyS0.
Nov 29 00:37:25 np0005539504 systemd[1]: Reached target Login Prompts.
Nov 29 00:37:25 np0005539504 systemd[1]: Started System Logging Service.
Nov 29 00:37:25 np0005539504 systemd[1]: Reached target Multi-User System.
Nov 29 00:37:25 np0005539504 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 29 00:37:25 np0005539504 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 29 00:37:25 np0005539504 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 29 00:37:25 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 00:37:25 np0005539504 kdumpctl[1014]: kdump: No kdump initial ramdisk found.
Nov 29 00:37:25 np0005539504 kdumpctl[1014]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 29 00:37:26 np0005539504 cloud-init[1158]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 29 Nov 2025 05:37:26 +0000. Up 108.96 seconds.
Nov 29 00:37:26 np0005539504 systemd[1]: Finished Cloud-init: Config Stage.
Nov 29 00:37:26 np0005539504 systemd[1]: Starting Cloud-init: Final Stage...
Nov 29 00:37:26 np0005539504 dracut[1286]: dracut-057-102.git20250818.el9
Nov 29 00:37:26 np0005539504 cloud-init[1304]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 29 Nov 2025 05:37:26 +0000. Up 109.36 seconds.
Nov 29 00:37:26 np0005539504 cloud-init[1310]: #############################################################
Nov 29 00:37:26 np0005539504 cloud-init[1313]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 29 00:37:26 np0005539504 cloud-init[1320]: 256 SHA256:g6t5fBnm21qaRjl53yVntWar0HmmXzHiYh2E6wKztAU root@np0005539504.novalocal (ECDSA)
Nov 29 00:37:26 np0005539504 dracut[1288]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 29 00:37:26 np0005539504 cloud-init[1327]: 256 SHA256:Bwi+yMN6typfLHoMstldXkJDkldBiuHEO8U88DJOaU0 root@np0005539504.novalocal (ED25519)
Nov 29 00:37:26 np0005539504 cloud-init[1333]: 3072 SHA256:42tfMeENx3sYnFIkm58ejl3g9+JYMFV78+FZNvM8abI root@np0005539504.novalocal (RSA)
Nov 29 00:37:26 np0005539504 cloud-init[1334]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 29 00:37:26 np0005539504 cloud-init[1336]: #############################################################
Nov 29 00:37:26 np0005539504 cloud-init[1304]: Cloud-init v. 24.4-7.el9 finished at Sat, 29 Nov 2025 05:37:26 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 109.53 seconds
Nov 29 00:37:26 np0005539504 systemd[1]: Finished Cloud-init: Final Stage.
Nov 29 00:37:26 np0005539504 systemd[1]: Reached target Cloud-init target.
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: memstrack is not available
Nov 29 00:37:27 np0005539504 dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 00:37:27 np0005539504 dracut[1288]: memstrack is not available
Nov 29 00:37:27 np0005539504 dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 00:37:28 np0005539504 dracut[1288]: *** Including module: systemd ***
Nov 29 00:37:28 np0005539504 dracut[1288]: *** Including module: fips ***
Nov 29 00:37:28 np0005539504 dracut[1288]: *** Including module: systemd-initrd ***
Nov 29 00:37:28 np0005539504 dracut[1288]: *** Including module: i18n ***
Nov 29 00:37:28 np0005539504 dracut[1288]: *** Including module: drm ***
Nov 29 00:37:29 np0005539504 dracut[1288]: *** Including module: prefixdevname ***
Nov 29 00:37:29 np0005539504 dracut[1288]: *** Including module: kernel-modules ***
Nov 29 00:37:29 np0005539504 kernel: block vda: the capability attribute has been deprecated.
Nov 29 00:37:29 np0005539504 dracut[1288]: *** Including module: kernel-modules-extra ***
Nov 29 00:37:30 np0005539504 dracut[1288]: *** Including module: qemu ***
Nov 29 00:37:30 np0005539504 dracut[1288]: *** Including module: fstab-sys ***
Nov 29 00:37:30 np0005539504 dracut[1288]: *** Including module: rootfs-block ***
Nov 29 00:37:30 np0005539504 dracut[1288]: *** Including module: terminfo ***
Nov 29 00:37:30 np0005539504 dracut[1288]: *** Including module: udev-rules ***
Nov 29 00:37:30 np0005539504 dracut[1288]: Skipping udev rule: 91-permissions.rules
Nov 29 00:37:30 np0005539504 dracut[1288]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 29 00:37:30 np0005539504 dracut[1288]: *** Including module: virtiofs ***
Nov 29 00:37:30 np0005539504 dracut[1288]: *** Including module: dracut-systemd ***
Nov 29 00:37:31 np0005539504 dracut[1288]: *** Including module: usrmount ***
Nov 29 00:37:31 np0005539504 dracut[1288]: *** Including module: base ***
Nov 29 00:37:31 np0005539504 dracut[1288]: *** Including module: fs-lib ***
Nov 29 00:37:31 np0005539504 dracut[1288]: *** Including module: kdumpbase ***
Nov 29 00:37:31 np0005539504 dracut[1288]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 29 00:37:31 np0005539504 dracut[1288]:  microcode_ctl module: mangling fw_dir
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel" is ignored
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 29 00:37:31 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 29 00:37:32 np0005539504 dracut[1288]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 29 00:37:32 np0005539504 dracut[1288]: *** Including module: openssl ***
Nov 29 00:37:32 np0005539504 dracut[1288]: *** Including module: shutdown ***
Nov 29 00:37:32 np0005539504 dracut[1288]: *** Including module: squash ***
Nov 29 00:37:32 np0005539504 dracut[1288]: *** Including modules done ***
Nov 29 00:37:32 np0005539504 dracut[1288]: *** Installing kernel module dependencies ***
Nov 29 00:37:33 np0005539504 dracut[1288]: *** Installing kernel module dependencies done ***
Nov 29 00:37:33 np0005539504 dracut[1288]: *** Resolving executable dependencies ***
Nov 29 00:37:34 np0005539504 dracut[1288]: *** Resolving executable dependencies done ***
Nov 29 00:37:34 np0005539504 dracut[1288]: *** Generating early-microcode cpio image ***
Nov 29 00:37:34 np0005539504 dracut[1288]: *** Store current command line parameters ***
Nov 29 00:37:34 np0005539504 dracut[1288]: Stored kernel commandline:
Nov 29 00:37:34 np0005539504 dracut[1288]: No dracut internal kernel commandline stored in the initramfs
Nov 29 00:37:34 np0005539504 dracut[1288]: *** Install squash loader ***
Nov 29 00:37:35 np0005539504 dracut[1288]: *** Squashing the files inside the initramfs ***
Nov 29 00:37:36 np0005539504 dracut[1288]: *** Squashing the files inside the initramfs done ***
Nov 29 00:37:36 np0005539504 dracut[1288]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 29 00:37:36 np0005539504 dracut[1288]: *** Hardlinking files ***
Nov 29 00:37:36 np0005539504 dracut[1288]: *** Hardlinking files done ***
Nov 29 00:37:37 np0005539504 dracut[1288]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 29 00:37:38 np0005539504 kdumpctl[1014]: kdump: kexec: loaded kdump kernel
Nov 29 00:37:38 np0005539504 kdumpctl[1014]: kdump: Starting kdump: [OK]
Nov 29 00:37:38 np0005539504 systemd[1]: Finished Crash recovery kernel arming.
Nov 29 00:37:38 np0005539504 systemd[1]: Startup finished in 1.883s (kernel) + 2.866s (initrd) + 1min 56.221s (userspace) = 2min 972ms.
Nov 29 00:46:55 np0005539504 systemd[1]: Created slice User Slice of UID 1000.
Nov 29 00:46:55 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 29 00:46:55 np0005539504 systemd-logind[783]: New session 1 of user zuul.
Nov 29 00:46:55 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 29 00:46:55 np0005539504 systemd[1]: Starting User Manager for UID 1000...
Nov 29 00:46:55 np0005539504 systemd[4305]: Queued start job for default target Main User Target.
Nov 29 00:46:55 np0005539504 systemd[4305]: Created slice User Application Slice.
Nov 29 00:46:55 np0005539504 systemd[4305]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 00:46:55 np0005539504 systemd[4305]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 00:46:55 np0005539504 systemd[4305]: Reached target Paths.
Nov 29 00:46:55 np0005539504 systemd[4305]: Reached target Timers.
Nov 29 00:46:55 np0005539504 systemd[4305]: Starting D-Bus User Message Bus Socket...
Nov 29 00:46:55 np0005539504 systemd[4305]: Starting Create User's Volatile Files and Directories...
Nov 29 00:46:55 np0005539504 systemd[4305]: Listening on D-Bus User Message Bus Socket.
Nov 29 00:46:55 np0005539504 systemd[4305]: Reached target Sockets.
Nov 29 00:46:55 np0005539504 systemd[4305]: Finished Create User's Volatile Files and Directories.
Nov 29 00:46:55 np0005539504 systemd[4305]: Reached target Basic System.
Nov 29 00:46:55 np0005539504 systemd[4305]: Reached target Main User Target.
Nov 29 00:46:55 np0005539504 systemd[4305]: Startup finished in 171ms.
Nov 29 00:46:55 np0005539504 systemd[1]: Started User Manager for UID 1000.
Nov 29 00:46:55 np0005539504 systemd[1]: Started Session 1 of User zuul.
Nov 29 00:46:55 np0005539504 python3[4389]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:47:00 np0005539504 python3[4417]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:47:07 np0005539504 python3[4475]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:47:08 np0005539504 python3[4515]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 29 00:47:10 np0005539504 python3[4541]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDgFkjZfiFEmT2Jql9lLFt6CMd+9slSl3MrU+Raer5Y68zzzczsYHXSYgggBZM5uz+gWk02zu4ocSLCc0JOe4EmLwZGL6Ezoic8MmIXP1BwfiaeAXto2OGK7Dc7os16Q0SND6rHgOqdWZh8Kyf2kkY5vrdl9/yfrpAOV4V0UE16RT1qCQW53Ky9IytfIZYMSXaZwSmcvRflB6YToX0wepfVb3xbVWsEBI209yBpJ9cNVY5dWwvu1IlNXbIGLhUr4j3UgrB2k+H2+ltPlEHfLXPB0E2e43vS9K00XtLqpM4JZoq24L0kLi1a3RwzEeG1NQhkGbdnesYTkGRJrh5LvfWLiF4tooJWI0nRVs7jaO/R3w1l7zjdLRrSJ0h7Ie09iYSVZ1nuUuZ77A8mwh/mgdp8FEle4ES1X0kEADcAPPXV/6wFLOHevKRKw+jWBtYusFM6hS74njbD8BM8P0xMUAgCMIw7t3AXjeZIFNjZLL1o2fplfERitOr2Mc7dMx1EvfM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:10 np0005539504 python3[4565]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:11 np0005539504 python3[4664]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:11 np0005539504 python3[4735]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764395231.0615566-252-192873891019547/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=913550f2748a46ca85451ad1c4228192_id_rsa follow=False checksum=527bb20bedeb4c076b14aeb265edb174c4d8c41f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:12 np0005539504 python3[4858]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:12 np0005539504 python3[4929]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764395231.9890375-307-109694041208852/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=913550f2748a46ca85451ad1c4228192_id_rsa.pub follow=False checksum=56c975ef54c9fc5ba54f09c3deb1770b074b7446 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:13 np0005539504 python3[4977]: ansible-ping Invoked with data=pong
Nov 29 00:47:15 np0005539504 python3[5001]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:47:17 np0005539504 python3[5059]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 29 00:47:18 np0005539504 python3[5091]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:18 np0005539504 python3[5115]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:19 np0005539504 python3[5139]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:19 np0005539504 python3[5163]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:19 np0005539504 python3[5187]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:19 np0005539504 python3[5211]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:21 np0005539504 python3[5238]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:22 np0005539504 python3[5316]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:22 np0005539504 python3[5389]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395241.6818542-32-188069900822807/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:23 np0005539504 python3[5437]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:23 np0005539504 python3[5461]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:23 np0005539504 python3[5485]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:24 np0005539504 python3[5509]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:24 np0005539504 python3[5533]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:24 np0005539504 python3[5557]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:24 np0005539504 python3[5581]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:25 np0005539504 python3[5605]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:25 np0005539504 python3[5629]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:25 np0005539504 python3[5653]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:26 np0005539504 python3[5677]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:26 np0005539504 python3[5701]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:26 np0005539504 python3[5725]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:26 np0005539504 python3[5749]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:27 np0005539504 python3[5773]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:27 np0005539504 python3[5797]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:27 np0005539504 python3[5821]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:28 np0005539504 python3[5845]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:28 np0005539504 python3[5869]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:28 np0005539504 python3[5893]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:29 np0005539504 python3[5917]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:29 np0005539504 python3[5941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:29 np0005539504 python3[5965]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:29 np0005539504 python3[5989]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:30 np0005539504 python3[6013]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:30 np0005539504 python3[6037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:33 np0005539504 python3[6063]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 00:47:33 np0005539504 systemd[1]: Starting Time & Date Service...
Nov 29 00:47:33 np0005539504 systemd[1]: Started Time & Date Service.
Nov 29 00:47:33 np0005539504 systemd-timedated[6065]: Changed time zone to 'UTC' (UTC).
Nov 29 00:47:33 np0005539504 python3[6094]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:34 np0005539504 python3[6170]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:34 np0005539504 python3[6241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764395253.8448846-252-27172972104786/source _original_basename=tmpaagw9bc5 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:35 np0005539504 python3[6341]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:35 np0005539504 python3[6412]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764395254.7334032-303-17471210725/source _original_basename=tmpv877sgks follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:36 np0005539504 python3[6514]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:36 np0005539504 python3[6587]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764395255.9310777-382-166103698208283/source _original_basename=tmpcwnuxgbl follow=False checksum=9afea3fa7e450257b25577284f0f4f0dfca88d28 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:37 np0005539504 python3[6635]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:37 np0005539504 python3[6661]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:37 np0005539504 python3[6741]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:38 np0005539504 python3[6814]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395257.625505-452-216047334082241/source _original_basename=tmpdgnim_h3 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:38 np0005539504 python3[6865]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-d89d-f1f2-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:39 np0005539504 python3[6893]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-d89d-f1f2-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 29 00:47:41 np0005539504 python3[6921]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:48:00 np0005539504 python3[6948]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:48:03 np0005539504 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 00:49:00 np0005539504 systemd-logind[783]: Session 1 logged out. Waiting for processes to exit.
Nov 29 00:49:02 np0005539504 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 00:49:02 np0005539504 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 29 00:49:02 np0005539504 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 29 00:49:02 np0005539504 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 29 00:49:02 np0005539504 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 29 00:49:02 np0005539504 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 29 00:49:02 np0005539504 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 29 00:49:02 np0005539504 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 29 00:49:02 np0005539504 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 29 00:49:02 np0005539504 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.7731] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 00:49:02 np0005539504 systemd-udevd[6952]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.7976] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.8019] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.8026] device (eth1): carrier: link connected
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.8029] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.8037] policy: auto-activating connection 'Wired connection 1' (08b8e8f2-c9d1-3cc8-8887-49c6b7206b68)
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.8042] device (eth1): Activation: starting connection 'Wired connection 1' (08b8e8f2-c9d1-3cc8-8887-49c6b7206b68)
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.8044] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.8049] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.8054] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:49:02 np0005539504 NetworkManager[855]: <info>  [1764395342.8060] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:49:02 np0005539504 systemd[4305]: Starting Mark boot as successful...
Nov 29 00:49:02 np0005539504 systemd[4305]: Finished Mark boot as successful.
Nov 29 00:49:03 np0005539504 systemd-logind[783]: New session 3 of user zuul.
Nov 29 00:49:03 np0005539504 systemd[1]: Started Session 3 of User zuul.
Nov 29 00:49:04 np0005539504 python3[6984]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-fab1-3db9-000000000189-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:49:11 np0005539504 python3[7065]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:49:11 np0005539504 python3[7138]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764395350.7769473-155-183382367629098/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=dea22bf923e04b12a706e3d2a4b83709e22d6e49 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:49:12 np0005539504 python3[7188]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 00:49:12 np0005539504 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 00:49:12 np0005539504 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 00:49:12 np0005539504 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 00:49:12 np0005539504 systemd[1]: Stopping Network Manager...
Nov 29 00:49:12 np0005539504 NetworkManager[855]: <info>  [1764395352.0944] caught SIGTERM, shutting down normally.
Nov 29 00:49:12 np0005539504 NetworkManager[855]: <info>  [1764395352.0958] dhcp4 (eth0): canceled DHCP transaction
Nov 29 00:49:12 np0005539504 NetworkManager[855]: <info>  [1764395352.0958] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:49:12 np0005539504 NetworkManager[855]: <info>  [1764395352.0958] dhcp4 (eth0): state changed no lease
Nov 29 00:49:12 np0005539504 NetworkManager[855]: <info>  [1764395352.0962] manager: NetworkManager state is now CONNECTING
Nov 29 00:49:12 np0005539504 NetworkManager[855]: <info>  [1764395352.1114] dhcp4 (eth1): canceled DHCP transaction
Nov 29 00:49:12 np0005539504 NetworkManager[855]: <info>  [1764395352.1115] dhcp4 (eth1): state changed no lease
Nov 29 00:49:12 np0005539504 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:49:12 np0005539504 NetworkManager[855]: <info>  [1764395352.1209] exiting (success)
Nov 29 00:49:12 np0005539504 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:49:12 np0005539504 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 00:49:12 np0005539504 systemd[1]: Stopped Network Manager.
Nov 29 00:49:12 np0005539504 systemd[1]: NetworkManager.service: Consumed 5.861s CPU time, 10.0M memory peak.
Nov 29 00:49:12 np0005539504 systemd[1]: Starting Network Manager...
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.1986] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:62dd3eed-5b38-4c74-8c8b-95416b2c294d)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.1989] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.2058] manager[0x558d2517a070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 00:49:12 np0005539504 systemd[1]: Starting Hostname Service...
Nov 29 00:49:12 np0005539504 systemd[1]: Started Hostname Service.
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3195] hostname: hostname: using hostnamed
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3195] hostname: static hostname changed from (none) to "np0005539504.novalocal"
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3205] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3211] manager[0x558d2517a070]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3212] manager[0x558d2517a070]: rfkill: WWAN hardware radio set enabled
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3260] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3260] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3261] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3262] manager: Networking is enabled by state file
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3266] settings: Loaded settings plugin: keyfile (internal)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3272] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3321] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3341] dhcp: init: Using DHCP client 'internal'
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3348] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3359] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3370] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3385] device (lo): Activation: starting connection 'lo' (90b43410-4648-4f39-847b-37821e0dfc83)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3397] device (eth0): carrier: link connected
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3406] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3416] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3418] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3432] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3449] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3461] device (eth1): carrier: link connected
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3470] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3484] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (08b8e8f2-c9d1-3cc8-8887-49c6b7206b68) (indicated)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3486] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3500] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3517] device (eth1): Activation: starting connection 'Wired connection 1' (08b8e8f2-c9d1-3cc8-8887-49c6b7206b68)
Nov 29 00:49:12 np0005539504 systemd[1]: Started Network Manager.
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3529] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3537] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3547] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3552] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3557] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3566] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3587] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3592] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3601] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3626] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3634] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3648] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3652] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3677] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3685] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3695] device (lo): Activation: successful, device activated.
Nov 29 00:49:12 np0005539504 systemd[1]: Starting Network Manager Wait Online...
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3715] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3727] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3829] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3858] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3860] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3866] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3873] device (eth0): Activation: successful, device activated.
Nov 29 00:49:12 np0005539504 NetworkManager[7197]: <info>  [1764395352.3882] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 00:49:12 np0005539504 python3[7273]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-fab1-3db9-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:49:22 np0005539504 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:49:42 np0005539504 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.0662] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 00:49:58 np0005539504 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:49:58 np0005539504 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1031] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1034] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1043] device (eth1): Activation: successful, device activated.
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1049] manager: startup complete
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1051] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <warn>  [1764395398.1056] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1064] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 29 00:49:58 np0005539504 systemd[1]: Finished Network Manager Wait Online.
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1174] dhcp4 (eth1): canceled DHCP transaction
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1174] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1175] dhcp4 (eth1): state changed no lease
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1188] policy: auto-activating connection 'ci-private-network' (dd13faa5-78cc-5ac2-955b-7137968ec8d0)
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1191] device (eth1): Activation: starting connection 'ci-private-network' (dd13faa5-78cc-5ac2-955b-7137968ec8d0)
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1192] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1194] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1199] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1205] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1257] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1259] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 00:49:58 np0005539504 NetworkManager[7197]: <info>  [1764395398.1263] device (eth1): Activation: successful, device activated.
Nov 29 00:50:08 np0005539504 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:50:12 np0005539504 systemd[1]: session-3.scope: Deactivated successfully.
Nov 29 00:50:12 np0005539504 systemd[1]: session-3.scope: Consumed 1.757s CPU time.
Nov 29 00:50:12 np0005539504 systemd-logind[783]: Session 3 logged out. Waiting for processes to exit.
Nov 29 00:50:12 np0005539504 systemd-logind[783]: Removed session 3.
Nov 29 00:50:34 np0005539504 systemd[1]: Starting dnf makecache...
Nov 29 00:50:34 np0005539504 dnf[7303]: Failed determining last makecache time.
Nov 29 00:50:35 np0005539504 dnf[7303]: CentOS Stream 9 - BaseOS                         29 kB/s | 7.3 kB     00:00
Nov 29 00:50:35 np0005539504 dnf[7303]: CentOS Stream 9 - AppStream                      83 kB/s | 7.4 kB     00:00
Nov 29 00:50:35 np0005539504 dnf[7303]: CentOS Stream 9 - CRB                            47 kB/s | 7.2 kB     00:00
Nov 29 00:50:35 np0005539504 dnf[7303]: CentOS Stream 9 - Extras packages                86 kB/s | 8.3 kB     00:00
Nov 29 00:50:35 np0005539504 dnf[7303]: Metadata cache created.
Nov 29 00:50:35 np0005539504 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 00:50:35 np0005539504 systemd[1]: Finished dnf makecache.
Nov 29 00:50:44 np0005539504 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 29 00:50:44 np0005539504 systemd-logind[783]: New session 4 of user zuul.
Nov 29 00:50:44 np0005539504 systemd[1]: Started Session 4 of User zuul.
Nov 29 00:50:44 np0005539504 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 29 00:50:44 np0005539504 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 29 00:50:44 np0005539504 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 29 00:50:44 np0005539504 python3[7395]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:50:45 np0005539504 python3[7468]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395444.4725235-365-146693367599748/source _original_basename=tmp9vdenkti follow=False checksum=202951a95d8ea5ab635db917083142dc6b9b32e4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:50:48 np0005539504 systemd[1]: session-4.scope: Deactivated successfully.
Nov 29 00:50:48 np0005539504 systemd-logind[783]: Session 4 logged out. Waiting for processes to exit.
Nov 29 00:50:48 np0005539504 systemd-logind[783]: Removed session 4.
Nov 29 00:52:34 np0005539504 systemd[4305]: Created slice User Background Tasks Slice.
Nov 29 00:52:34 np0005539504 systemd[4305]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 00:52:34 np0005539504 systemd[4305]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 00:56:27 np0005539504 systemd-logind[783]: New session 5 of user zuul.
Nov 29 00:56:27 np0005539504 systemd[1]: Started Session 5 of User zuul.
Nov 29 00:56:27 np0005539504 python3[7526]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-2d4e-d9c7-000000000ca4-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:28 np0005539504 python3[7555]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:28 np0005539504 python3[7581]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:28 np0005539504 python3[7607]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:28 np0005539504 python3[7633]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:29 np0005539504 python3[7659]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:29 np0005539504 python3[7737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:56:30 np0005539504 python3[7810]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395789.6225512-365-266065808738908/source _original_basename=tmp5begdbg7 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:31 np0005539504 python3[7860]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 00:56:31 np0005539504 systemd[1]: Reloading.
Nov 29 00:56:31 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:56:32 np0005539504 python3[7916]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 29 00:56:33 np0005539504 python3[7942]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:33 np0005539504 python3[7970]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:34 np0005539504 python3[7998]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:34 np0005539504 python3[8026]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:34 np0005539504 python3[8053]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-2d4e-d9c7-000000000cab-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:35 np0005539504 python3[8083]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 00:56:38 np0005539504 systemd[1]: session-5.scope: Deactivated successfully.
Nov 29 00:56:38 np0005539504 systemd[1]: session-5.scope: Consumed 4.470s CPU time.
Nov 29 00:56:38 np0005539504 systemd-logind[783]: Session 5 logged out. Waiting for processes to exit.
Nov 29 00:56:38 np0005539504 systemd-logind[783]: Removed session 5.
Nov 29 00:56:40 np0005539504 systemd-logind[783]: New session 6 of user zuul.
Nov 29 00:56:40 np0005539504 systemd[1]: Started Session 6 of User zuul.
Nov 29 00:56:41 np0005539504 python3[8116]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 00:57:09 np0005539504 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:57:09 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:57:09 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:57:09 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:57:09 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:57:09 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:57:09 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:57:09 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:57:21 np0005539504 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:57:21 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:57:21 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:57:21 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:57:21 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:57:21 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:57:21 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:57:21 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:57:32 np0005539504 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:57:32 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:57:32 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:57:32 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:57:32 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:57:32 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:57:32 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:57:32 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:57:33 np0005539504 setsebool[8176]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 29 00:57:33 np0005539504 setsebool[8176]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 29 00:57:46 np0005539504 kernel: SELinux:  Converting 388 SID table entries...
Nov 29 00:57:46 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:57:46 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:57:46 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:57:46 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:57:46 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:57:46 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:57:46 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:58:04 np0005539504 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 00:58:04 np0005539504 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 00:58:04 np0005539504 systemd[1]: Starting man-db-cache-update.service...
Nov 29 00:58:04 np0005539504 systemd[1]: Reloading.
Nov 29 00:58:04 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:58:04 np0005539504 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 00:58:07 np0005539504 python3[10301]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-965f-b5ec-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:58:07 np0005539504 kernel: evm: overlay not supported
Nov 29 00:58:07 np0005539504 systemd[4305]: Starting D-Bus User Message Bus...
Nov 29 00:58:07 np0005539504 dbus-broker-launch[11544]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 29 00:58:07 np0005539504 dbus-broker-launch[11544]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 29 00:58:07 np0005539504 systemd[4305]: Started D-Bus User Message Bus.
Nov 29 00:58:07 np0005539504 dbus-broker-lau[11544]: Ready
Nov 29 00:58:07 np0005539504 systemd[4305]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 00:58:07 np0005539504 systemd[4305]: Created slice Slice /user.
Nov 29 00:58:07 np0005539504 systemd[4305]: podman-11418.scope: unit configures an IP firewall, but not running as root.
Nov 29 00:58:07 np0005539504 systemd[4305]: (This warning is only shown for the first unit using IP firewalling.)
Nov 29 00:58:07 np0005539504 systemd[4305]: Started podman-11418.scope.
Nov 29 00:58:08 np0005539504 systemd[4305]: Started podman-pause-65d55bea.scope.
Nov 29 00:58:08 np0005539504 python3[12091]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.97:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.97:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:58:08 np0005539504 python3[12091]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 29 00:58:09 np0005539504 systemd[1]: session-6.scope: Deactivated successfully.
Nov 29 00:58:09 np0005539504 systemd[1]: session-6.scope: Consumed 1min 9.086s CPU time.
Nov 29 00:58:09 np0005539504 systemd-logind[783]: Session 6 logged out. Waiting for processes to exit.
Nov 29 00:58:09 np0005539504 systemd-logind[783]: Removed session 6.
Nov 29 00:58:24 np0005539504 irqbalance[778]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 29 00:58:24 np0005539504 irqbalance[778]: IRQ 27 affinity is now unmanaged
Nov 29 00:58:34 np0005539504 systemd-logind[783]: New session 7 of user zuul.
Nov 29 00:58:34 np0005539504 systemd[1]: Started Session 7 of User zuul.
Nov 29 00:58:34 np0005539504 python3[21306]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3iUg7wOJDjLm9TipkwWPon/M1FO0neD/5ezFHnUJBmbFpPtrL/PoM+teNA62c4mAkgQYtVxx4T3bRgPp78cTw= zuul@np0005539502.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:58:35 np0005539504 python3[21469]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3iUg7wOJDjLm9TipkwWPon/M1FO0neD/5ezFHnUJBmbFpPtrL/PoM+teNA62c4mAkgQYtVxx4T3bRgPp78cTw= zuul@np0005539502.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:58:36 np0005539504 python3[21777]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005539504.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 29 00:58:36 np0005539504 python3[22028]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3iUg7wOJDjLm9TipkwWPon/M1FO0neD/5ezFHnUJBmbFpPtrL/PoM+teNA62c4mAkgQYtVxx4T3bRgPp78cTw= zuul@np0005539502.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:58:37 np0005539504 python3[22259]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:58:37 np0005539504 python3[22458]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395917.0142052-168-223160239998151/source _original_basename=tmp2o_qqqwx follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:58:38 np0005539504 python3[22791]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Nov 29 00:58:38 np0005539504 systemd[1]: Starting Hostname Service...
Nov 29 00:58:38 np0005539504 systemd[1]: Started Hostname Service.
Nov 29 00:58:38 np0005539504 systemd-hostnamed[22887]: Changed pretty hostname to 'compute-1'
Nov 29 00:58:38 np0005539504 systemd-hostnamed[22887]: Hostname set to <compute-1> (static)
Nov 29 00:58:38 np0005539504 NetworkManager[7197]: <info>  [1764395918.7812] hostname: static hostname changed from "np0005539504.novalocal" to "compute-1"
Nov 29 00:58:38 np0005539504 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:58:38 np0005539504 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:58:39 np0005539504 systemd[1]: session-7.scope: Deactivated successfully.
Nov 29 00:58:39 np0005539504 systemd[1]: session-7.scope: Consumed 2.644s CPU time.
Nov 29 00:58:39 np0005539504 systemd-logind[783]: Session 7 logged out. Waiting for processes to exit.
Nov 29 00:58:39 np0005539504 systemd-logind[783]: Removed session 7.
Nov 29 00:58:44 np0005539504 irqbalance[778]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 29 00:58:44 np0005539504 irqbalance[778]: IRQ 26 affinity is now unmanaged
Nov 29 00:58:48 np0005539504 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:59:02 np0005539504 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 00:59:02 np0005539504 systemd[1]: Finished man-db-cache-update.service.
Nov 29 00:59:02 np0005539504 systemd[1]: man-db-cache-update.service: Consumed 1min 10.781s CPU time.
Nov 29 00:59:02 np0005539504 systemd[1]: run-rcdc38e7377ec4bf7a2287bbb4cd7c473.service: Deactivated successfully.
Nov 29 00:59:08 np0005539504 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 01:02:57 np0005539504 systemd-logind[783]: New session 8 of user zuul.
Nov 29 01:02:57 np0005539504 systemd[1]: Started Session 8 of User zuul.
Nov 29 01:02:58 np0005539504 python3[30022]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:03:01 np0005539504 python3[30138]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:01 np0005539504 python3[30211]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6863236-34085-57078216822253/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:01 np0005539504 python3[30237]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:02 np0005539504 python3[30310]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6863236-34085-57078216822253/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:02 np0005539504 python3[30336]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:02 np0005539504 python3[30409]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6863236-34085-57078216822253/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:03 np0005539504 python3[30435]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:03 np0005539504 python3[30508]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6863236-34085-57078216822253/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:03 np0005539504 python3[30534]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:04 np0005539504 python3[30607]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6863236-34085-57078216822253/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:04 np0005539504 python3[30633]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:04 np0005539504 python3[30706]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6863236-34085-57078216822253/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:05 np0005539504 python3[30732]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:05 np0005539504 python3[30805]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6863236-34085-57078216822253/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:14 np0005539504 python3[30853]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:08:13 np0005539504 systemd-logind[783]: Session 8 logged out. Waiting for processes to exit.
Nov 29 01:08:13 np0005539504 systemd[1]: session-8.scope: Deactivated successfully.
Nov 29 01:08:13 np0005539504 systemd[1]: session-8.scope: Consumed 5.759s CPU time.
Nov 29 01:08:13 np0005539504 systemd-logind[783]: Removed session 8.
Nov 29 01:17:06 np0005539504 systemd-logind[783]: New session 9 of user zuul.
Nov 29 01:17:06 np0005539504 systemd[1]: Started Session 9 of User zuul.
Nov 29 01:17:07 np0005539504 python3.9[31070]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:09 np0005539504 python3.9[31253]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:17:17 np0005539504 systemd[1]: session-9.scope: Deactivated successfully.
Nov 29 01:17:17 np0005539504 systemd[1]: session-9.scope: Consumed 8.858s CPU time.
Nov 29 01:17:17 np0005539504 systemd-logind[783]: Session 9 logged out. Waiting for processes to exit.
Nov 29 01:17:17 np0005539504 systemd-logind[783]: Removed session 9.
Nov 29 01:17:33 np0005539504 systemd-logind[783]: New session 10 of user zuul.
Nov 29 01:17:33 np0005539504 systemd[1]: Started Session 10 of User zuul.
Nov 29 01:17:33 np0005539504 python3.9[31468]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 01:17:35 np0005539504 python3.9[31642]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:36 np0005539504 python3.9[31794]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:17:37 np0005539504 python3.9[31947]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:17:38 np0005539504 python3.9[32099]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:17:39 np0005539504 python3.9[32251]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:17:39 np0005539504 python3.9[32374]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397058.6257708-183-154062231666326/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:17:40 np0005539504 python3.9[32526]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:41 np0005539504 python3.9[32682]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:17:42 np0005539504 python3.9[32834]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:17:43 np0005539504 python3.9[32984]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:17:47 np0005539504 python3.9[33239]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:17:48 np0005539504 python3.9[33389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:49 np0005539504 python3.9[33543]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:50 np0005539504 python3.9[33701]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:17:51 np0005539504 python3.9[33785]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:18:39 np0005539504 systemd[1]: Reloading.
Nov 29 01:18:39 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:39 np0005539504 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 29 01:18:40 np0005539504 systemd[1]: Reloading.
Nov 29 01:18:40 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:40 np0005539504 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 29 01:18:40 np0005539504 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 29 01:18:40 np0005539504 systemd[1]: Reloading.
Nov 29 01:18:40 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:40 np0005539504 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 29 01:18:41 np0005539504 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Nov 29 01:18:41 np0005539504 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Nov 29 01:18:41 np0005539504 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Nov 29 01:19:54 np0005539504 kernel: SELinux:  Converting 2717 SID table entries...
Nov 29 01:19:54 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:19:54 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:19:54 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:19:54 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:19:54 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:19:54 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:19:54 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:19:54 np0005539504 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 29 01:19:54 np0005539504 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:19:54 np0005539504 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:19:54 np0005539504 systemd[1]: Reloading.
Nov 29 01:19:54 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:54 np0005539504 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:19:56 np0005539504 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:19:56 np0005539504 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:19:56 np0005539504 systemd[1]: man-db-cache-update.service: Consumed 1.415s CPU time.
Nov 29 01:19:56 np0005539504 systemd[1]: run-r98d920ea32b54255b2f6f8ff52298c32.service: Deactivated successfully.
Nov 29 01:20:01 np0005539504 python3.9[35346]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:20:04 np0005539504 python3.9[35629]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 01:20:05 np0005539504 python3.9[35782]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 01:20:13 np0005539504 python3.9[35939]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:20:15 np0005539504 python3.9[36093]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 01:20:19 np0005539504 python3.9[36248]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:20:27 np0005539504 python3.9[36400]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:20:28 np0005539504 python3.9[36523]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397219.87223-672-35426807648851/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:20:29 np0005539504 python3.9[36675]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:20:30 np0005539504 python3.9[36827]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:20:31 np0005539504 python3.9[36980]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:20:32 np0005539504 python3.9[37132]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 01:20:32 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:20:32 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:20:33 np0005539504 python3.9[37286]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:20:34 np0005539504 python3.9[37444]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:20:36 np0005539504 python3.9[37604]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 01:20:36 np0005539504 python3.9[37757]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:20:37 np0005539504 python3.9[37915]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 01:20:38 np0005539504 python3.9[38067]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:20:42 np0005539504 python3.9[38221]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:20:43 np0005539504 python3.9[38373]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:20:44 np0005539504 python3.9[38496]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397242.976406-1030-229293921970121/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:20:45 np0005539504 python3.9[38648]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:20:45 np0005539504 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:20:45 np0005539504 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 29 01:20:45 np0005539504 kernel: Bridge firewalling registered
Nov 29 01:20:45 np0005539504 systemd-modules-load[38652]: Inserted module 'br_netfilter'
Nov 29 01:20:45 np0005539504 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:20:46 np0005539504 python3.9[38807]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:20:46 np0005539504 python3.9[38930]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397245.7154617-1098-55221029636523/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:20:47 np0005539504 python3.9[39082]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:20:51 np0005539504 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Nov 29 01:20:51 np0005539504 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Nov 29 01:20:52 np0005539504 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:20:52 np0005539504 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:20:52 np0005539504 systemd[1]: Reloading.
Nov 29 01:20:52 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:20:52 np0005539504 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:20:55 np0005539504 python3.9[41917]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:20:56 np0005539504 python3.9[42815]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 01:20:56 np0005539504 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:20:56 np0005539504 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:20:56 np0005539504 systemd[1]: man-db-cache-update.service: Consumed 5.601s CPU time.
Nov 29 01:20:56 np0005539504 systemd[1]: run-ra71208b81d8b4fa0bf833d664617fcda.service: Deactivated successfully.
Nov 29 01:20:57 np0005539504 python3.9[43158]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:20:58 np0005539504 python3.9[43310]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:20:58 np0005539504 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:20:58 np0005539504 systemd[1]: Starting Authorization Manager...
Nov 29 01:20:58 np0005539504 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:20:58 np0005539504 polkitd[43527]: Started polkitd version 0.117
Nov 29 01:20:58 np0005539504 systemd[1]: Started Authorization Manager.
Nov 29 01:20:59 np0005539504 python3.9[43697]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:20:59 np0005539504 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 01:20:59 np0005539504 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 01:20:59 np0005539504 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 01:20:59 np0005539504 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:21:00 np0005539504 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:21:00 np0005539504 python3.9[43859]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 01:21:04 np0005539504 python3.9[44013]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:21:04 np0005539504 systemd[1]: Reloading.
Nov 29 01:21:04 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:21:05 np0005539504 python3.9[44203]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:21:05 np0005539504 systemd[1]: Reloading.
Nov 29 01:21:05 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:21:06 np0005539504 python3.9[44392]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:07 np0005539504 python3.9[44545]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:07 np0005539504 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 29 01:21:08 np0005539504 python3.9[44698]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:10 np0005539504 python3.9[44860]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:11 np0005539504 python3.9[45013]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:21:11 np0005539504 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 01:21:11 np0005539504 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 01:21:11 np0005539504 systemd[1]: Stopping Apply Kernel Variables...
Nov 29 01:21:11 np0005539504 systemd[1]: Starting Apply Kernel Variables...
Nov 29 01:21:11 np0005539504 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 01:21:11 np0005539504 systemd[1]: Finished Apply Kernel Variables.
Nov 29 01:21:11 np0005539504 systemd[1]: session-10.scope: Deactivated successfully.
Nov 29 01:21:11 np0005539504 systemd[1]: session-10.scope: Consumed 2min 29.634s CPU time.
Nov 29 01:21:11 np0005539504 systemd-logind[783]: Session 10 logged out. Waiting for processes to exit.
Nov 29 01:21:11 np0005539504 systemd-logind[783]: Removed session 10.
Nov 29 01:21:16 np0005539504 systemd-logind[783]: New session 11 of user zuul.
Nov 29 01:21:16 np0005539504 systemd[1]: Started Session 11 of User zuul.
Nov 29 01:21:17 np0005539504 python3.9[45200]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:18 np0005539504 python3.9[45354]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:20 np0005539504 python3.9[45512]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:22 np0005539504 python3.9[45663]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:23 np0005539504 python3.9[45823]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:21:23 np0005539504 python3.9[45907]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:21:26 np0005539504 python3.9[46060]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:21:27 np0005539504 python3.9[46231]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:21:28 np0005539504 python3.9[46383]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:28 np0005539504 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck3899736199-merged.mount: Deactivated successfully.
Nov 29 01:21:29 np0005539504 podman[46384]: 2025-11-29 06:21:29.174782093 +0000 UTC m=+0.754577242 system refresh
Nov 29 01:21:29 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:21:30 np0005539504 python3.9[46545]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:21:30 np0005539504 python3.9[46668]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397289.3993216-293-209035360255602/.source.json follow=False _original_basename=podman_network_config.j2 checksum=0a181540ddd4011e04769f696b0c6314729dd2f5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:21:31 np0005539504 python3.9[46820]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:21:32 np0005539504 python3.9[46943]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397290.9238608-338-182632447590520/.source.conf follow=False _original_basename=registries.conf.j2 checksum=25aa6c560e50dcbd81b989ea46a7865cb55b8998 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:32 np0005539504 python3.9[47095]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:33 np0005539504 python3.9[47247]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:34 np0005539504 python3.9[47399]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:34 np0005539504 python3.9[47551]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:36 np0005539504 python3.9[47701]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:36 np0005539504 python3.9[47855]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:39 np0005539504 python3.9[48008]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:43 np0005539504 python3.9[48168]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:45 np0005539504 python3.9[48321]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:49 np0005539504 python3.9[48474]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:52 np0005539504 python3.9[48630]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:57 np0005539504 python3.9[48799]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:59 np0005539504 python3.9[48952]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:22:23 np0005539504 python3.9[49295]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:22:25 np0005539504 python3.9[49453]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:22:26 np0005539504 python3.9[49628]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:22:27 np0005539504 python3.9[49751]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764397346.1647635-782-276839957033802/.source.json _original_basename=.yoyms68d follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:22:28 np0005539504 python3.9[49903]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:22:28 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:31 np0005539504 systemd[1]: var-lib-containers-storage-overlay-compat2678491888-lower\x2dmapped.mount: Deactivated successfully.
Nov 29 01:22:39 np0005539504 podman[49914]: 2025-11-29 06:22:39.244447915 +0000 UTC m=+10.585955022 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:22:39 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:39 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:39 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:45 np0005539504 python3.9[50217]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:22:45 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:47 np0005539504 podman[50229]: 2025-11-29 06:22:47.855182697 +0000 UTC m=+2.605564492 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:22:47 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:47 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:47 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:49 np0005539504 python3.9[50464]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:22:49 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:13 np0005539504 podman[50477]: 2025-11-29 06:23:13.637297389 +0000 UTC m=+24.223061817 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:23:13 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:13 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:13 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:30 np0005539504 python3.9[50761]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:23:30 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:33 np0005539504 podman[50773]: 2025-11-29 06:23:33.53692871 +0000 UTC m=+3.058372448 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 29 01:23:33 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:33 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:33 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:34 np0005539504 python3.9[51027]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:23:34 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:35 np0005539504 podman[51040]: 2025-11-29 06:23:35.890065107 +0000 UTC m=+1.203924866 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 29 01:23:35 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:35 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:36 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:43 np0005539504 systemd[1]: session-11.scope: Deactivated successfully.
Nov 29 01:23:43 np0005539504 systemd[1]: session-11.scope: Consumed 2min 5.439s CPU time.
Nov 29 01:23:43 np0005539504 systemd-logind[783]: Session 11 logged out. Waiting for processes to exit.
Nov 29 01:23:43 np0005539504 systemd-logind[783]: Removed session 11.
Nov 29 01:23:49 np0005539504 systemd-logind[783]: New session 12 of user zuul.
Nov 29 01:23:49 np0005539504 systemd[1]: Started Session 12 of User zuul.
Nov 29 01:23:51 np0005539504 python3.9[51339]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:23:52 np0005539504 python3.9[51495]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 01:23:54 np0005539504 python3.9[51648]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:24:03 np0005539504 python3.9[51810]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:24:06 np0005539504 python3.9[51971]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:24:07 np0005539504 python3.9[52056]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:24:13 np0005539504 python3.9[52218]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:31 np0005539504 kernel: SELinux:  Converting 2730 SID table entries...
Nov 29 01:24:31 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:24:31 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:24:31 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:24:31 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:24:31 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:24:31 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:24:31 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:24:34 np0005539504 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 29 01:24:34 np0005539504 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 29 01:24:37 np0005539504 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:24:37 np0005539504 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:24:37 np0005539504 systemd[1]: Reloading.
Nov 29 01:24:37 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:24:37 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:24:37 np0005539504 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:24:40 np0005539504 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:24:40 np0005539504 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:24:40 np0005539504 systemd[1]: run-racc86f69e7514846b5d7d179322c8a8f.service: Deactivated successfully.
Nov 29 01:24:41 np0005539504 python3.9[53328]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:24:41 np0005539504 systemd[1]: Reloading.
Nov 29 01:24:41 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:24:41 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:24:42 np0005539504 systemd[1]: Starting Open vSwitch Database Unit...
Nov 29 01:24:42 np0005539504 chown[53371]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 29 01:24:42 np0005539504 ovs-ctl[53376]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 29 01:24:42 np0005539504 ovs-ctl[53376]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 29 01:24:42 np0005539504 ovs-ctl[53376]: Starting ovsdb-server [  OK  ]
Nov 29 01:24:42 np0005539504 ovs-vsctl[53426]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 29 01:24:42 np0005539504 ovs-vsctl[53446]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"a43628b3-9efd-4940-9509-686038e16aeb\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 29 01:24:42 np0005539504 ovs-ctl[53376]: Configuring Open vSwitch system IDs [  OK  ]
Nov 29 01:24:42 np0005539504 ovs-ctl[53376]: Enabling remote OVSDB managers [  OK  ]
Nov 29 01:24:42 np0005539504 ovs-vsctl[53452]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 29 01:24:42 np0005539504 systemd[1]: Started Open vSwitch Database Unit.
Nov 29 01:24:42 np0005539504 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 29 01:24:42 np0005539504 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 29 01:24:42 np0005539504 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 29 01:24:42 np0005539504 kernel: openvswitch: Open vSwitch switching datapath
Nov 29 01:24:42 np0005539504 ovs-ctl[53497]: Inserting openvswitch module [  OK  ]
Nov 29 01:24:42 np0005539504 ovs-ctl[53466]: Starting ovs-vswitchd [  OK  ]
Nov 29 01:24:42 np0005539504 ovs-vsctl[53514]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 29 01:24:42 np0005539504 ovs-ctl[53466]: Enabling remote OVSDB managers [  OK  ]
Nov 29 01:24:42 np0005539504 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 29 01:24:42 np0005539504 systemd[1]: Starting Open vSwitch...
Nov 29 01:24:42 np0005539504 systemd[1]: Finished Open vSwitch.
Nov 29 01:24:43 np0005539504 python3.9[53666]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:44 np0005539504 python3.9[53818]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 01:24:46 np0005539504 kernel: SELinux:  Converting 2744 SID table entries...
Nov 29 01:24:46 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:24:46 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:24:46 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:24:46 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:24:46 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:24:46 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:24:46 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:24:47 np0005539504 python3.9[53974]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:48 np0005539504 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 29 01:24:48 np0005539504 python3.9[54132]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:53 np0005539504 python3.9[54285]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:24:54 np0005539504 python3.9[54572]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:24:55 np0005539504 python3.9[54722]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:24:56 np0005539504 python3.9[54876]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:58 np0005539504 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:24:58 np0005539504 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:24:58 np0005539504 systemd[1]: Reloading.
Nov 29 01:24:58 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:24:58 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:24:58 np0005539504 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:24:58 np0005539504 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:24:58 np0005539504 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:24:58 np0005539504 systemd[1]: run-r8af830d232d1479b903067b8d6c36805.service: Deactivated successfully.
Nov 29 01:25:02 np0005539504 python3.9[55192]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:25:02 np0005539504 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 01:25:02 np0005539504 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 01:25:02 np0005539504 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 01:25:02 np0005539504 NetworkManager[7197]: <info>  [1764397502.7994] caught SIGTERM, shutting down normally.
Nov 29 01:25:02 np0005539504 NetworkManager[7197]: <info>  [1764397502.8013] dhcp4 (eth0): canceled DHCP transaction
Nov 29 01:25:02 np0005539504 NetworkManager[7197]: <info>  [1764397502.8013] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:25:02 np0005539504 NetworkManager[7197]: <info>  [1764397502.8013] dhcp4 (eth0): state changed no lease
Nov 29 01:25:02 np0005539504 NetworkManager[7197]: <info>  [1764397502.8016] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 01:25:02 np0005539504 systemd[1]: Stopping Network Manager...
Nov 29 01:25:02 np0005539504 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 01:25:02 np0005539504 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 01:25:03 np0005539504 NetworkManager[7197]: <info>  [1764397503.2389] exiting (success)
Nov 29 01:25:03 np0005539504 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 01:25:03 np0005539504 systemd[1]: Stopped Network Manager.
Nov 29 01:25:03 np0005539504 systemd[1]: NetworkManager.service: Consumed 16.669s CPU time, 4.3M memory peak, read 0B from disk, written 42.0K to disk.
Nov 29 01:25:03 np0005539504 systemd[1]: Starting Network Manager...
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.3165] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:62dd3eed-5b38-4c74-8c8b-95416b2c294d)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.3166] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.3234] manager[0x56395dc16090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 01:25:03 np0005539504 systemd[1]: Starting Hostname Service...
Nov 29 01:25:03 np0005539504 systemd[1]: Started Hostname Service.
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4267] hostname: hostname: using hostnamed
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4268] hostname: static hostname changed from (none) to "compute-1"
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4274] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4282] manager[0x56395dc16090]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4282] manager[0x56395dc16090]: rfkill: WWAN hardware radio set enabled
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4305] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4314] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4315] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4315] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4315] manager: Networking is enabled by state file
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4318] settings: Loaded settings plugin: keyfile (internal)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4321] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4348] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4359] dhcp: init: Using DHCP client 'internal'
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4361] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4366] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4371] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4380] device (lo): Activation: starting connection 'lo' (90b43410-4648-4f39-847b-37821e0dfc83)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4387] device (eth0): carrier: link connected
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4390] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4395] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4396] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4402] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4409] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4417] device (eth1): carrier: link connected
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4420] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4425] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (dd13faa5-78cc-5ac2-955b-7137968ec8d0) (indicated)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4425] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4430] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4436] device (eth1): Activation: starting connection 'ci-private-network' (dd13faa5-78cc-5ac2-955b-7137968ec8d0)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4442] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 01:25:03 np0005539504 systemd[1]: Started Network Manager.
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4450] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4452] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4454] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4456] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4459] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4461] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4463] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4467] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4474] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4477] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4486] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4498] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4510] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.4517] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 01:25:03 np0005539504 systemd[1]: Starting Network Manager Wait Online...
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5082] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5094] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5099] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5100] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5107] device (lo): Activation: successful, device activated.
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5112] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5116] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5118] device (eth1): Activation: successful, device activated.
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5129] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5130] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5133] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5135] device (eth0): Activation: successful, device activated.
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5141] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 01:25:03 np0005539504 NetworkManager[55210]: <info>  [1764397503.5192] manager: startup complete
Nov 29 01:25:03 np0005539504 systemd[1]: Finished Network Manager Wait Online.
Nov 29 01:25:05 np0005539504 python3.9[55419]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:25:13 np0005539504 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 01:25:17 np0005539504 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:25:17 np0005539504 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:25:17 np0005539504 systemd[1]: Reloading.
Nov 29 01:25:18 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:25:18 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:25:18 np0005539504 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:25:23 np0005539504 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:25:23 np0005539504 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:25:23 np0005539504 systemd[1]: run-r15c8f709acc64c84a53ec1b257189b87.service: Deactivated successfully.
Nov 29 01:25:24 np0005539504 python3.9[55885]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:25:25 np0005539504 python3.9[56037]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:26 np0005539504 python3.9[56191]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:26 np0005539504 python3.9[56343]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:27 np0005539504 python3.9[56495]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:28 np0005539504 python3.9[56647]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:28 np0005539504 python3.9[56799]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:29 np0005539504 python3.9[56922]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397528.4738946-654-270338434080175/.source _original_basename=.uzqvgcuw follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:33 np0005539504 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 01:25:33 np0005539504 python3.9[57078]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:34 np0005539504 python3.9[57232]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 29 01:25:35 np0005539504 python3.9[57384]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:38 np0005539504 python3.9[57813]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 29 01:25:39 np0005539504 ansible-async_wrapper.py[57990]: Invoked with j800520167388 300 /home/zuul/.ansible/tmp/ansible-tmp-1764397538.4665995-852-66916443273308/AnsiballZ_edpm_os_net_config.py _
Nov 29 01:25:39 np0005539504 ansible-async_wrapper.py[57993]: Starting module and watcher
Nov 29 01:25:39 np0005539504 ansible-async_wrapper.py[57993]: Start watching 57994 (300)
Nov 29 01:25:39 np0005539504 ansible-async_wrapper.py[57994]: Start module (57994)
Nov 29 01:25:39 np0005539504 ansible-async_wrapper.py[57990]: Return async_wrapper task started.
Nov 29 01:25:39 np0005539504 python3.9[57995]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 29 01:25:40 np0005539504 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 29 01:25:40 np0005539504 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 29 01:25:40 np0005539504 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 29 01:25:40 np0005539504 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 29 01:25:40 np0005539504 kernel: cfg80211: failed to load regulatory.db
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.5614] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.5629] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6156] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6158] audit: op="connection-add" uuid="ee841e09-f651-45f8-b4ff-0d4128c39c10" name="br-ex-br" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6177] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6179] audit: op="connection-add" uuid="c1837419-ec04-430e-b1be-c9a323630ce8" name="br-ex-port" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6193] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6195] audit: op="connection-add" uuid="690c4500-e5db-4809-a4e8-6f0371c79c7c" name="eth1-port" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6207] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6208] audit: op="connection-add" uuid="41114aa5-0c00-4410-bf06-0e93793f81a3" name="vlan20-port" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6222] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6224] audit: op="connection-add" uuid="a2a4f14a-d646-4b44-b1a8-9ee4b85b2f45" name="vlan21-port" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6237] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6238] audit: op="connection-add" uuid="7870deca-980f-4d6c-9018-802f0fe90a69" name="vlan22-port" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6265] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6284] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6285] audit: op="connection-add" uuid="6c214311-eaa4-4a5c-8f6e-e05eb63965dc" name="br-ex-if" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6669] audit: op="connection-update" uuid="dd13faa5-78cc-5ac2-955b-7137968ec8d0" name="ci-private-network" args="ovs-interface.type,ipv4.routing-rules,ipv4.addresses,ipv4.method,ipv4.never-default,ipv4.dns,ipv4.routes,ipv6.method,ipv6.addresses,ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.dns,ipv6.routes,connection.master,connection.port-type,connection.controller,connection.slave-type,connection.timestamp,ovs-external-ids.data" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6695] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6697] audit: op="connection-add" uuid="8ce6c609-bc63-49fb-a162-dfacb590b537" name="vlan20-if" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6715] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6716] audit: op="connection-add" uuid="80075f6e-be25-41bb-9647-4369a6b3f1fc" name="vlan21-if" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6735] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6736] audit: op="connection-add" uuid="5ab666d8-0486-4356-90f3-7410783a04be" name="vlan22-if" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6752] audit: op="connection-delete" uuid="08b8e8f2-c9d1-3cc8-8887-49c6b7206b68" name="Wired connection 1" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6767] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6782] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6788] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (ee841e09-f651-45f8-b4ff-0d4128c39c10)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6789] audit: op="connection-activate" uuid="ee841e09-f651-45f8-b4ff-0d4128c39c10" name="br-ex-br" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6793] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6801] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6806] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (c1837419-ec04-430e-b1be-c9a323630ce8)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6810] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6818] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6826] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (690c4500-e5db-4809-a4e8-6f0371c79c7c)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6829] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6838] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6846] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (41114aa5-0c00-4410-bf06-0e93793f81a3)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6848] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6859] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6864] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (a2a4f14a-d646-4b44-b1a8-9ee4b85b2f45)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6868] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6878] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6886] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (7870deca-980f-4d6c-9018-802f0fe90a69)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6888] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6891] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6895] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6903] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6910] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6914] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (6c214311-eaa4-4a5c-8f6e-e05eb63965dc)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6915] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6919] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6921] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6923] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6924] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6939] device (eth1): disconnecting for new activation request.
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6940] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6944] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6947] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6948] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6951] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6956] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6961] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (8ce6c609-bc63-49fb-a162-dfacb590b537)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6962] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6965] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6968] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6969] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6972] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6978] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6982] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (80075f6e-be25-41bb-9647-4369a6b3f1fc)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6983] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6986] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6988] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6990] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6994] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.6998] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7004] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (5ab666d8-0486-4356-90f3-7410783a04be)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7005] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7007] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7010] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7012] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7014] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7030] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu,connection.autoconnect-priority" pid=57996 uid=0 result="success"
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7035] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7038] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7040] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7049] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7052] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7057] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7060] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7062] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7067] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7071] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7075] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7076] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7081] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7086] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7089] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7091] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7096] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7101] dhcp4 (eth0): canceled DHCP transaction
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7101] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7101] dhcp4 (eth0): state changed no lease
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7102] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.7115] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57996 uid=0 result="fail" reason="Device is not activated"
Nov 29 01:25:41 np0005539504 kernel: ovs-system: entered promiscuous mode
Nov 29 01:25:41 np0005539504 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 01:25:41 np0005539504 kernel: Timeout policy base is empty
Nov 29 01:25:41 np0005539504 systemd-udevd[58000]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:25:41 np0005539504 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 01:25:41 np0005539504 kernel: br-ex: entered promiscuous mode
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.9102] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.9123] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.9144] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.9157] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.9171] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 29 01:25:41 np0005539504 kernel: vlan20: entered promiscuous mode
Nov 29 01:25:41 np0005539504 NetworkManager[55210]: <info>  [1764397541.9196] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:25:41 np0005539504 kernel: vlan22: entered promiscuous mode
Nov 29 01:25:41 np0005539504 systemd-udevd[58001]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:25:41 np0005539504 kernel: vlan21: entered promiscuous mode
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.0980] device (eth1): disconnecting for new activation request.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.0981] audit: op="connection-activate" uuid="dd13faa5-78cc-5ac2-955b-7137968ec8d0" name="ci-private-network" pid=57996 uid=0 result="success"
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.0981] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1101] device (eth1): Activation: starting connection 'ci-private-network' (dd13faa5-78cc-5ac2-955b-7137968ec8d0)
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1105] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1106] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1107] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1109] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1110] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1111] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1128] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1139] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1145] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1146] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1148] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1168] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1180] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1190] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1199] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1210] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1219] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1226] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1240] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1247] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1258] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1270] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1284] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1292] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57996 uid=0 result="success"
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1315] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1328] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1367] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1398] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1404] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1423] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1438] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1459] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1470] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1483] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1502] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1506] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1509] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1513] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1537] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1552] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1565] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1579] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1593] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1607] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1618] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539504 NetworkManager[55210]: <info>  [1764397542.1634] device (eth1): Activation: successful, device activated.
Nov 29 01:25:43 np0005539504 python3.9[58328]: ansible-ansible.legacy.async_status Invoked with jid=j800520167388.57990 mode=status _async_dir=/root/.ansible_async
Nov 29 01:25:43 np0005539504 NetworkManager[55210]: <info>  [1764397543.3412] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57996 uid=0 result="success"
Nov 29 01:25:43 np0005539504 NetworkManager[55210]: <info>  [1764397543.5533] checkpoint[0x56395dbec950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 29 01:25:43 np0005539504 NetworkManager[55210]: <info>  [1764397543.5537] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57996 uid=0 result="success"
Nov 29 01:25:43 np0005539504 NetworkManager[55210]: <info>  [1764397543.8932] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57996 uid=0 result="success"
Nov 29 01:25:43 np0005539504 NetworkManager[55210]: <info>  [1764397543.8948] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57996 uid=0 result="success"
Nov 29 01:25:44 np0005539504 NetworkManager[55210]: <info>  [1764397544.1716] audit: op="networking-control" arg="global-dns-configuration" pid=57996 uid=0 result="success"
Nov 29 01:25:44 np0005539504 NetworkManager[55210]: <info>  [1764397544.1754] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 29 01:25:44 np0005539504 NetworkManager[55210]: <info>  [1764397544.1787] audit: op="networking-control" arg="global-dns-configuration" pid=57996 uid=0 result="success"
Nov 29 01:25:44 np0005539504 NetworkManager[55210]: <info>  [1764397544.1811] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57996 uid=0 result="success"
Nov 29 01:25:44 np0005539504 ansible-async_wrapper.py[57993]: 57994 still running (300)
Nov 29 01:25:44 np0005539504 NetworkManager[55210]: <info>  [1764397544.3495] checkpoint[0x56395dbeca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 29 01:25:44 np0005539504 NetworkManager[55210]: <info>  [1764397544.3499] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57996 uid=0 result="success"
Nov 29 01:25:44 np0005539504 ansible-async_wrapper.py[57994]: Module complete (57994)
Nov 29 01:25:46 np0005539504 python3.9[58434]: ansible-ansible.legacy.async_status Invoked with jid=j800520167388.57990 mode=status _async_dir=/root/.ansible_async
Nov 29 01:25:47 np0005539504 python3.9[58534]: ansible-ansible.legacy.async_status Invoked with jid=j800520167388.57990 mode=cleanup _async_dir=/root/.ansible_async
Nov 29 01:25:49 np0005539504 ansible-async_wrapper.py[57993]: Done in kid B.
Nov 29 01:25:51 np0005539504 python3.9[58687]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:52 np0005539504 python3.9[58810]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397551.3208554-943-94472843787338/.source.returncode _original_basename=.z2r1k8na follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:53 np0005539504 python3.9[58963]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:54 np0005539504 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 01:25:54 np0005539504 python3.9[59087]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397553.3953369-991-218276139785084/.source.cfg _original_basename=.apmc42h6 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:55 np0005539504 python3.9[59239]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:25:55 np0005539504 systemd[1]: Reloading Network Manager...
Nov 29 01:25:55 np0005539504 NetworkManager[55210]: <info>  [1764397555.5122] audit: op="reload" arg="0" pid=59243 uid=0 result="success"
Nov 29 01:25:55 np0005539504 NetworkManager[55210]: <info>  [1764397555.5132] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 29 01:25:55 np0005539504 systemd[1]: Reloaded Network Manager.
Nov 29 01:25:55 np0005539504 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 01:25:55 np0005539504 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 01:25:56 np0005539504 systemd[1]: session-12.scope: Deactivated successfully.
Nov 29 01:25:56 np0005539504 systemd[1]: session-12.scope: Consumed 56.954s CPU time.
Nov 29 01:25:56 np0005539504 systemd-logind[783]: Session 12 logged out. Waiting for processes to exit.
Nov 29 01:25:56 np0005539504 systemd-logind[783]: Removed session 12.
Nov 29 01:26:01 np0005539504 systemd-logind[783]: New session 13 of user zuul.
Nov 29 01:26:01 np0005539504 systemd[1]: Started Session 13 of User zuul.
Nov 29 01:26:02 np0005539504 python3.9[59431]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:03 np0005539504 python3.9[59586]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:26:05 np0005539504 python3.9[59775]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:26:05 np0005539504 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 01:26:05 np0005539504 systemd[1]: session-13.scope: Deactivated successfully.
Nov 29 01:26:05 np0005539504 systemd[1]: session-13.scope: Consumed 2.557s CPU time.
Nov 29 01:26:05 np0005539504 systemd-logind[783]: Session 13 logged out. Waiting for processes to exit.
Nov 29 01:26:05 np0005539504 systemd-logind[783]: Removed session 13.
Nov 29 01:26:14 np0005539504 systemd-logind[783]: New session 14 of user zuul.
Nov 29 01:26:14 np0005539504 systemd[1]: Started Session 14 of User zuul.
Nov 29 01:26:15 np0005539504 python3.9[59957]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:16 np0005539504 python3.9[60111]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:18 np0005539504 python3.9[60267]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:26:19 np0005539504 python3.9[60351]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:26:22 np0005539504 python3.9[60504]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:26:24 np0005539504 python3.9[60695]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:25 np0005539504 python3.9[60847]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:26:25 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:26:26 np0005539504 python3.9[61013]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:26 np0005539504 python3.9[61091]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:27 np0005539504 python3.9[61243]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:28 np0005539504 python3.9[61321]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:29 np0005539504 python3.9[61473]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:29 np0005539504 python3.9[61625]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:30 np0005539504 python3.9[61777]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:31 np0005539504 python3.9[61929]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:32 np0005539504 python3.9[62081]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:26:37 np0005539504 python3.9[62236]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:38 np0005539504 python3.9[62390]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:26:39 np0005539504 python3.9[62544]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:26:40 np0005539504 python3.9[62698]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:26:41 np0005539504 python3.9[62851]: ansible-service_facts Invoked
Nov 29 01:26:41 np0005539504 network[62868]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:26:41 np0005539504 network[62869]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:26:41 np0005539504 network[62870]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:26:47 np0005539504 python3.9[63324]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:26:53 np0005539504 python3.9[63477]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 01:26:55 np0005539504 python3.9[63629]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:56 np0005539504 python3.9[63754]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397614.8614633-662-257323696848174/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:57 np0005539504 python3.9[63910]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:57 np0005539504 python3.9[64035]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397616.6541963-707-170994698075234/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:59 np0005539504 python3.9[64189]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:01 np0005539504 python3.9[64343]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:27:02 np0005539504 python3.9[64428]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:03 np0005539504 python3.9[64582]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:27:04 np0005539504 python3.9[64666]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:27:04 np0005539504 chronyd[799]: chronyd exiting
Nov 29 01:27:04 np0005539504 systemd[1]: Stopping NTP client/server...
Nov 29 01:27:04 np0005539504 systemd[1]: chronyd.service: Deactivated successfully.
Nov 29 01:27:04 np0005539504 systemd[1]: Stopped NTP client/server.
Nov 29 01:27:04 np0005539504 systemd[1]: Starting NTP client/server...
Nov 29 01:27:04 np0005539504 chronyd[64675]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 01:27:04 np0005539504 chronyd[64675]: Frequency -26.227 +/- 0.225 ppm read from /var/lib/chrony/drift
Nov 29 01:27:04 np0005539504 chronyd[64675]: Loaded seccomp filter (level 2)
Nov 29 01:27:04 np0005539504 systemd[1]: Started NTP client/server.
Nov 29 01:27:06 np0005539504 systemd[1]: session-14.scope: Deactivated successfully.
Nov 29 01:27:06 np0005539504 systemd[1]: session-14.scope: Consumed 28.113s CPU time.
Nov 29 01:27:06 np0005539504 systemd-logind[783]: Session 14 logged out. Waiting for processes to exit.
Nov 29 01:27:06 np0005539504 systemd-logind[783]: Removed session 14.
Nov 29 01:27:12 np0005539504 systemd-logind[783]: New session 15 of user zuul.
Nov 29 01:27:12 np0005539504 systemd[1]: Started Session 15 of User zuul.
Nov 29 01:27:13 np0005539504 python3.9[64855]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:27:14 np0005539504 python3.9[65011]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:15 np0005539504 python3.9[65186]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:15 np0005539504 python3.9[65264]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.4rq_86ry recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:16 np0005539504 python3.9[65416]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:17 np0005539504 python3.9[65539]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397636.3567643-149-30156806919801/.source _original_basename=.egzfz6sq follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:18 np0005539504 python3.9[65691]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:27:19 np0005539504 python3.9[65843]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:20 np0005539504 python3.9[65966]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397638.9474807-221-120212048968677/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:27:21 np0005539504 python3.9[66118]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:21 np0005539504 python3.9[66241]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397640.5075707-221-7408222459782/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:27:22 np0005539504 python3.9[66393]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:23 np0005539504 python3.9[66545]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:24 np0005539504 python3.9[66668]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397642.9878778-332-277763293676112/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:24 np0005539504 python3.9[66820]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:25 np0005539504 python3.9[66943]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397644.5081306-377-242158441300357/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:26 np0005539504 python3.9[67095]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:26 np0005539504 systemd[1]: Reloading.
Nov 29 01:27:26 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:26 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:27 np0005539504 systemd[1]: Reloading.
Nov 29 01:27:27 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:27 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:27 np0005539504 systemd[1]: Starting EDPM Container Shutdown...
Nov 29 01:27:27 np0005539504 systemd[1]: Finished EDPM Container Shutdown.
Nov 29 01:27:28 np0005539504 python3.9[67321]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:28 np0005539504 python3.9[67444]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397647.6624308-446-220441328149392/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:29 np0005539504 python3.9[67596]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:30 np0005539504 python3.9[67719]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397648.987859-491-80817514342346/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:31 np0005539504 python3.9[67871]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:31 np0005539504 systemd[1]: Reloading.
Nov 29 01:27:31 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:31 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:31 np0005539504 systemd[1]: Reloading.
Nov 29 01:27:31 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:31 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:31 np0005539504 systemd[1]: Starting Create netns directory...
Nov 29 01:27:31 np0005539504 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:27:31 np0005539504 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:27:31 np0005539504 systemd[1]: Finished Create netns directory.
Nov 29 01:27:32 np0005539504 python3.9[68098]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:27:32 np0005539504 network[68115]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:27:32 np0005539504 network[68116]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:27:32 np0005539504 network[68117]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:27:36 np0005539504 python3.9[68379]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:36 np0005539504 systemd[1]: Reloading.
Nov 29 01:27:36 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:36 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:36 np0005539504 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 29 01:27:36 np0005539504 iptables.init[68418]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 29 01:27:36 np0005539504 iptables.init[68418]: iptables: Flushing firewall rules: [  OK  ]
Nov 29 01:27:36 np0005539504 systemd[1]: iptables.service: Deactivated successfully.
Nov 29 01:27:36 np0005539504 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 29 01:27:37 np0005539504 python3.9[68615]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:38 np0005539504 python3.9[68769]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:38 np0005539504 systemd[1]: Reloading.
Nov 29 01:27:38 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:38 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:38 np0005539504 systemd[1]: Starting Netfilter Tables...
Nov 29 01:27:38 np0005539504 systemd[1]: Finished Netfilter Tables.
Nov 29 01:27:39 np0005539504 python3.9[68963]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:40 np0005539504 python3.9[69116]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:41 np0005539504 python3.9[69241]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397660.4192965-698-57058890253238/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:42 np0005539504 python3.9[69394]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:27:42 np0005539504 systemd[1]: Reloading OpenSSH server daemon...
Nov 29 01:27:42 np0005539504 systemd[1]: Reloaded OpenSSH server daemon.
Nov 29 01:27:43 np0005539504 python3.9[69550]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:44 np0005539504 python3.9[69702]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:44 np0005539504 python3.9[69825]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397663.5339785-791-4009534988704/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:45 np0005539504 python3.9[69977]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 01:27:45 np0005539504 systemd[1]: Starting Time & Date Service...
Nov 29 01:27:45 np0005539504 systemd[1]: Started Time & Date Service.
Nov 29 01:27:46 np0005539504 python3.9[70135]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:47 np0005539504 python3.9[70287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:48 np0005539504 python3.9[70414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397667.2897992-897-246654450950243/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:49 np0005539504 python3.9[70566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:49 np0005539504 python3.9[70689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397668.7127354-941-236752685182687/.source.yaml _original_basename=._9ih862k follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:50 np0005539504 python3.9[70841]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:51 np0005539504 python3.9[70964]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397670.2268918-986-59947510827973/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:52 np0005539504 python3.9[71116]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:52 np0005539504 python3.9[71269]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:53 np0005539504 python3[71422]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:27:54 np0005539504 python3.9[71574]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:55 np0005539504 python3.9[71697]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397673.9850512-1103-84118725422759/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:55 np0005539504 python3.9[71849]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:56 np0005539504 python3.9[71972]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397675.2885892-1148-222942371272814/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:57 np0005539504 python3.9[72124]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:57 np0005539504 python3.9[72247]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397676.6454816-1193-141781380187552/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:58 np0005539504 python3.9[72399]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:59 np0005539504 python3.9[72522]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397678.0648758-1239-178935876275102/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:00 np0005539504 python3.9[72674]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:00 np0005539504 python3.9[72797]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397679.4536662-1283-205962327592908/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:01 np0005539504 python3.9[72949]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:02 np0005539504 python3.9[73101]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:03 np0005539504 python3.9[73260]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:03 np0005539504 python3.9[73413]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:04 np0005539504 python3.9[73565]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:06 np0005539504 python3.9[73717]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:28:06 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:28:07 np0005539504 python3.9[73871]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:28:07 np0005539504 systemd-logind[783]: Session 15 logged out. Waiting for processes to exit.
Nov 29 01:28:07 np0005539504 systemd[1]: session-15.scope: Deactivated successfully.
Nov 29 01:28:07 np0005539504 systemd[1]: session-15.scope: Consumed 39.485s CPU time.
Nov 29 01:28:07 np0005539504 systemd-logind[783]: Removed session 15.
Nov 29 01:28:13 np0005539504 systemd-logind[783]: New session 16 of user zuul.
Nov 29 01:28:13 np0005539504 systemd[1]: Started Session 16 of User zuul.
Nov 29 01:28:14 np0005539504 python3.9[74054]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 01:28:15 np0005539504 python3.9[74206]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:16 np0005539504 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 01:28:16 np0005539504 python3.9[74360]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:17 np0005539504 python3.9[74512]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDxE9+in1YAsVzo2PkbOP/y9jW13mE04+F1VrPVmmgKUME6PWRBUtuT66AB40zRYi5yO+6N76+VAJcvtF1kNGhm3shwR+EkOKx8SHbU+RviKmRHfsi7XEfyHL7uPXOJMckqz85eUFqMQlXm0T6k8SbAwg/7v0r7w70oz6RysylzQYZWVeFgXZ7UFNiz+TKXL4x8MRY/6V3JMXIBdt/vb6cGmIyDwfTLPa/VxO6oKiuknrmAhd6pKWVOAoLeLvCJFRcnCjfZygatiRnwzibR7Xmo/fWClfIWB/RpJC5vSGru0Y/btrmoNInBd93XAWFRh8/+L/mTAUqvgP7Dy/Ft6JXARlkcmX64/tqwMI7M6a4A9voOZ8Eb1cJyJ/XgWoTXUZB9+cehvGP5J0tLJkw/iGBXKOcXhP99ulw5rvtkAaOXV6omaio88Pl85lT2ISJO6g47/pk27eMMKNXxMdNlhqVOtR5zLQHv3t0Pvd9/HFZhfcx1w86u5aR+V9irnyt3WAc=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIB2l802ocmKW/xzYye+Pzw89MQvA5jQh5a0yLK2ZyZCd#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI7XJn2j/ECeKq3mKYHO54Bh/Op2+6G6UX6ad7xn+hglSDuDDZy9KOJY974X6YapBGPsvID5GfLpKZuusj2w6cw=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDSD8ZpMWbyfWwat32zE3dwK32EyLj7Y+//yic/Bd8bh7jSKBLK9ym42oYT01mO+NFTdefo2ARchFmERRxMzut4oUrqMlfhrn+mNHsvLaQycoAg+oq19ivJki9YXqDUIR0GwObpBRBSVczn15OcfSZNvJ+5yEWYWoeMXyjR7IpLeP4unrXYU5Gefx+ixYfHqq9U2klSr1mLGklHOYT1257UfS7aFtDHfrGqNLBghhbpbjLBljCPzwbz2JHg+8oO3x0s19DpnMBT0ID3emGqK2CRupsBeiWpZYUfcIDbCqmgcmC5QRkORpTRfGSYdDcsqSjpDOkPShwf1Le1r5QnW7JiFsy0ogLQ0ThcibSAVqVQZpFDROMSTPeqUlnDDqklZEtTgARcUGiVhmiXhR8sIdJXzJ5b1IB28Y3jGlf6kmQpBa9raXRegF/7J3SWDcOHO/sYe7Wh50S0cBgRgix0492hkGz3icxCzNwpQ5H/dTKdLCX7SvWyn/dHYE7411EP0Xc=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIARLjHbwtuz0VGhEJnZ8jUcmug4YEziBMgu/+Q2Xf/qr#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN/4QosKjhedc/jgjDOXpXhsciLiDd+ILxSMZxLO5NzR72Gm5KH5lEdveLrailDwVrIBl1+UjfksCNfnn+zVt1w=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs1lxOj+O3cXQh+L6Hvro0WUX7vGdONQb0UkjJDqrzMWuP0tmX4CuMYeN2kUtGqc5U1dKriurXmo1qGVTvVz1rFJWYr1e1qwcv/DCLijB+4QR8oi61K8+nnWm47XeUoyWOI1GxkiHPeLPUs3QDDbHClDRGD9SWUQ5AtaO0NqAPalgp4eYChWy0Y4soQNnOXqbjnwEsJRK85/mXhogmZpALrFBu87oJtbviSxczqa+4bci7R6jWZ+ZkZbKw2+D3QskWWoHcgFgQVCprAXuj/ebUq1gyCY/d+tnyQs80H9XZ6Ryvmu1e7zEhKJvldu5mAamd8l4EwL79yt1ds7cSRXEH/+ajyYpXXTerzMFIsItjkdt+fg8DiheTqZexiHXvykMSjhPdshC1A9JWSsD+ISIR5qLPmHx5g3kZyVt5WM3mPfqh8WYsG4FM7EzMz492DnLUqdIsJXOBPjExJZhCLYvOdjJI5hMYHQ2GTE4ZlW0rvYr85xi12yOn9K3zmZ6q2SU=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOVwi6LOnwRGXKTYlL5FohHpKT05ra2BKYgm2kBQxP+u#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOCcnxH2XsLxMcRRLaA4DruLY3oryYRdOPfwLiZD7s7kBHBXt+svOGk0QImtaVEKV/k9369qMK8GrFyzO2efaCk=#012 create=True mode=0644 path=/tmp/ansible.0qc_z15j state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:18 np0005539504 python3.9[74664]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.0qc_z15j' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:19 np0005539504 python3.9[74818]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.0qc_z15j state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:20 np0005539504 systemd[1]: session-16.scope: Deactivated successfully.
Nov 29 01:28:20 np0005539504 systemd[1]: session-16.scope: Consumed 3.658s CPU time.
Nov 29 01:28:20 np0005539504 systemd-logind[783]: Session 16 logged out. Waiting for processes to exit.
Nov 29 01:28:20 np0005539504 systemd-logind[783]: Removed session 16.
Nov 29 01:28:26 np0005539504 systemd-logind[783]: New session 17 of user zuul.
Nov 29 01:28:26 np0005539504 systemd[1]: Started Session 17 of User zuul.
Nov 29 01:28:27 np0005539504 python3.9[74998]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:29 np0005539504 python3.9[75154]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:28:30 np0005539504 python3.9[75308]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:28:31 np0005539504 python3.9[75461]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:32 np0005539504 python3.9[75614]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:32 np0005539504 python3.9[75768]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:33 np0005539504 python3.9[75923]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:34 np0005539504 systemd[1]: session-17.scope: Deactivated successfully.
Nov 29 01:28:34 np0005539504 systemd[1]: session-17.scope: Consumed 5.136s CPU time.
Nov 29 01:28:34 np0005539504 systemd-logind[783]: Session 17 logged out. Waiting for processes to exit.
Nov 29 01:28:34 np0005539504 systemd-logind[783]: Removed session 17.
Nov 29 01:28:39 np0005539504 systemd-logind[783]: New session 18 of user zuul.
Nov 29 01:28:39 np0005539504 systemd[1]: Started Session 18 of User zuul.
Nov 29 01:28:40 np0005539504 python3.9[76102]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:42 np0005539504 python3.9[76258]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:28:43 np0005539504 python3.9[76342]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:28:45 np0005539504 python3.9[76493]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:47 np0005539504 python3.9[76644]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:28:47 np0005539504 python3.9[76794]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:48 np0005539504 python3.9[76944]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:49 np0005539504 systemd[1]: session-18.scope: Deactivated successfully.
Nov 29 01:28:49 np0005539504 systemd[1]: session-18.scope: Consumed 6.401s CPU time.
Nov 29 01:28:49 np0005539504 systemd-logind[783]: Session 18 logged out. Waiting for processes to exit.
Nov 29 01:28:49 np0005539504 systemd-logind[783]: Removed session 18.
Nov 29 01:28:55 np0005539504 systemd-logind[783]: New session 19 of user zuul.
Nov 29 01:28:55 np0005539504 systemd[1]: Started Session 19 of User zuul.
Nov 29 01:28:57 np0005539504 python3.9[77129]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:59 np0005539504 python3.9[77285]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:00 np0005539504 python3.9[77437]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:00 np0005539504 python3.9[77589]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:01 np0005539504 python3.9[77712]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397740.3728597-167-138313872258147/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=28fb72f0b91e0283b8be23e2289ca1d4a8805bbf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:02 np0005539504 python3.9[77864]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:03 np0005539504 python3.9[77987]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397741.8493314-167-141778134755130/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=53ec2692be9fe7fa10ffde7cdba9150c4076f3fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:03 np0005539504 python3.9[78139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:04 np0005539504 python3.9[78262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397743.2297752-167-5484833427821/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=c90e9cef8629f28ec12865306cc90460ec8f783f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:05 np0005539504 python3.9[78414]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:06 np0005539504 python3.9[78568]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:06 np0005539504 python3.9[78720]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:07 np0005539504 python3.9[78843]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397746.4629881-366-261958362554878/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=e7e7bf0b26d6a421b0672e3d68b7c04986c0f654 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:08 np0005539504 python3.9[78995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:08 np0005539504 python3.9[79118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397747.645826-366-26091847084772/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=bf4291d96f7c0f5cd858ccf4f424f476f6c02cd9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:09 np0005539504 python3.9[79270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:12 np0005539504 python3.9[79393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397748.8018785-366-198884630034477/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=bceedbca0a4a235c44213c5fab1a73ed91a89985 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:13 np0005539504 python3.9[79545]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:14 np0005539504 python3.9[79697]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:14 np0005539504 python3.9[79849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:14 np0005539504 chronyd[64675]: Selected source 198.181.199.86 (pool.ntp.org)
Nov 29 01:29:15 np0005539504 python3.9[79972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397754.36071-620-235462641806072/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=68389ddc3433c81787e2032632462afda5f8e320 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:16 np0005539504 python3.9[80124]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:16 np0005539504 python3.9[80247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397755.5827913-620-70136753296769/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=d5ce9fc6df7543706791229321e0116a703016b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:17 np0005539504 python3.9[80399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:17 np0005539504 python3.9[80522]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397756.822255-620-199725643182538/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=9931bc41ada83a2b5ce963dee93218e120ff651a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:18 np0005539504 python3.9[80674]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:19 np0005539504 python3.9[80826]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:19 np0005539504 python3.9[80978]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:20 np0005539504 python3.9[81101]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397759.3766165-801-259447271318531/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=f6f71a883645c3c382ebc805dcfe42b63e78d082 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:21 np0005539504 python3.9[81255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:21 np0005539504 python3.9[81378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397760.5410364-801-271453759229176/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=d5ce9fc6df7543706791229321e0116a703016b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:22 np0005539504 python3.9[81530]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:22 np0005539504 python3.9[81653]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397761.706593-801-250091657805627/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=b8a02358ffe5394259508f80366dde79a00a6dc5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:24 np0005539504 python3.9[81807]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:24 np0005539504 python3.9[81959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:25 np0005539504 python3.9[82082]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397764.471051-991-103678698529460/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:26 np0005539504 python3.9[82234]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:26 np0005539504 python3.9[82386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:27 np0005539504 python3.9[82509]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397766.4090796-1059-159355432571489/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:28 np0005539504 python3.9[82661]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:29 np0005539504 python3.9[82813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:29 np0005539504 python3.9[82936]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397768.6040437-1131-184864510871585/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:30 np0005539504 python3.9[83088]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:31 np0005539504 python3.9[83240]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:31 np0005539504 python3.9[83363]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397770.51362-1203-161473079356624/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:32 np0005539504 python3.9[83515]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:33 np0005539504 python3.9[83667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:33 np0005539504 python3.9[83790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397772.8130016-1284-281436423897617/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:34 np0005539504 python3.9[83942]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:35 np0005539504 python3.9[84094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:35 np0005539504 python3.9[84217]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397774.8695202-1333-255346981772406/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:36 np0005539504 python3.9[84371]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:37 np0005539504 python3.9[84523]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:37 np0005539504 python3.9[84646]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397776.807494-1357-209896160761172/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:38 np0005539504 systemd[1]: session-19.scope: Deactivated successfully.
Nov 29 01:29:38 np0005539504 systemd[1]: session-19.scope: Consumed 30.020s CPU time.
Nov 29 01:29:38 np0005539504 systemd-logind[783]: Session 19 logged out. Waiting for processes to exit.
Nov 29 01:29:38 np0005539504 systemd-logind[783]: Removed session 19.
Nov 29 01:29:43 np0005539504 systemd-logind[783]: New session 20 of user zuul.
Nov 29 01:29:43 np0005539504 systemd[1]: Started Session 20 of User zuul.
Nov 29 01:29:44 np0005539504 python3.9[84824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:45 np0005539504 python3.9[84980]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:46 np0005539504 python3.9[85132]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:47 np0005539504 python3.9[85282]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:49 np0005539504 python3.9[85436]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 01:29:53 np0005539504 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 29 01:29:54 np0005539504 python3.9[85592]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:29:54 np0005539504 python3.9[85676]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:30:00 np0005539504 python3.9[85829]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:30:01 np0005539504 python3[85986]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 29 01:30:02 np0005539504 python3.9[86140]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:03 np0005539504 python3.9[86292]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:03 np0005539504 python3.9[86370]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:04 np0005539504 python3.9[86524]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:05 np0005539504 python3.9[86602]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.5ovxk4sh recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:06 np0005539504 python3.9[86754]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:06 np0005539504 python3.9[86832]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:07 np0005539504 python3.9[86984]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:08 np0005539504 python3[87137]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:30:09 np0005539504 python3.9[87289]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:09 np0005539504 python3.9[87414]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397808.5548835-436-152131156134823/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:10 np0005539504 python3.9[87566]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:11 np0005539504 python3.9[87691]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397810.1240811-481-95004292961308/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:14 np0005539504 python3.9[87843]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:15 np0005539504 python3.9[87968]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397813.696288-526-22325941392884/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:16 np0005539504 python3.9[88120]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:16 np0005539504 python3.9[88245]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397815.280292-571-208530795363592/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:17 np0005539504 python3.9[88397]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:18 np0005539504 python3.9[88522]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397817.141675-616-52876676897479/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:19 np0005539504 python3.9[88674]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:20 np0005539504 python3.9[88826]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:21 np0005539504 python3.9[88981]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:22 np0005539504 python3.9[89135]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:23 np0005539504 python3.9[89288]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:24 np0005539504 python3.9[89442]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:24 np0005539504 python3.9[89597]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:26 np0005539504 python3.9[89747]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:30:27 np0005539504 python3.9[89900]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:27 np0005539504 ovs-vsctl[89901]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 29 01:30:28 np0005539504 python3.9[90053]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:29 np0005539504 python3.9[90208]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:29 np0005539504 ovs-vsctl[90209]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 29 01:30:30 np0005539504 python3.9[90359]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:30 np0005539504 python3.9[90513]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:31 np0005539504 python3.9[90665]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:31 np0005539504 python3.9[90743]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:32 np0005539504 python3.9[90895]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:33 np0005539504 python3.9[90973]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:33 np0005539504 python3.9[91127]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:34 np0005539504 python3.9[91279]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:35 np0005539504 python3.9[91357]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:35 np0005539504 python3.9[91509]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:36 np0005539504 python3.9[91587]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:37 np0005539504 python3.9[91739]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:37 np0005539504 systemd[1]: Reloading.
Nov 29 01:30:38 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:38 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:39 np0005539504 python3.9[91928]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:39 np0005539504 python3.9[92006]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:40 np0005539504 python3.9[92158]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:40 np0005539504 python3.9[92236]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:41 np0005539504 python3.9[92388]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:41 np0005539504 systemd[1]: Reloading.
Nov 29 01:30:41 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:41 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:42 np0005539504 systemd[1]: Starting Create netns directory...
Nov 29 01:30:42 np0005539504 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:30:42 np0005539504 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:30:42 np0005539504 systemd[1]: Finished Create netns directory.
Nov 29 01:30:42 np0005539504 python3.9[92584]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:43 np0005539504 python3.9[92736]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:44 np0005539504 python3.9[92859]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397843.1188893-1369-37457477055730/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:45 np0005539504 python3.9[93011]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:45 np0005539504 python3.9[93163]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:46 np0005539504 python3.9[93286]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397845.4620588-1444-100648128875332/.source.json _original_basename=.rpfpu6f5 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:47 np0005539504 python3.9[93438]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:50 np0005539504 python3.9[93865]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 29 01:30:51 np0005539504 python3.9[94017]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:30:52 np0005539504 python3.9[94169]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:30:52 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:30:54 np0005539504 python3[94332]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:30:54 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:30:54 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:30:54 np0005539504 podman[94367]: 2025-11-29 06:30:54.608409174 +0000 UTC m=+0.023473281 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:55 np0005539504 podman[94367]: 2025-11-29 06:30:55.344110715 +0000 UTC m=+0.759174802 container create 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 29 01:30:55 np0005539504 python3[94332]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:55 np0005539504 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:30:56 np0005539504 python3.9[94558]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:57 np0005539504 python3.9[94712]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:57 np0005539504 python3.9[94788]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:58 np0005539504 python3.9[94939]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397858.0621684-1708-266686460651255/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:59 np0005539504 python3.9[95015]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:30:59 np0005539504 systemd[1]: Reloading.
Nov 29 01:30:59 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:59 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:00 np0005539504 python3.9[95126]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:31:00 np0005539504 systemd[1]: Reloading.
Nov 29 01:31:00 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:00 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:00 np0005539504 systemd[1]: Starting ovn_controller container...
Nov 29 01:31:00 np0005539504 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 29 01:31:00 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:31:00 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa7b6c696abd047717aeb8d0a523c9eb2576a966265c57747714ec28065e6559/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 01:31:00 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927.
Nov 29 01:31:00 np0005539504 podman[95167]: 2025-11-29 06:31:00.850942531 +0000 UTC m=+0.248466991 container init 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:31:00 np0005539504 ovn_controller[95182]: + sudo -E kolla_set_configs
Nov 29 01:31:00 np0005539504 podman[95167]: 2025-11-29 06:31:00.881535576 +0000 UTC m=+0.279060036 container start 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:31:00 np0005539504 edpm-start-podman-container[95167]: ovn_controller
Nov 29 01:31:00 np0005539504 systemd[1]: Created slice User Slice of UID 0.
Nov 29 01:31:00 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 29 01:31:00 np0005539504 edpm-start-podman-container[95166]: Creating additional drop-in dependency for "ovn_controller" (35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927)
Nov 29 01:31:00 np0005539504 podman[95188]: 2025-11-29 06:31:00.976178179 +0000 UTC m=+0.082471147 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:31:00 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 29 01:31:00 np0005539504 systemd[1]: Starting User Manager for UID 0...
Nov 29 01:31:00 np0005539504 systemd[1]: 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927-37ea4b925b9f8f06.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:31:00 np0005539504 systemd[1]: 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927-37ea4b925b9f8f06.service: Failed with result 'exit-code'.
Nov 29 01:31:00 np0005539504 systemd[1]: Reloading.
Nov 29 01:31:01 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:01 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:01 np0005539504 systemd[95231]: Queued start job for default target Main User Target.
Nov 29 01:31:01 np0005539504 systemd[95231]: Created slice User Application Slice.
Nov 29 01:31:01 np0005539504 systemd[95231]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 29 01:31:01 np0005539504 systemd[95231]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:31:01 np0005539504 systemd[95231]: Reached target Paths.
Nov 29 01:31:01 np0005539504 systemd[95231]: Reached target Timers.
Nov 29 01:31:01 np0005539504 systemd[95231]: Starting D-Bus User Message Bus Socket...
Nov 29 01:31:01 np0005539504 systemd[95231]: Starting Create User's Volatile Files and Directories...
Nov 29 01:31:01 np0005539504 systemd[95231]: Finished Create User's Volatile Files and Directories.
Nov 29 01:31:01 np0005539504 systemd[95231]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:31:01 np0005539504 systemd[95231]: Reached target Sockets.
Nov 29 01:31:01 np0005539504 systemd[95231]: Reached target Basic System.
Nov 29 01:31:01 np0005539504 systemd[95231]: Reached target Main User Target.
Nov 29 01:31:01 np0005539504 systemd[95231]: Startup finished in 143ms.
Nov 29 01:31:01 np0005539504 systemd[1]: Started User Manager for UID 0.
Nov 29 01:31:01 np0005539504 systemd[1]: Started ovn_controller container.
Nov 29 01:31:01 np0005539504 systemd[1]: Started Session c1 of User root.
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: INFO:__main__:Validating config file
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: INFO:__main__:Writing out command to execute
Nov 29 01:31:01 np0005539504 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: ++ cat /run_command
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: + ARGS=
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: + sudo kolla_copy_cacerts
Nov 29 01:31:01 np0005539504 systemd[1]: Started Session c2 of User root.
Nov 29 01:31:01 np0005539504 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: + [[ ! -n '' ]]
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: + . kolla_extend_start
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: + umask 0022
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.4373] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.4383] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.4395] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.4399] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.4401] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 01:31:01 np0005539504 kernel: br-int: entered promiscuous mode
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 29 01:31:01 np0005539504 systemd-udevd[95331]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:31:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:01Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.6102] manager: (ovn-cdd09c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.6107] manager: (ovn-bd30a8-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.6111] manager: (ovn-7525db-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 29 01:31:01 np0005539504 kernel: genev_sys_6081: entered promiscuous mode
Nov 29 01:31:01 np0005539504 systemd-udevd[95343]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.6318] device (genev_sys_6081): carrier: link connected
Nov 29 01:31:01 np0005539504 NetworkManager[55210]: <info>  [1764397861.6322] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Nov 29 01:31:02 np0005539504 python3.9[95451]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:31:02 np0005539504 ovs-vsctl[95452]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 29 01:31:02 np0005539504 python3.9[95604]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:31:02 np0005539504 ovs-vsctl[95606]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 29 01:31:03 np0005539504 python3.9[95759]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:31:03 np0005539504 ovs-vsctl[95760]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 29 01:31:04 np0005539504 systemd[1]: session-20.scope: Deactivated successfully.
Nov 29 01:31:04 np0005539504 systemd[1]: session-20.scope: Consumed 47.706s CPU time.
Nov 29 01:31:04 np0005539504 systemd-logind[783]: Session 20 logged out. Waiting for processes to exit.
Nov 29 01:31:04 np0005539504 systemd-logind[783]: Removed session 20.
Nov 29 01:31:10 np0005539504 systemd-logind[783]: New session 22 of user zuul.
Nov 29 01:31:10 np0005539504 systemd[1]: Started Session 22 of User zuul.
Nov 29 01:31:11 np0005539504 systemd[1]: Stopping User Manager for UID 0...
Nov 29 01:31:11 np0005539504 systemd[95231]: Activating special unit Exit the Session...
Nov 29 01:31:11 np0005539504 systemd[95231]: Stopped target Main User Target.
Nov 29 01:31:11 np0005539504 systemd[95231]: Stopped target Basic System.
Nov 29 01:31:11 np0005539504 systemd[95231]: Stopped target Paths.
Nov 29 01:31:11 np0005539504 systemd[95231]: Stopped target Sockets.
Nov 29 01:31:11 np0005539504 systemd[95231]: Stopped target Timers.
Nov 29 01:31:11 np0005539504 systemd[95231]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:31:11 np0005539504 systemd[95231]: Closed D-Bus User Message Bus Socket.
Nov 29 01:31:11 np0005539504 systemd[95231]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:31:11 np0005539504 systemd[95231]: Removed slice User Application Slice.
Nov 29 01:31:11 np0005539504 systemd[95231]: Reached target Shutdown.
Nov 29 01:31:11 np0005539504 systemd[95231]: Finished Exit the Session.
Nov 29 01:31:11 np0005539504 systemd[95231]: Reached target Exit the Session.
Nov 29 01:31:11 np0005539504 systemd[1]: user@0.service: Deactivated successfully.
Nov 29 01:31:11 np0005539504 systemd[1]: Stopped User Manager for UID 0.
Nov 29 01:31:11 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 29 01:31:11 np0005539504 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 29 01:31:11 np0005539504 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 29 01:31:11 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 29 01:31:11 np0005539504 systemd[1]: Removed slice User Slice of UID 0.
Nov 29 01:31:11 np0005539504 python3.9[95948]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:31:13 np0005539504 python3.9[96106]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:13 np0005539504 python3.9[96258]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:14 np0005539504 python3.9[96410]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:15 np0005539504 python3.9[96562]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:15 np0005539504 python3.9[96714]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:21 np0005539504 python3.9[96865]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:31:22 np0005539504 python3.9[97017]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 01:31:24 np0005539504 python3.9[97167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:24 np0005539504 python3.9[97288]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397883.6938407-224-183675240800343/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:25 np0005539504 python3.9[97438]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:26 np0005539504 python3.9[97559]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397885.2238107-269-112939252385356/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:27 np0005539504 python3.9[97711]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:31:28 np0005539504 python3.9[97795]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:31:31 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:31Z|00025|memory|INFO|16128 kB peak resident set size after 29.8 seconds
Nov 29 01:31:31 np0005539504 ovn_controller[95182]: 2025-11-29T06:31:31Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Nov 29 01:31:31 np0005539504 podman[97920]: 2025-11-29 06:31:31.236011557 +0000 UTC m=+0.122881576 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 01:31:31 np0005539504 python3.9[97965]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:31:32 np0005539504 python3.9[98127]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:32 np0005539504 python3.9[98248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397891.85374-380-279208924751851/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:33 np0005539504 python3.9[98398]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:34 np0005539504 python3.9[98519]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397893.0424533-380-105621519144488/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:35 np0005539504 python3.9[98669]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:35 np0005539504 python3.9[98790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397894.8120193-512-129824543534072/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:36 np0005539504 python3.9[98942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:36 np0005539504 python3.9[99063]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397895.9958246-512-80496275617453/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:38 np0005539504 python3.9[99213]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:31:39 np0005539504 python3.9[99367]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:39 np0005539504 python3.9[99519]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:40 np0005539504 python3.9[99597]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:40 np0005539504 python3.9[99749]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:41 np0005539504 python3.9[99827]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:42 np0005539504 python3.9[99979]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:42 np0005539504 python3.9[100131]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:43 np0005539504 python3.9[100209]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:44 np0005539504 python3.9[100361]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:44 np0005539504 python3.9[100439]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:45 np0005539504 python3.9[100591]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:31:45 np0005539504 systemd[1]: Reloading.
Nov 29 01:31:45 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:45 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:47 np0005539504 python3.9[100782]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:47 np0005539504 python3.9[100860]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:48 np0005539504 python3.9[101012]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:48 np0005539504 python3.9[101090]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:49 np0005539504 python3.9[101242]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:31:49 np0005539504 systemd[1]: Reloading.
Nov 29 01:31:49 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:49 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:50 np0005539504 systemd[1]: Starting Create netns directory...
Nov 29 01:31:50 np0005539504 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:31:50 np0005539504 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:31:50 np0005539504 systemd[1]: Finished Create netns directory.
Nov 29 01:31:50 np0005539504 python3.9[101436]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:51 np0005539504 python3.9[101588]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:52 np0005539504 python3.9[101711]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397911.1686523-965-243833110429226/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:53 np0005539504 python3.9[101863]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:53 np0005539504 python3.9[102015]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:54 np0005539504 python3.9[102138]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397913.4584298-1040-130097823318930/.source.json _original_basename=.cxdqvqs8 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:55 np0005539504 python3.9[102290]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:57 np0005539504 python3.9[102719]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 29 01:31:58 np0005539504 python3.9[102871]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:31:59 np0005539504 python3.9[103023]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:32:01 np0005539504 python3[103201]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:32:01 np0005539504 podman[103227]: 2025-11-29 06:32:01.799529663 +0000 UTC m=+0.135708944 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:32:12 np0005539504 podman[103214]: 2025-11-29 06:32:12.90958001 +0000 UTC m=+11.414733557 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:32:13 np0005539504 podman[103339]: 2025-11-29 06:32:13.042526962 +0000 UTC m=+0.025887404 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:32:13 np0005539504 podman[103339]: 2025-11-29 06:32:13.363727164 +0000 UTC m=+0.347087616 container create d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:32:13 np0005539504 python3[103201]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:32:16 np0005539504 python3.9[103529]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:32:17 np0005539504 python3.9[103685]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:18 np0005539504 python3.9[103763]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:32:18 np0005539504 python3.9[103914]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397938.218976-1304-184454922242435/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:19 np0005539504 python3.9[103990]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:32:19 np0005539504 systemd[1]: Reloading.
Nov 29 01:32:19 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:19 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:20 np0005539504 python3.9[104102]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:20 np0005539504 systemd[1]: Reloading.
Nov 29 01:32:20 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:20 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:20 np0005539504 systemd[1]: Starting ovn_metadata_agent container...
Nov 29 01:32:20 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:32:20 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8bb436072fb5581bc2098cc7a705be43eaa10c543884f1f2095e2a5d24ddaaa/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 29 01:32:20 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8bb436072fb5581bc2098cc7a705be43eaa10c543884f1f2095e2a5d24ddaaa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:32:20 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7.
Nov 29 01:32:20 np0005539504 podman[104143]: 2025-11-29 06:32:20.837915126 +0000 UTC m=+0.136285909 container init d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: + sudo -E kolla_set_configs
Nov 29 01:32:20 np0005539504 podman[104143]: 2025-11-29 06:32:20.875858868 +0000 UTC m=+0.174229671 container start d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:32:20 np0005539504 edpm-start-podman-container[104143]: ovn_metadata_agent
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Validating config file
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Copying service configuration files
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Writing out command to execute
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 29 01:32:20 np0005539504 edpm-start-podman-container[104142]: Creating additional drop-in dependency for "ovn_metadata_agent" (d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7)
Nov 29 01:32:20 np0005539504 podman[104166]: 2025-11-29 06:32:20.94781699 +0000 UTC m=+0.058306374 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: ++ cat /run_command
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: + CMD=neutron-ovn-metadata-agent
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: + ARGS=
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: + sudo kolla_copy_cacerts
Nov 29 01:32:20 np0005539504 systemd[1]: Reloading.
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: + [[ ! -n '' ]]
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: + . kolla_extend_start
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: Running command: 'neutron-ovn-metadata-agent'
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: + umask 0022
Nov 29 01:32:20 np0005539504 ovn_metadata_agent[104159]: + exec neutron-ovn-metadata-agent
Nov 29 01:32:21 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:21 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:21 np0005539504 systemd[1]: Started ovn_metadata_agent container.
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.839 104164 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.840 104164 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.840 104164 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.840 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.840 104164 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.840 104164 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.841 104164 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.842 104164 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.843 104164 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.844 104164 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.845 104164 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.846 104164 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.846 104164 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.846 104164 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.846 104164 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.846 104164 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.846 104164 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.846 104164 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.846 104164 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.846 104164 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.847 104164 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.847 104164 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.847 104164 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.847 104164 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.847 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.847 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.847 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.847 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.847 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.848 104164 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.849 104164 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.850 104164 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.851 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.851 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.851 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.851 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.851 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.851 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.851 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.851 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.851 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.852 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.853 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.853 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.853 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.853 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.853 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.853 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.853 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.853 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.853 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.854 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.855 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.856 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.857 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.857 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.857 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.857 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.857 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.857 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.857 104164 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.857 104164 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.857 104164 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.858 104164 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.859 104164 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.859 104164 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.859 104164 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.859 104164 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.859 104164 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.859 104164 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.859 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.859 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.859 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.860 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.860 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.860 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.860 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.860 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.860 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.861 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.861 104164 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.861 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.861 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.861 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.861 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.861 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.861 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.862 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.862 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.862 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.862 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.862 104164 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.862 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.862 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.863 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.864 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.864 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.864 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.864 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.864 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.864 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.864 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.864 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.864 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.865 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.865 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.865 104164 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.865 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.865 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.865 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.865 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.865 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.866 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.866 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.866 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.866 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.866 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.866 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.866 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.866 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.866 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.867 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.867 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.867 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.867 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.867 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.867 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.867 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.867 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.867 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.868 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.868 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.868 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.868 104164 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.868 104164 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.868 104164 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.868 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.868 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.869 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.869 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.869 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.869 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.869 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.869 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.869 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.870 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.870 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.870 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.870 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.870 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.870 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.870 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.870 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.871 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.871 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.871 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.871 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.871 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.871 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.871 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.872 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.872 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.872 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.872 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.872 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.872 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.873 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.873 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.873 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.873 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.873 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.873 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.873 104164 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.873 104164 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.883 104164 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.883 104164 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.883 104164 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.883 104164 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.884 104164 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.898 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name a43628b3-9efd-4940-9509-686038e16aeb (UUID: a43628b3-9efd-4940-9509-686038e16aeb) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.926 104164 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.926 104164 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.926 104164 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.927 104164 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.929 104164 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.935 104164 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.941 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'a43628b3-9efd-4940-9509-686038e16aeb'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], external_ids={}, name=a43628b3-9efd-4940-9509-686038e16aeb, nb_cfg_timestamp=1764397869471, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.942 104164 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f2a86320bb0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.943 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.944 104164 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.944 104164 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.944 104164 INFO oslo_service.service [-] Starting 1 workers
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.948 104164 DEBUG oslo_service.service [-] Started child 104269 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.952 104164 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpypzj5ghe/privsep.sock']
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.952 104269 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-438163'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.977 104269 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.978 104269 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.978 104269 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.982 104269 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.988 104269 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 01:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:22.995 104269 INFO eventlet.wsgi.server [-] (104269) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 29 01:32:23 np0005539504 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 29 01:32:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:23.705 104164 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 29 01:32:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:23.706 104164 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpypzj5ghe/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 29 01:32:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:23.529 104274 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 01:32:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:23.533 104274 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 01:32:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:23.535 104274 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 29 01:32:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:23.536 104274 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104274
Nov 29 01:32:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:23.708 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[60e31b22-c654-4d44-8183-b78b35d6bf37]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:32:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:24.327 104274 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:32:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:24.328 104274 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:32:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:24.328 104274 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:32:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:24.898 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[f58b91b0-5e18-4d93-87e2-63e8f5787316]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:32:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:24.902 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, column=external_ids, values=({'neutron:ovn-metadata-id': 'c65a5970-e2f5-5bb0-b5cd-c8b5740f7b23'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.037 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.048 104164 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.048 104164 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.048 104164 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.048 104164 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.049 104164 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.049 104164 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.049 104164 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.049 104164 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.050 104164 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.050 104164 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.050 104164 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.051 104164 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.051 104164 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.051 104164 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.051 104164 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.052 104164 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.052 104164 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.052 104164 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.052 104164 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.053 104164 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.053 104164 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.053 104164 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.053 104164 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.054 104164 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.054 104164 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.054 104164 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.055 104164 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.055 104164 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.055 104164 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.055 104164 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.056 104164 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.056 104164 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.056 104164 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.057 104164 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.057 104164 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.057 104164 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.058 104164 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.058 104164 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.058 104164 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.059 104164 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.059 104164 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.059 104164 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.060 104164 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.060 104164 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.060 104164 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.060 104164 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.061 104164 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.061 104164 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.061 104164 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.061 104164 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.062 104164 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.062 104164 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.062 104164 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.062 104164 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.063 104164 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.063 104164 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.063 104164 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.063 104164 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.064 104164 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.064 104164 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.064 104164 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.064 104164 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.065 104164 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.065 104164 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.065 104164 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.065 104164 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.066 104164 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.066 104164 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.066 104164 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.066 104164 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.067 104164 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.067 104164 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.067 104164 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.067 104164 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.068 104164 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.068 104164 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.068 104164 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.068 104164 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.069 104164 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.069 104164 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.069 104164 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.069 104164 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.070 104164 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.070 104164 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.070 104164 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.070 104164 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.070 104164 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.071 104164 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.071 104164 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.071 104164 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.071 104164 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.072 104164 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.072 104164 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.072 104164 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.072 104164 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.073 104164 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.073 104164 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.073 104164 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.073 104164 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.074 104164 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.074 104164 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.074 104164 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.074 104164 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.074 104164 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.075 104164 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.075 104164 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.075 104164 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.075 104164 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.076 104164 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.076 104164 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.076 104164 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.077 104164 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.077 104164 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.077 104164 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.077 104164 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.078 104164 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.078 104164 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.078 104164 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.078 104164 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.079 104164 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.079 104164 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.079 104164 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.079 104164 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.080 104164 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.080 104164 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.080 104164 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.081 104164 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.081 104164 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.081 104164 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.081 104164 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.082 104164 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.082 104164 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.082 104164 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.082 104164 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.082 104164 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.083 104164 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.083 104164 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.083 104164 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.084 104164 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.084 104164 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.084 104164 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.084 104164 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.084 104164 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.085 104164 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.085 104164 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.085 104164 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.085 104164 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.086 104164 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.086 104164 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.086 104164 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.086 104164 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.086 104164 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.087 104164 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.087 104164 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.087 104164 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.087 104164 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.088 104164 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.088 104164 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.088 104164 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.088 104164 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.088 104164 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.089 104164 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.089 104164 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.089 104164 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.089 104164 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.090 104164 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.090 104164 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.090 104164 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.090 104164 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.090 104164 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.091 104164 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.091 104164 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.091 104164 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.092 104164 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.092 104164 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.092 104164 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.092 104164 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.093 104164 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.093 104164 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.093 104164 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.093 104164 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.094 104164 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.094 104164 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.094 104164 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.094 104164 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.095 104164 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.095 104164 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.095 104164 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.095 104164 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.096 104164 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.096 104164 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.096 104164 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.096 104164 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.097 104164 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.097 104164 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.097 104164 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.097 104164 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.098 104164 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.098 104164 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.098 104164 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.098 104164 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.099 104164 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.099 104164 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.099 104164 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.099 104164 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.099 104164 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.099 104164 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.100 104164 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.100 104164 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.100 104164 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.100 104164 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.100 104164 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.100 104164 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.100 104164 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.101 104164 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.101 104164 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.101 104164 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.101 104164 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.101 104164 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.101 104164 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.101 104164 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.101 104164 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.102 104164 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.102 104164 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.102 104164 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.102 104164 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.102 104164 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.102 104164 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.102 104164 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.103 104164 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.103 104164 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.103 104164 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.103 104164 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.103 104164 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.103 104164 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.103 104164 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.103 104164 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.104 104164 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.104 104164 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.104 104164 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.104 104164 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.104 104164 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.104 104164 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.104 104164 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.105 104164 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.105 104164 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.105 104164 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.105 104164 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.105 104164 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.105 104164 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.105 104164 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.106 104164 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.106 104164 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.106 104164 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.106 104164 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.106 104164 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.106 104164 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.106 104164 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.107 104164 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.107 104164 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.107 104164 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.107 104164 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.107 104164 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.107 104164 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.107 104164 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.107 104164 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.108 104164 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.108 104164 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.108 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.108 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.108 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.108 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.109 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.109 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.109 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.109 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.109 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.109 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.109 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.110 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.110 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.110 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.110 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.110 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.110 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.110 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.111 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.111 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.111 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.111 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.111 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.111 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.111 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.111 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.112 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.112 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.112 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.112 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.112 104164 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.112 104164 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.112 104164 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.113 104164 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.113 104164 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:32:25.113 104164 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:32:26 np0005539504 systemd[1]: session-22.scope: Deactivated successfully.
Nov 29 01:32:26 np0005539504 systemd[1]: session-22.scope: Consumed 54.357s CPU time.
Nov 29 01:32:26 np0005539504 systemd-logind[783]: Session 22 logged out. Waiting for processes to exit.
Nov 29 01:32:26 np0005539504 systemd-logind[783]: Removed session 22.
Nov 29 01:32:32 np0005539504 systemd-logind[783]: New session 23 of user zuul.
Nov 29 01:32:32 np0005539504 systemd[1]: Started Session 23 of User zuul.
Nov 29 01:32:32 np0005539504 podman[104283]: 2025-11-29 06:32:32.563721738 +0000 UTC m=+0.157501924 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:32:33 np0005539504 python3.9[104462]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:32:35 np0005539504 python3.9[104618]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:32:37 np0005539504 python3.9[104783]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:32:37 np0005539504 systemd[1]: Reloading.
Nov 29 01:32:37 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:37 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:40 np0005539504 python3.9[104971]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:32:41 np0005539504 network[104988]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:32:41 np0005539504 network[104989]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:32:41 np0005539504 network[104990]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:32:47 np0005539504 python3.9[105251]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:48 np0005539504 python3.9[105406]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:49 np0005539504 python3.9[105559]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:50 np0005539504 python3.9[105712]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:51 np0005539504 podman[105837]: 2025-11-29 06:32:51.384583616 +0000 UTC m=+0.120836976 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 01:32:51 np0005539504 python3.9[105880]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:52 np0005539504 python3.9[106039]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:53 np0005539504 python3.9[106192]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:56 np0005539504 python3.9[106345]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:56 np0005539504 python3.9[106497]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:57 np0005539504 python3.9[106649]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:58 np0005539504 python3.9[106801]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:00 np0005539504 python3.9[106953]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:01 np0005539504 python3.9[107105]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:01 np0005539504 python3.9[107257]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:02 np0005539504 python3.9[107409]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:02 np0005539504 podman[107485]: 2025-11-29 06:33:02.750757745 +0000 UTC m=+0.094433455 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:33:03 np0005539504 python3.9[107585]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:03 np0005539504 python3.9[107737]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:04 np0005539504 python3.9[107889]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:05 np0005539504 python3.9[108042]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:05 np0005539504 python3.9[108194]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:06 np0005539504 python3.9[108346]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:07 np0005539504 python3.9[108498]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:08 np0005539504 python3.9[108650]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:33:09 np0005539504 python3.9[108802]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:33:09 np0005539504 systemd[1]: Reloading.
Nov 29 01:33:09 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:33:09 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:33:10 np0005539504 python3.9[108990]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:11 np0005539504 python3.9[109145]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:11 np0005539504 python3.9[109298]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:12 np0005539504 python3.9[109451]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:13 np0005539504 python3.9[109604]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:13 np0005539504 python3.9[109757]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:14 np0005539504 python3.9[109910]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:16 np0005539504 python3.9[110063]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 29 01:33:17 np0005539504 python3.9[110216]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:33:19 np0005539504 python3.9[110376]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:33:20 np0005539504 python3.9[110536]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:33:21 np0005539504 python3.9[110620]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:33:21 np0005539504 podman[110622]: 2025-11-29 06:33:21.792240998 +0000 UTC m=+0.087803977 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:33:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:33:22.886 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:33:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:33:22.889 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:33:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:33:22.889 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:33:33 np0005539504 podman[110668]: 2025-11-29 06:33:33.856734738 +0000 UTC m=+0.185700553 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:33:52 np0005539504 podman[110859]: 2025-11-29 06:33:52.83924181 +0000 UTC m=+0.089644671 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:33:59 np0005539504 kernel: SELinux:  Converting 2756 SID table entries...
Nov 29 01:33:59 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:33:59 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:33:59 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:33:59 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:33:59 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:33:59 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:33:59 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:34:04 np0005539504 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 29 01:34:04 np0005539504 podman[110891]: 2025-11-29 06:34:04.788623801 +0000 UTC m=+0.252833481 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 29 01:34:10 np0005539504 kernel: SELinux:  Converting 2756 SID table entries...
Nov 29 01:34:10 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:34:10 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:34:10 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:34:10 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:34:10 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:34:10 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:34:10 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:34:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:34:22.888 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:34:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:34:22.894 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:34:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:34:22.894 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:34:23 np0005539504 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 29 01:34:23 np0005539504 podman[111301]: 2025-11-29 06:34:23.771649342 +0000 UTC m=+0.085059416 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:34:35 np0005539504 podman[118507]: 2025-11-29 06:34:35.791647877 +0000 UTC m=+0.127982723 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 01:34:54 np0005539504 podman[127775]: 2025-11-29 06:34:54.760764399 +0000 UTC m=+0.082570701 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 01:35:06 np0005539504 podman[127800]: 2025-11-29 06:35:06.779821675 +0000 UTC m=+0.102135067 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 29 01:35:10 np0005539504 kernel: SELinux:  Converting 2757 SID table entries...
Nov 29 01:35:10 np0005539504 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:35:10 np0005539504 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:35:10 np0005539504 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:35:10 np0005539504 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:35:10 np0005539504 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:35:10 np0005539504 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:35:10 np0005539504 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:35:13 np0005539504 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Nov 29 01:35:13 np0005539504 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 29 01:35:13 np0005539504 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Nov 29 01:35:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:35:22.888 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:35:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:35:22.890 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:35:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:35:22.890 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:35:24 np0005539504 systemd[1]: Stopping OpenSSH server daemon...
Nov 29 01:35:24 np0005539504 systemd[1]: sshd.service: Deactivated successfully.
Nov 29 01:35:24 np0005539504 systemd[1]: sshd.service: Unit process 110917 (sshd-session) remains running after unit stopped.
Nov 29 01:35:24 np0005539504 systemd[1]: Stopped OpenSSH server daemon.
Nov 29 01:35:24 np0005539504 systemd[1]: sshd.service: Consumed 6.080s CPU time, 15.4M memory peak, read 32.0K from disk, written 304.0K to disk.
Nov 29 01:35:24 np0005539504 systemd[1]: Stopped target sshd-keygen.target.
Nov 29 01:35:24 np0005539504 systemd[1]: Stopping sshd-keygen.target...
Nov 29 01:35:24 np0005539504 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:35:24 np0005539504 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:35:24 np0005539504 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:35:24 np0005539504 systemd[1]: Reached target sshd-keygen.target.
Nov 29 01:35:24 np0005539504 systemd[1]: Starting OpenSSH server daemon...
Nov 29 01:35:24 np0005539504 systemd[1]: Started OpenSSH server daemon.
Nov 29 01:35:24 np0005539504 podman[128684]: 2025-11-29 06:35:24.940209855 +0000 UTC m=+0.093529077 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:35:26 np0005539504 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:35:26 np0005539504 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:35:26 np0005539504 systemd[1]: Reloading.
Nov 29 01:35:26 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:35:26 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:35:26 np0005539504 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:35:36 np0005539504 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:35:36 np0005539504 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:35:36 np0005539504 systemd[1]: man-db-cache-update.service: Consumed 10.772s CPU time.
Nov 29 01:35:36 np0005539504 systemd[1]: run-reb2d8458c31d42dc8d3930bab1cc974b.service: Deactivated successfully.
Nov 29 01:35:37 np0005539504 podman[137278]: 2025-11-29 06:35:37.770993028 +0000 UTC m=+0.106412413 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:35:55 np0005539504 podman[137307]: 2025-11-29 06:35:55.728499491 +0000 UTC m=+0.064140688 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:36:08 np0005539504 podman[137328]: 2025-11-29 06:36:08.750423793 +0000 UTC m=+0.092218973 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 01:36:16 np0005539504 python3.9[137481]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:16 np0005539504 systemd[1]: Reloading.
Nov 29 01:36:16 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:16 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:17 np0005539504 python3.9[137670]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:17 np0005539504 systemd[1]: Reloading.
Nov 29 01:36:17 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:17 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:18 np0005539504 python3.9[137859]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:18 np0005539504 systemd[1]: Reloading.
Nov 29 01:36:18 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:18 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:20 np0005539504 python3.9[138049]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:20 np0005539504 systemd[1]: Reloading.
Nov 29 01:36:20 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:20 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:21 np0005539504 python3.9[138239]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:21 np0005539504 systemd[1]: Reloading.
Nov 29 01:36:21 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:21 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:22 np0005539504 python3.9[138428]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:22 np0005539504 systemd[1]: Reloading.
Nov 29 01:36:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:36:22.890 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:36:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:36:22.892 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:36:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:36:22.892 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:36:22 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:22 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:23 np0005539504 python3.9[138619]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:24 np0005539504 systemd[1]: Reloading.
Nov 29 01:36:24 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:24 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:25 np0005539504 python3.9[138808]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:25 np0005539504 python3.9[138963]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:26 np0005539504 systemd[1]: Reloading.
Nov 29 01:36:26 np0005539504 podman[138965]: 2025-11-29 06:36:26.039100921 +0000 UTC m=+0.074051638 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 29 01:36:26 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:26 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:30 np0005539504 python3.9[139174]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:30 np0005539504 systemd[1]: Reloading.
Nov 29 01:36:30 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:30 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:30 np0005539504 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 29 01:36:30 np0005539504 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 29 01:36:31 np0005539504 python3.9[139369]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:32 np0005539504 python3.9[139524]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:33 np0005539504 python3.9[139679]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:34 np0005539504 python3.9[139836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:35 np0005539504 python3.9[139991]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:36 np0005539504 python3.9[140146]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:37 np0005539504 python3.9[140303]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:38 np0005539504 python3.9[140458]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:39 np0005539504 podman[140585]: 2025-11-29 06:36:39.243476241 +0000 UTC m=+0.116188625 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 01:36:39 np0005539504 python3.9[140623]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:40 np0005539504 python3.9[140794]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:41 np0005539504 python3.9[140949]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:42 np0005539504 python3.9[141104]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:42 np0005539504 python3.9[141259]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:43 np0005539504 python3.9[141414]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:44 np0005539504 python3.9[141569]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:45 np0005539504 python3.9[141721]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:46 np0005539504 python3.9[141873]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:46 np0005539504 python3.9[142025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:47 np0005539504 python3.9[142179]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:48 np0005539504 python3.9[142331]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:49 np0005539504 python3.9[142483]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:49 np0005539504 python3.9[142608]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398208.4179704-1628-87958953827451/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:50 np0005539504 python3.9[142760]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:51 np0005539504 python3.9[142885]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398209.9951613-1628-244781421792280/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:51 np0005539504 python3.9[143037]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:52 np0005539504 python3.9[143162]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398211.4101024-1628-212666755149165/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:53 np0005539504 python3.9[143314]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:53 np0005539504 python3.9[143439]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398212.7126813-1628-116706285043679/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:54 np0005539504 python3.9[143591]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:55 np0005539504 python3.9[143716]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398213.9731379-1628-138572105101942/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:55 np0005539504 python3.9[143868]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:56 np0005539504 python3.9[143993]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398215.4053714-1628-243895625236595/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:56 np0005539504 podman[143994]: 2025-11-29 06:36:56.538829136 +0000 UTC m=+0.057920851 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 29 01:36:57 np0005539504 python3.9[144164]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:57 np0005539504 python3.9[144287]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398216.6241462-1628-128658497627387/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:58 np0005539504 python3.9[144439]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:59 np0005539504 python3.9[144564]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398217.8275979-1628-222708039842733/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:59 np0005539504 python3.9[144716]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 29 01:37:00 np0005539504 python3.9[144869]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:01 np0005539504 python3.9[145021]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:02 np0005539504 python3.9[145173]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:03 np0005539504 python3.9[145325]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:03 np0005539504 python3.9[145477]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:04 np0005539504 python3.9[145629]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:04 np0005539504 python3.9[145781]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:05 np0005539504 python3.9[145933]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:06 np0005539504 python3.9[146085]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:06 np0005539504 python3.9[146237]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:07 np0005539504 python3.9[146389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:08 np0005539504 python3.9[146543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:08 np0005539504 python3.9[146695]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:09 np0005539504 podman[146819]: 2025-11-29 06:37:09.612755963 +0000 UTC m=+0.129768373 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:37:09 np0005539504 python3.9[146866]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:10 np0005539504 python3.9[147024]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:11 np0005539504 python3.9[147147]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398229.9936323-2291-48844304472671/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:11 np0005539504 python3.9[147299]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:12 np0005539504 python3.9[147422]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398231.3651388-2291-184425558152870/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:13 np0005539504 python3.9[147574]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:13 np0005539504 python3.9[147697]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398232.7230005-2291-279874519514892/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:14 np0005539504 python3.9[147849]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:14 np0005539504 python3.9[147972]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398233.9763782-2291-50725374556897/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:15 np0005539504 python3.9[148124]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:16 np0005539504 python3.9[148247]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398235.141005-2291-234871501748291/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:16 np0005539504 python3.9[148399]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:17 np0005539504 python3.9[148522]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398236.2661283-2291-148999399539698/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:17 np0005539504 python3.9[148674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:18 np0005539504 python3.9[148797]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398237.4189913-2291-151804355495189/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:19 np0005539504 python3.9[148949]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:19 np0005539504 python3.9[149072]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398238.648234-2291-79862849964342/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:20 np0005539504 python3.9[149224]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:20 np0005539504 python3.9[149347]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398239.7890015-2291-224882018814867/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:21 np0005539504 python3.9[149499]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:22 np0005539504 python3.9[149622]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398241.0272105-2291-2783757196560/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:22 np0005539504 python3.9[149774]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:37:22.891 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:37:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:37:22.892 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:37:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:37:22.893 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:37:23 np0005539504 python3.9[149897]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398242.2239351-2291-177448860373944/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:23 np0005539504 python3.9[150049]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:24 np0005539504 python3.9[150172]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398243.348347-2291-103045181353512/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:25 np0005539504 python3.9[150324]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:25 np0005539504 python3.9[150447]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398244.505256-2291-69552803065518/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:26 np0005539504 python3.9[150599]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:26 np0005539504 podman[150636]: 2025-11-29 06:37:26.707492415 +0000 UTC m=+0.053246508 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:37:27 np0005539504 python3.9[150740]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398245.9509013-2291-264813020557252/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:27 np0005539504 python3.9[150890]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:37:29 np0005539504 python3.9[151047]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 29 01:37:35 np0005539504 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 29 01:37:36 np0005539504 python3.9[151203]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:36 np0005539504 python3.9[151355]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:37 np0005539504 python3.9[151507]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:38 np0005539504 python3.9[151659]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:38 np0005539504 python3.9[151811]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:39 np0005539504 podman[151836]: 2025-11-29 06:37:39.824059253 +0000 UTC m=+0.149116327 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:37:41 np0005539504 python3.9[151990]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:41 np0005539504 python3.9[152142]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:42 np0005539504 python3.9[152294]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:43 np0005539504 python3.9[152448]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:44 np0005539504 python3.9[152600]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:45 np0005539504 python3.9[152752]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:45 np0005539504 systemd[1]: Reloading.
Nov 29 01:37:45 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:45 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:46 np0005539504 systemd[1]: Starting libvirt logging daemon socket...
Nov 29 01:37:46 np0005539504 systemd[1]: Listening on libvirt logging daemon socket.
Nov 29 01:37:46 np0005539504 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 29 01:37:46 np0005539504 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 29 01:37:46 np0005539504 systemd[1]: Starting libvirt logging daemon...
Nov 29 01:37:46 np0005539504 systemd[1]: Started libvirt logging daemon.
Nov 29 01:37:47 np0005539504 python3.9[152946]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:47 np0005539504 systemd[1]: Reloading.
Nov 29 01:37:47 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:47 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:47 np0005539504 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 29 01:37:47 np0005539504 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 29 01:37:47 np0005539504 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 29 01:37:47 np0005539504 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 29 01:37:47 np0005539504 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 29 01:37:47 np0005539504 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 29 01:37:47 np0005539504 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 01:37:47 np0005539504 systemd[1]: Started libvirt nodedev daemon.
Nov 29 01:37:48 np0005539504 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 29 01:37:48 np0005539504 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 29 01:37:48 np0005539504 python3.9[153163]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:48 np0005539504 systemd[1]: Reloading.
Nov 29 01:37:48 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:48 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:49 np0005539504 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 29 01:37:49 np0005539504 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 29 01:37:49 np0005539504 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 29 01:37:49 np0005539504 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 29 01:37:49 np0005539504 systemd[1]: Starting libvirt proxy daemon...
Nov 29 01:37:49 np0005539504 systemd[1]: Started libvirt proxy daemon.
Nov 29 01:37:49 np0005539504 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 29 01:37:49 np0005539504 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 29 01:37:50 np0005539504 python3.9[153383]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:50 np0005539504 systemd[1]: Reloading.
Nov 29 01:37:50 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:50 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:50 np0005539504 systemd[1]: Listening on libvirt locking daemon socket.
Nov 29 01:37:50 np0005539504 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 29 01:37:50 np0005539504 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 29 01:37:50 np0005539504 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 29 01:37:50 np0005539504 setroubleshoot[153114]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 92f8b38d-fcac-4d39-946f-a02cb3b2105e
Nov 29 01:37:50 np0005539504 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 29 01:37:50 np0005539504 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 29 01:37:50 np0005539504 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 29 01:37:50 np0005539504 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 29 01:37:50 np0005539504 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 29 01:37:50 np0005539504 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 29 01:37:50 np0005539504 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 01:37:50 np0005539504 setroubleshoot[153114]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 29 01:37:50 np0005539504 setroubleshoot[153114]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 92f8b38d-fcac-4d39-946f-a02cb3b2105e
Nov 29 01:37:50 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:37:50 np0005539504 systemd[1]: Started libvirt QEMU daemon.
Nov 29 01:37:50 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:37:50 np0005539504 setroubleshoot[153114]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 29 01:37:51 np0005539504 python3.9[153599]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:51 np0005539504 systemd[1]: Reloading.
Nov 29 01:37:51 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:51 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:52 np0005539504 systemd[1]: Starting libvirt secret daemon socket...
Nov 29 01:37:52 np0005539504 systemd[1]: Listening on libvirt secret daemon socket.
Nov 29 01:37:52 np0005539504 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 29 01:37:52 np0005539504 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 29 01:37:52 np0005539504 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 29 01:37:52 np0005539504 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 29 01:37:52 np0005539504 systemd[1]: Starting libvirt secret daemon...
Nov 29 01:37:52 np0005539504 systemd[1]: Started libvirt secret daemon.
Nov 29 01:37:53 np0005539504 python3.9[153817]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:54 np0005539504 python3.9[153969]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:37:56 np0005539504 python3.9[154121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:57 np0005539504 podman[154216]: 2025-11-29 06:37:57.0643521 +0000 UTC m=+0.097752609 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:37:57 np0005539504 python3.9[154254]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398275.213535-3326-44752389969596/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:58 np0005539504 python3.9[154416]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:58 np0005539504 python3.9[154568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:59 np0005539504 python3.9[154646]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:00 np0005539504 python3.9[154798]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:00 np0005539504 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 29 01:38:00 np0005539504 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.112s CPU time.
Nov 29 01:38:00 np0005539504 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 29 01:38:00 np0005539504 python3.9[154876]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.o7iu2p8y recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:01 np0005539504 python3.9[155028]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:02 np0005539504 python3.9[155106]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:02 np0005539504 python3.9[155258]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:03 np0005539504 python3[155411]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:38:04 np0005539504 python3.9[155563]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:05 np0005539504 python3.9[155641]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:06 np0005539504 python3.9[155793]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:06 np0005539504 python3.9[155871]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:07 np0005539504 python3.9[156023]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:07 np0005539504 python3.9[156101]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:08 np0005539504 python3.9[156253]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:09 np0005539504 python3.9[156331]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:09 np0005539504 python3.9[156483]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:10 np0005539504 podman[156580]: 2025-11-29 06:38:10.425909985 +0000 UTC m=+0.114541601 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 01:38:10 np0005539504 python3.9[156622]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398289.3050754-3701-135962965122926/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:11 np0005539504 python3.9[156784]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:12 np0005539504 python3.9[156936]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:13 np0005539504 python3.9[157091]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:14 np0005539504 python3.9[157243]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:14 np0005539504 python3.9[157396]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:38:15 np0005539504 python3.9[157550]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:16 np0005539504 python3.9[157705]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:17 np0005539504 python3.9[157857]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:17 np0005539504 python3.9[157980]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398296.5670576-3917-143086622321765/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:18 np0005539504 python3.9[158132]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:19 np0005539504 python3.9[158255]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398298.12439-3963-206649454330701/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:19 np0005539504 python3.9[158407]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:20 np0005539504 python3.9[158530]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398299.4280138-4007-64508673556477/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:21 np0005539504 python3.9[158682]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:38:21 np0005539504 systemd[1]: Reloading.
Nov 29 01:38:21 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:21 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:21 np0005539504 systemd[1]: Reached target edpm_libvirt.target.
Nov 29 01:38:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:38:22.892 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:38:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:38:22.893 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:38:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:38:22.893 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:38:22 np0005539504 python3.9[158873]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:38:22 np0005539504 systemd[1]: Reloading.
Nov 29 01:38:23 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:23 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:23 np0005539504 systemd[1]: Reloading.
Nov 29 01:38:23 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:23 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:24 np0005539504 systemd[1]: session-23.scope: Deactivated successfully.
Nov 29 01:38:24 np0005539504 systemd[1]: session-23.scope: Consumed 3min 42.645s CPU time.
Nov 29 01:38:24 np0005539504 systemd-logind[783]: Session 23 logged out. Waiting for processes to exit.
Nov 29 01:38:24 np0005539504 systemd-logind[783]: Removed session 23.
Nov 29 01:38:27 np0005539504 podman[158972]: 2025-11-29 06:38:27.733275294 +0000 UTC m=+0.067377044 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 01:38:30 np0005539504 systemd-logind[783]: New session 24 of user zuul.
Nov 29 01:38:30 np0005539504 systemd[1]: Started Session 24 of User zuul.
Nov 29 01:38:31 np0005539504 python3.9[159149]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:38:32 np0005539504 python3.9[159303]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:38:32 np0005539504 network[159320]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:38:32 np0005539504 network[159321]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:38:32 np0005539504 network[159322]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:38:37 np0005539504 python3.9[159593]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:38:38 np0005539504 python3.9[159677]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:38:40 np0005539504 podman[159681]: 2025-11-29 06:38:40.791706253 +0000 UTC m=+0.129292848 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 01:38:45 np0005539504 python3.9[159859]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:38:46 np0005539504 python3.9[160011]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:47 np0005539504 python3.9[160164]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:38:47 np0005539504 python3.9[160318]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:48 np0005539504 python3.9[160471]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:49 np0005539504 python3.9[160594]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398328.2101161-251-219559428094332/.source.iscsi _original_basename=.fmi47c44 follow=False checksum=6265dce726b8ec925e1fe43f99cfec4137e10adb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:50 np0005539504 python3.9[160746]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:51 np0005539504 python3.9[160898]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:52 np0005539504 python3.9[161050]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:38:52 np0005539504 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 29 01:38:53 np0005539504 python3.9[161206]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:38:53 np0005539504 systemd[1]: Reloading.
Nov 29 01:38:53 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:53 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:53 np0005539504 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 01:38:53 np0005539504 systemd[1]: Starting Open-iSCSI...
Nov 29 01:38:53 np0005539504 kernel: Loading iSCSI transport class v2.0-870.
Nov 29 01:38:53 np0005539504 systemd[1]: Started Open-iSCSI.
Nov 29 01:38:53 np0005539504 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 29 01:38:53 np0005539504 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 29 01:38:55 np0005539504 python3.9[161409]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:38:55 np0005539504 network[161426]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:38:55 np0005539504 network[161427]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:38:55 np0005539504 network[161428]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:38:57 np0005539504 podman[161518]: 2025-11-29 06:38:57.910255583 +0000 UTC m=+0.112373697 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:38:59 np0005539504 python3.9[161720]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:39:00 np0005539504 python3.9[161872]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 29 01:39:01 np0005539504 python3.9[162028]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:01 np0005539504 python3.9[162151]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398340.7822952-482-26475566901268/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:02 np0005539504 python3.9[162303]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:03 np0005539504 python3.9[162455]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:39:03 np0005539504 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 01:39:03 np0005539504 systemd[1]: Stopped Load Kernel Modules.
Nov 29 01:39:03 np0005539504 systemd[1]: Stopping Load Kernel Modules...
Nov 29 01:39:03 np0005539504 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:39:03 np0005539504 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:39:04 np0005539504 python3.9[162611]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:05 np0005539504 python3.9[162763]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:06 np0005539504 python3.9[162915]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:06 np0005539504 python3.9[163067]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:07 np0005539504 python3.9[163190]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398346.3268998-656-226423603875048/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:08 np0005539504 python3.9[163342]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:39:08 np0005539504 python3.9[163497]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:09 np0005539504 python3.9[163651]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:10 np0005539504 python3.9[163803]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:11 np0005539504 podman[163927]: 2025-11-29 06:39:11.280694012 +0000 UTC m=+0.118933333 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:39:11 np0005539504 python3.9[163976]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:12 np0005539504 python3.9[164135]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:12 np0005539504 python3.9[164287]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:13 np0005539504 python3.9[164439]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:14 np0005539504 python3.9[164593]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:15 np0005539504 python3.9[164747]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:15 np0005539504 python3.9[164899]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:16 np0005539504 python3.9[165051]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:17 np0005539504 python3.9[165129]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:17 np0005539504 python3.9[165281]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:18 np0005539504 python3.9[165359]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:19 np0005539504 python3.9[165511]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:19 np0005539504 python3.9[165663]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:20 np0005539504 python3.9[165741]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:21 np0005539504 python3.9[165893]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:21 np0005539504 python3.9[165971]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:22 np0005539504 python3.9[166123]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:22 np0005539504 systemd[1]: Reloading.
Nov 29 01:39:22 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:22 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:39:22.893 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:39:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:39:22.894 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:39:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:39:22.894 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:39:23 np0005539504 python3.9[166312]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:23 np0005539504 python3.9[166390]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:24 np0005539504 python3.9[166542]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:25 np0005539504 python3.9[166620]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:26 np0005539504 python3.9[166772]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:26 np0005539504 systemd[1]: Reloading.
Nov 29 01:39:26 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:26 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:26 np0005539504 systemd[1]: Starting Create netns directory...
Nov 29 01:39:26 np0005539504 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:39:26 np0005539504 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:39:26 np0005539504 systemd[1]: Finished Create netns directory.
Nov 29 01:39:27 np0005539504 python3.9[166966]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:28 np0005539504 podman[167090]: 2025-11-29 06:39:28.120573105 +0000 UTC m=+0.084328969 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:39:28 np0005539504 python3.9[167130]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:28 np0005539504 python3.9[167259]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398367.7344697-1277-208714004243903/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:29 np0005539504 python3.9[167411]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:30 np0005539504 python3.9[167563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:31 np0005539504 python3.9[167688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398370.1386154-1352-87943356511856/.source.json _original_basename=.6akagnh2 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:32 np0005539504 python3.9[167840]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:34 np0005539504 python3.9[168267]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 29 01:39:35 np0005539504 python3.9[168421]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:39:36 np0005539504 python3.9[168573]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:39:38 np0005539504 python3[168752]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:39:38 np0005539504 podman[168789]: 2025-11-29 06:39:38.886810358 +0000 UTC m=+0.089748743 container create 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:39:38 np0005539504 podman[168789]: 2025-11-29 06:39:38.842017694 +0000 UTC m=+0.044956129 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:39:38 np0005539504 python3[168752]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:39:39 np0005539504 python3.9[168979]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:40 np0005539504 python3.9[169133]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:41 np0005539504 python3.9[169209]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:41 np0005539504 podman[169262]: 2025-11-29 06:39:41.791214762 +0000 UTC m=+0.125019506 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 29 01:39:42 np0005539504 python3.9[169388]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398381.4847908-1616-139308766252172/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:42 np0005539504 python3.9[169464]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:39:42 np0005539504 systemd[1]: Reloading.
Nov 29 01:39:42 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:42 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:43 np0005539504 python3.9[169575]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:43 np0005539504 systemd[1]: Reloading.
Nov 29 01:39:43 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:43 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:44 np0005539504 systemd[1]: Starting multipathd container...
Nov 29 01:39:44 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:39:44 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3c17d81b6252c103841773e09792f70ab04ef14298b447a59f04908d292395e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:39:44 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3c17d81b6252c103841773e09792f70ab04ef14298b447a59f04908d292395e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:39:44 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d.
Nov 29 01:39:44 np0005539504 podman[169615]: 2025-11-29 06:39:44.309020492 +0000 UTC m=+0.127463316 container init 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 01:39:44 np0005539504 multipathd[169630]: + sudo -E kolla_set_configs
Nov 29 01:39:44 np0005539504 podman[169615]: 2025-11-29 06:39:44.347883637 +0000 UTC m=+0.166326471 container start 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd)
Nov 29 01:39:44 np0005539504 multipathd[169630]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:39:44 np0005539504 multipathd[169630]: INFO:__main__:Validating config file
Nov 29 01:39:44 np0005539504 multipathd[169630]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:39:44 np0005539504 multipathd[169630]: INFO:__main__:Writing out command to execute
Nov 29 01:39:44 np0005539504 multipathd[169630]: ++ cat /run_command
Nov 29 01:39:44 np0005539504 multipathd[169630]: + CMD='/usr/sbin/multipathd -d'
Nov 29 01:39:44 np0005539504 multipathd[169630]: + ARGS=
Nov 29 01:39:44 np0005539504 multipathd[169630]: + sudo kolla_copy_cacerts
Nov 29 01:39:44 np0005539504 multipathd[169630]: + [[ ! -n '' ]]
Nov 29 01:39:44 np0005539504 multipathd[169630]: + . kolla_extend_start
Nov 29 01:39:44 np0005539504 multipathd[169630]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 01:39:44 np0005539504 multipathd[169630]: Running command: '/usr/sbin/multipathd -d'
Nov 29 01:39:44 np0005539504 multipathd[169630]: + umask 0022
Nov 29 01:39:44 np0005539504 multipathd[169630]: + exec /usr/sbin/multipathd -d
Nov 29 01:39:44 np0005539504 multipathd[169630]: 3847.400594 | --------start up--------
Nov 29 01:39:44 np0005539504 multipathd[169630]: 3847.400622 | read /etc/multipath.conf
Nov 29 01:39:44 np0005539504 multipathd[169630]: 3847.410092 | path checkers start up
Nov 29 01:39:44 np0005539504 podman[169615]: multipathd
Nov 29 01:39:44 np0005539504 systemd[1]: Started multipathd container.
Nov 29 01:39:44 np0005539504 podman[169637]: 2025-11-29 06:39:44.76888348 +0000 UTC m=+0.402872699 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:39:45 np0005539504 python3.9[169817]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:46 np0005539504 python3.9[169971]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:39:47 np0005539504 python3.9[170136]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:39:47 np0005539504 systemd[1]: Stopping multipathd container...
Nov 29 01:39:47 np0005539504 multipathd[169630]: 3850.144066 | exit (signal)
Nov 29 01:39:47 np0005539504 multipathd[169630]: 3850.145493 | --------shut down-------
Nov 29 01:39:47 np0005539504 systemd[1]: libpod-494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d.scope: Deactivated successfully.
Nov 29 01:39:47 np0005539504 podman[170140]: 2025-11-29 06:39:47.240815835 +0000 UTC m=+0.087103429 container died 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:39:47 np0005539504 systemd[1]: 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d-3656cccd87830e91.timer: Deactivated successfully.
Nov 29 01:39:47 np0005539504 systemd[1]: Stopped /usr/bin/podman healthcheck run 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d.
Nov 29 01:39:47 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d-userdata-shm.mount: Deactivated successfully.
Nov 29 01:39:47 np0005539504 systemd[1]: var-lib-containers-storage-overlay-c3c17d81b6252c103841773e09792f70ab04ef14298b447a59f04908d292395e-merged.mount: Deactivated successfully.
Nov 29 01:39:47 np0005539504 podman[170140]: 2025-11-29 06:39:47.301489951 +0000 UTC m=+0.147777545 container cleanup 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:39:47 np0005539504 podman[170140]: multipathd
Nov 29 01:39:47 np0005539504 podman[170167]: multipathd
Nov 29 01:39:47 np0005539504 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 29 01:39:47 np0005539504 systemd[1]: Stopped multipathd container.
Nov 29 01:39:47 np0005539504 systemd[1]: Starting multipathd container...
Nov 29 01:39:47 np0005539504 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 29 01:39:47 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:39:47 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3c17d81b6252c103841773e09792f70ab04ef14298b447a59f04908d292395e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:39:47 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3c17d81b6252c103841773e09792f70ab04ef14298b447a59f04908d292395e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:39:47 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d.
Nov 29 01:39:47 np0005539504 podman[170180]: 2025-11-29 06:39:47.52684245 +0000 UTC m=+0.125944413 container init 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 01:39:47 np0005539504 multipathd[170194]: + sudo -E kolla_set_configs
Nov 29 01:39:47 np0005539504 podman[170180]: 2025-11-29 06:39:47.560358673 +0000 UTC m=+0.159460636 container start 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:39:47 np0005539504 podman[170180]: multipathd
Nov 29 01:39:47 np0005539504 systemd[1]: Started multipathd container.
Nov 29 01:39:47 np0005539504 multipathd[170194]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:39:47 np0005539504 multipathd[170194]: INFO:__main__:Validating config file
Nov 29 01:39:47 np0005539504 multipathd[170194]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:39:47 np0005539504 multipathd[170194]: INFO:__main__:Writing out command to execute
Nov 29 01:39:47 np0005539504 multipathd[170194]: ++ cat /run_command
Nov 29 01:39:47 np0005539504 multipathd[170194]: + CMD='/usr/sbin/multipathd -d'
Nov 29 01:39:47 np0005539504 multipathd[170194]: + ARGS=
Nov 29 01:39:47 np0005539504 multipathd[170194]: + sudo kolla_copy_cacerts
Nov 29 01:39:47 np0005539504 podman[170201]: 2025-11-29 06:39:47.643744234 +0000 UTC m=+0.063776564 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd)
Nov 29 01:39:47 np0005539504 multipathd[170194]: + [[ ! -n '' ]]
Nov 29 01:39:47 np0005539504 multipathd[170194]: + . kolla_extend_start
Nov 29 01:39:47 np0005539504 multipathd[170194]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 01:39:47 np0005539504 multipathd[170194]: Running command: '/usr/sbin/multipathd -d'
Nov 29 01:39:47 np0005539504 multipathd[170194]: + umask 0022
Nov 29 01:39:47 np0005539504 multipathd[170194]: + exec /usr/sbin/multipathd -d
Nov 29 01:39:47 np0005539504 systemd[1]: 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d-7d9eb0713e908436.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:39:47 np0005539504 systemd[1]: 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d-7d9eb0713e908436.service: Failed with result 'exit-code'.
Nov 29 01:39:47 np0005539504 multipathd[170194]: 3850.598510 | --------start up--------
Nov 29 01:39:47 np0005539504 multipathd[170194]: 3850.598528 | read /etc/multipath.conf
Nov 29 01:39:47 np0005539504 multipathd[170194]: 3850.607494 | path checkers start up
Nov 29 01:39:48 np0005539504 python3.9[170388]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:49 np0005539504 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 01:39:49 np0005539504 python3.9[170541]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:39:50 np0005539504 python3.9[170693]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 29 01:39:50 np0005539504 kernel: Key type psk registered
Nov 29 01:39:50 np0005539504 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 29 01:39:51 np0005539504 python3.9[170857]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:51 np0005539504 python3.9[170980]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398390.6118672-1856-4494438159888/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:52 np0005539504 python3.9[171134]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:52 np0005539504 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 29 01:39:53 np0005539504 python3.9[171287]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:39:53 np0005539504 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 01:39:53 np0005539504 systemd[1]: Stopped Load Kernel Modules.
Nov 29 01:39:53 np0005539504 systemd[1]: Stopping Load Kernel Modules...
Nov 29 01:39:53 np0005539504 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:39:53 np0005539504 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:39:54 np0005539504 python3.9[171443]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:39:57 np0005539504 systemd[1]: Reloading.
Nov 29 01:39:57 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:57 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:57 np0005539504 systemd[1]: Reloading.
Nov 29 01:39:57 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:57 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:57 np0005539504 systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 01:39:57 np0005539504 systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 01:39:58 np0005539504 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:39:58 np0005539504 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:39:58 np0005539504 systemd[1]: Reloading.
Nov 29 01:39:58 np0005539504 podman[171580]: 2025-11-29 06:39:58.243369459 +0000 UTC m=+0.085743519 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:39:58 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:58 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:58 np0005539504 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:39:59 np0005539504 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:39:59 np0005539504 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:39:59 np0005539504 systemd[1]: man-db-cache-update.service: Consumed 1.844s CPU time.
Nov 29 01:39:59 np0005539504 systemd[1]: run-r16b14cab8b5e4c66b17f3141ec84256f.service: Deactivated successfully.
Nov 29 01:40:01 np0005539504 python3.9[172920]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:40:01 np0005539504 systemd[1]: Stopping Open-iSCSI...
Nov 29 01:40:01 np0005539504 iscsid[161246]: iscsid shutting down.
Nov 29 01:40:01 np0005539504 systemd[1]: iscsid.service: Deactivated successfully.
Nov 29 01:40:01 np0005539504 systemd[1]: Stopped Open-iSCSI.
Nov 29 01:40:01 np0005539504 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 01:40:01 np0005539504 systemd[1]: Starting Open-iSCSI...
Nov 29 01:40:01 np0005539504 systemd[1]: Started Open-iSCSI.
Nov 29 01:40:02 np0005539504 python3.9[173077]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:40:03 np0005539504 python3.9[173233]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:04 np0005539504 python3.9[173385]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:40:04 np0005539504 systemd[1]: Reloading.
Nov 29 01:40:04 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:04 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:05 np0005539504 python3.9[173570]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:40:05 np0005539504 network[173588]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:40:05 np0005539504 network[173589]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:40:05 np0005539504 network[173590]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:40:11 np0005539504 python3.9[173864]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:12 np0005539504 podman[173989]: 2025-11-29 06:40:12.101593719 +0000 UTC m=+0.115373762 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:40:12 np0005539504 python3.9[174030]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:13 np0005539504 python3.9[174197]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:13 np0005539504 python3.9[174350]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:14 np0005539504 python3.9[174503]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:15 np0005539504 python3.9[174656]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:16 np0005539504 python3.9[174809]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:17 np0005539504 python3.9[174962]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:18 np0005539504 podman[175087]: 2025-11-29 06:40:18.70088485 +0000 UTC m=+0.054751463 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 01:40:18 np0005539504 python3.9[175136]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:19 np0005539504 python3.9[175288]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:20 np0005539504 python3.9[175440]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:21 np0005539504 python3.9[175592]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:21 np0005539504 python3.9[175744]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:22 np0005539504 python3.9[175896]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:40:22.893 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:40:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:40:22.894 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:40:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:40:22.894 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:40:23 np0005539504 python3.9[176048]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:23 np0005539504 python3.9[176200]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:24 np0005539504 python3.9[176352]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:25 np0005539504 python3.9[176506]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:25 np0005539504 python3.9[176658]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:26 np0005539504 python3.9[176810]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:26 np0005539504 python3.9[176962]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:27 np0005539504 python3.9[177114]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:28 np0005539504 python3.9[177268]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:28 np0005539504 podman[177343]: 2025-11-29 06:40:28.745219343 +0000 UTC m=+0.064923646 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 01:40:29 np0005539504 python3.9[177439]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:29 np0005539504 python3.9[177591]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:30 np0005539504 python3.9[177743]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:40:31 np0005539504 python3.9[177895]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:40:31 np0005539504 systemd[1]: Reloading.
Nov 29 01:40:32 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:32 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:32 np0005539504 python3.9[178081]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:33 np0005539504 python3.9[178234]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:34 np0005539504 python3.9[178387]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:34 np0005539504 python3.9[178540]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:35 np0005539504 python3.9[178693]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:36 np0005539504 python3.9[178846]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:37 np0005539504 python3.9[178999]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:37 np0005539504 python3.9[179152]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:39 np0005539504 python3.9[179305]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:40 np0005539504 python3.9[179457]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:40 np0005539504 python3.9[179609]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:41 np0005539504 python3.9[179763]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:42 np0005539504 python3.9[179915]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:42 np0005539504 podman[179916]: 2025-11-29 06:40:42.392367952 +0000 UTC m=+0.123944112 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 01:40:42 np0005539504 python3.9[180091]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:43 np0005539504 python3.9[180243]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:44 np0005539504 python3.9[180395]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:45 np0005539504 python3.9[180547]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:45 np0005539504 python3.9[180699]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:49 np0005539504 podman[180724]: 2025-11-29 06:40:49.756745653 +0000 UTC m=+0.091659939 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:40:50 np0005539504 python3.9[180871]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 29 01:40:51 np0005539504 python3.9[181024]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:40:52 np0005539504 python3.9[181182]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:40:54 np0005539504 systemd-logind[783]: New session 25 of user zuul.
Nov 29 01:40:54 np0005539504 systemd[1]: Started Session 25 of User zuul.
Nov 29 01:40:54 np0005539504 systemd[1]: session-25.scope: Deactivated successfully.
Nov 29 01:40:54 np0005539504 systemd-logind[783]: Session 25 logged out. Waiting for processes to exit.
Nov 29 01:40:54 np0005539504 systemd-logind[783]: Removed session 25.
Nov 29 01:40:55 np0005539504 python3.9[181370]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:55 np0005539504 python3.9[181491]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398454.7135806-3419-27447731212563/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:56 np0005539504 python3.9[181641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:56 np0005539504 python3.9[181717]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:57 np0005539504 python3.9[181867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:58 np0005539504 python3.9[181988]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398457.0297604-3419-71605736849837/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:58 np0005539504 python3.9[182138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:59 np0005539504 podman[182233]: 2025-11-29 06:40:59.375130531 +0000 UTC m=+0.075312803 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:40:59 np0005539504 python3.9[182266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398458.3770506-3419-222162240486227/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:00 np0005539504 python3.9[182427]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:00 np0005539504 python3.9[182548]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398459.6679175-3419-278677322933688/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:01 np0005539504 python3.9[182698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:01 np0005539504 python3.9[182819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398460.8378744-3419-34367578003378/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:02 np0005539504 python3.9[182971]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:03 np0005539504 python3.9[183123]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:04 np0005539504 python3.9[183275]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:04 np0005539504 python3.9[183427]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:05 np0005539504 python3.9[183550]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764398464.300832-3740-136440451247390/.source _original_basename=.y8qcj365 follow=False checksum=7a2161c5b01923e4ed0fc06f5c09c7be3927b431 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 29 01:41:06 np0005539504 python3.9[183702]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:06 np0005539504 python3.9[183856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:07 np0005539504 python3.9[183977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398466.5501895-3819-275891118063704/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:08 np0005539504 python3.9[184127]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:08 np0005539504 python3.9[184248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398467.77232-3863-131621593324221/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:09 np0005539504 python3.9[184400]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 29 01:41:10 np0005539504 python3.9[184552]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:41:11 np0005539504 python3[184704]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:41:11 np0005539504 podman[184743]: 2025-11-29 06:41:11.904302791 +0000 UTC m=+0.063499998 container create fa7d96ac31f925d5aa2456f311d13906127c23116664da29943af114b13a2106 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3)
Nov 29 01:41:11 np0005539504 podman[184743]: 2025-11-29 06:41:11.869178977 +0000 UTC m=+0.028376224 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:41:11 np0005539504 python3[184704]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 29 01:41:12 np0005539504 podman[184905]: 2025-11-29 06:41:12.767555821 +0000 UTC m=+0.163135867 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:41:12 np0005539504 python3.9[184950]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:14 np0005539504 python3.9[185111]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 29 01:41:14 np0005539504 python3.9[185263]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:41:15 np0005539504 python3[185415]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:41:16 np0005539504 podman[185450]: 2025-11-29 06:41:16.179665535 +0000 UTC m=+0.047900624 container create 0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, container_name=nova_compute, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:41:16 np0005539504 podman[185450]: 2025-11-29 06:41:16.154126766 +0000 UTC m=+0.022361895 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:41:16 np0005539504 python3[185415]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 29 01:41:17 np0005539504 python3.9[185640]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:18 np0005539504 python3.9[185794]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:19 np0005539504 python3.9[185945]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398478.5660489-4139-243989229094736/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:19 np0005539504 python3.9[186021]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:19 np0005539504 systemd[1]: Reloading.
Nov 29 01:41:20 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:20 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:20 np0005539504 podman[186057]: 2025-11-29 06:41:20.373133642 +0000 UTC m=+0.069398596 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 01:41:20 np0005539504 python3.9[186152]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:20 np0005539504 systemd[1]: Reloading.
Nov 29 01:41:21 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:21 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:21 np0005539504 systemd[1]: Starting nova_compute container...
Nov 29 01:41:21 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:41:21 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539504 podman[186191]: 2025-11-29 06:41:21.338153216 +0000 UTC m=+0.095543389 container init 0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 29 01:41:21 np0005539504 podman[186191]: 2025-11-29 06:41:21.356100363 +0000 UTC m=+0.113490436 container start 0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 29 01:41:21 np0005539504 podman[186191]: nova_compute
Nov 29 01:41:21 np0005539504 nova_compute[186207]: + sudo -E kolla_set_configs
Nov 29 01:41:21 np0005539504 systemd[1]: Started nova_compute container.
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Validating config file
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Copying service configuration files
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Deleting /etc/ceph
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Creating directory /etc/ceph
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Writing out command to execute
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:21 np0005539504 nova_compute[186207]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:41:21 np0005539504 nova_compute[186207]: ++ cat /run_command
Nov 29 01:41:21 np0005539504 nova_compute[186207]: + CMD=nova-compute
Nov 29 01:41:21 np0005539504 nova_compute[186207]: + ARGS=
Nov 29 01:41:21 np0005539504 nova_compute[186207]: + sudo kolla_copy_cacerts
Nov 29 01:41:21 np0005539504 nova_compute[186207]: + [[ ! -n '' ]]
Nov 29 01:41:21 np0005539504 nova_compute[186207]: + . kolla_extend_start
Nov 29 01:41:21 np0005539504 nova_compute[186207]: Running command: 'nova-compute'
Nov 29 01:41:21 np0005539504 nova_compute[186207]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 01:41:21 np0005539504 nova_compute[186207]: + umask 0022
Nov 29 01:41:21 np0005539504 nova_compute[186207]: + exec nova-compute
Nov 29 01:41:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:41:22.894 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:41:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:41:22.895 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:41:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:41:22.895 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:41:23 np0005539504 python3.9[186369]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:23 np0005539504 nova_compute[186207]: 2025-11-29 06:41:23.583 186211 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:23 np0005539504 nova_compute[186207]: 2025-11-29 06:41:23.584 186211 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:23 np0005539504 nova_compute[186207]: 2025-11-29 06:41:23.584 186211 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:23 np0005539504 nova_compute[186207]: 2025-11-29 06:41:23.585 186211 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 29 01:41:23 np0005539504 nova_compute[186207]: 2025-11-29 06:41:23.726 186211 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:41:23 np0005539504 nova_compute[186207]: 2025-11-29 06:41:23.751 186211 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:41:23 np0005539504 nova_compute[186207]: 2025-11-29 06:41:23.752 186211 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.285 186211 INFO nova.virt.driver [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 29 01:41:24 np0005539504 python3.9[186523]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.401 186211 INFO nova.compute.provider_config [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.417 186211 DEBUG oslo_concurrency.lockutils [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.417 186211 DEBUG oslo_concurrency.lockutils [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.418 186211 DEBUG oslo_concurrency.lockutils [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.418 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.419 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.419 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.419 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.420 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.420 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.420 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.420 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.421 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.421 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.421 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.422 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.422 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.422 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.423 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.423 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.423 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.423 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.424 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.424 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.424 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.425 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.425 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.425 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.425 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.426 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.426 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.426 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.426 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.427 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.427 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.428 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.428 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.428 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.428 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.428 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.429 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.429 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.429 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.430 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.430 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.430 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.430 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.431 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.431 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.431 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.431 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.432 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.432 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.432 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.432 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.433 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.433 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.433 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.433 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.434 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.434 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.434 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.434 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.435 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.435 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.435 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.435 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.435 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.436 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.436 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.436 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.436 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.437 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.437 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.437 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.437 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.438 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.438 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.438 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.438 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.439 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.439 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.439 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.440 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.440 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.440 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.440 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.441 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.441 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.441 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.441 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.442 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.442 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.442 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.442 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.443 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.443 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.443 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.443 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.444 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.444 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.444 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.444 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.445 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.445 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.445 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.445 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.446 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.446 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.446 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.446 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.447 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.447 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.447 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.447 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.448 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.448 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.448 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.448 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.449 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.449 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.449 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.449 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.450 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.450 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.450 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.450 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.451 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.451 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.451 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.451 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.452 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.452 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.452 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.452 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.453 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.453 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.453 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.453 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.453 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.454 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.454 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.454 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.454 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.455 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.455 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.455 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.455 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.456 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.456 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.456 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.456 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.457 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.457 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.457 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.457 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.458 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.458 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.458 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.458 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.459 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.459 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.459 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.459 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.460 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.460 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.460 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.460 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.461 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.461 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.461 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.462 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.462 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.462 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.463 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.463 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.463 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.464 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.464 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.464 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.464 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.464 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.464 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.465 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.465 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.466 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.466 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.466 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.466 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.467 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.467 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.467 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.467 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.468 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.468 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.468 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.468 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.469 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.469 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.469 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.469 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.470 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.470 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.470 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.471 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.471 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.471 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.471 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.472 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.472 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.472 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.472 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.473 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.473 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.473 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.473 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.474 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.474 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.474 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.474 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.475 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.475 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.475 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.475 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.476 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.476 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.476 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.476 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.477 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.477 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.477 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.477 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.478 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.478 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.478 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.478 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.479 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.479 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.479 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.479 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.480 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.480 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.480 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.480 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.481 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.481 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.481 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.482 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.482 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.482 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.482 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.483 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.483 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.483 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.483 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.484 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.484 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.484 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.484 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.485 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.485 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.485 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.485 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.486 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.486 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.486 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.487 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.487 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.487 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.487 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.488 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.488 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.488 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.488 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.489 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.489 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.489 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.489 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.490 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.490 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.490 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.490 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.491 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.491 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.491 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.491 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.492 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.492 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.492 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.492 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.493 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.493 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.493 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.493 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.493 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.494 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.494 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.494 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.495 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.495 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.495 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.495 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.496 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.496 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.496 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.496 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.497 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.497 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.497 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.498 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.498 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.498 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.498 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.499 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.499 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.499 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.499 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.500 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.500 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.500 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.500 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.501 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.501 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.501 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.501 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.502 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.502 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.502 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.502 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.503 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.503 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.503 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.503 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.504 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.504 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.504 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.504 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.504 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.505 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.505 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.505 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.506 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.506 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.506 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.506 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.507 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.507 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.507 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.508 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.508 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.508 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.508 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.509 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.509 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.509 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.509 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.510 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.510 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.510 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.511 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.511 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.511 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.511 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.512 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.512 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.512 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.512 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.513 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.513 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.513 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.514 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.514 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.514 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.514 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.515 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.515 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.515 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.516 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.516 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.516 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.516 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.517 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.517 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.517 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.517 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.518 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.518 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.518 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.519 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.519 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.519 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.519 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.520 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.520 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.520 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.521 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.521 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.521 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.521 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.522 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.522 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.522 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.522 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.523 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.523 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.523 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.523 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.524 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.524 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.524 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.524 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.525 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.525 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.525 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.526 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.526 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.526 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.526 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.527 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.527 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.527 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.527 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.528 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.528 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.528 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.528 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.529 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.529 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.529 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.529 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.530 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.530 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.530 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.530 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.531 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.531 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.531 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.531 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.532 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.532 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.532 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.532 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.533 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.533 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.533 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.533 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.534 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.534 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.534 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.534 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.534 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.535 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.535 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.535 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.536 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.536 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.536 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.536 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.536 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.537 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.537 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.537 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.537 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.537 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.538 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.538 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.538 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.538 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.538 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.539 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.539 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.539 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.539 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.539 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.540 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.540 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.540 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.540 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.540 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.541 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.541 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.541 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.541 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.541 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.541 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.542 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.542 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.542 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.542 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.542 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.543 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.543 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.543 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.543 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.543 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.544 186211 WARNING oslo_config.cfg [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 01:41:24 np0005539504 nova_compute[186207]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 01:41:24 np0005539504 nova_compute[186207]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 01:41:24 np0005539504 nova_compute[186207]: and ``live_migration_inbound_addr`` respectively.
Nov 29 01:41:24 np0005539504 nova_compute[186207]: ).  Its value may be silently ignored in the future.#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.544 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.544 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.544 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.545 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.545 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.545 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.545 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.545 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.546 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.546 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.546 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.546 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.546 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.547 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.547 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.547 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.547 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.547 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.547 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.548 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.548 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.548 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.548 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.548 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.549 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.549 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.549 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.549 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.549 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.550 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.550 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.550 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.550 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.550 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.551 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.551 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.551 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.551 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.551 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.552 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.552 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.552 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.552 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.552 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.553 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.553 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.553 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.553 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.553 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.553 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.554 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.554 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.554 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.554 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.554 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.555 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.555 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.555 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.555 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.555 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.555 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.556 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.556 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.556 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.556 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.556 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.556 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.557 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.557 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.557 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.557 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.557 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.558 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.558 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.558 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.558 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.558 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.558 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.559 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.559 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.559 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.559 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.559 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.560 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.560 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.560 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.560 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.560 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.561 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.561 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.561 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.561 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.561 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.561 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.562 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.562 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.562 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.562 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.562 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.562 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.563 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.563 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.563 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.563 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.563 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.564 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.564 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.564 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.564 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.564 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.565 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.565 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.565 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.565 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.565 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.566 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.566 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.566 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.566 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.566 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.567 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.567 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.567 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.567 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.567 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.568 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.568 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.568 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.568 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.568 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.568 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.569 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.569 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.569 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.569 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.569 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.570 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.570 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.570 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.570 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.571 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.571 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.571 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.571 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.571 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.572 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.572 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.572 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.572 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.573 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.573 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.573 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.573 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.574 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.574 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.574 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.574 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.574 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.575 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.575 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.575 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.575 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.575 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.576 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.576 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.576 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.576 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.576 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.576 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.577 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.577 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.577 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.577 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.577 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.578 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.578 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.578 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.578 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.578 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.578 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.579 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.579 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.579 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.579 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.579 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.579 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.579 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.580 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.580 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.580 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.580 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.580 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.580 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.580 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.581 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.581 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.581 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.581 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.581 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.581 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.582 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.582 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.582 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.582 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.582 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.582 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.582 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.583 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.583 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.583 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.583 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.583 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.583 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.583 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.584 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.584 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.584 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.584 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.584 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.584 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.584 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.585 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.585 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.585 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.585 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.585 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.585 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.585 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.586 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.586 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.586 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.586 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.586 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.586 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.586 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.587 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.587 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.587 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.587 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.587 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.587 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.587 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.588 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.588 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.588 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.588 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.588 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.588 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.588 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.589 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.589 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.589 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.589 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.589 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.589 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.590 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.590 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.590 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.590 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.590 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.590 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.590 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.591 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.591 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.591 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.591 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.591 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.591 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.591 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.592 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.592 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.592 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.592 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.592 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.592 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.592 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.593 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.593 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.593 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.593 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.593 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.593 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.593 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.594 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.594 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.594 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.594 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.594 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.594 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.594 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.595 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.595 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.595 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.595 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.595 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.595 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.595 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.596 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.596 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.596 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.596 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.596 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.596 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.596 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.597 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.597 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.597 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.597 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.597 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.597 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.597 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.598 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.598 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.598 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.598 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.598 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.598 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.599 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.599 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.599 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.599 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.599 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.599 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.599 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.600 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.600 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.600 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.600 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.600 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.600 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.600 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.600 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.601 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.601 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.601 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.601 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.601 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.601 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.602 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.602 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.602 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.602 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.602 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.602 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.602 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.603 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.603 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.603 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.603 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.603 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.603 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.603 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.603 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.604 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.604 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.604 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.604 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.604 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.604 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.604 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.605 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.605 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.605 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.605 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.605 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.605 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.605 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.606 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.606 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.606 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.606 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.606 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.606 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.606 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.607 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.607 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.607 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.607 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.607 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.607 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.607 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.607 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.608 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.608 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.608 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.608 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.608 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.608 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.608 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.609 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.609 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.609 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.609 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.609 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.609 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.609 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.610 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.610 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.610 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.610 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.610 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.610 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.610 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.611 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.611 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.611 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.611 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.611 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.611 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.611 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.612 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.612 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.612 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.612 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.612 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.612 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.612 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.613 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.613 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.613 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.613 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.613 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.613 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.613 186211 DEBUG oslo_service.service [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.614 186211 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.626 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.626 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.627 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.627 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 29 01:41:24 np0005539504 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 01:41:24 np0005539504 systemd[1]: Started libvirt QEMU daemon.
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.711 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fc2276f6160> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.714 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fc2276f6160> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.715 186211 INFO nova.virt.libvirt.driver [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.735 186211 WARNING nova.virt.libvirt.driver [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Nov 29 01:41:24 np0005539504 nova_compute[186207]: 2025-11-29 06:41:24.736 186211 DEBUG nova.virt.libvirt.volume.mount [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 29 01:41:25 np0005539504 python3.9[186725]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.619 186211 INFO nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <host>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <uuid>73921493-fa29-46fa-8f9d-6eab83a1506e</uuid>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <arch>x86_64</arch>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <microcode version='16777317'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <signature family='23' model='49' stepping='0'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='x2apic'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='tsc-deadline'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='osxsave'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='hypervisor'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='tsc_adjust'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='spec-ctrl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='stibp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='arch-capabilities'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='cmp_legacy'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='topoext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='virt-ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='lbrv'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='tsc-scale'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='vmcb-clean'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='pause-filter'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='pfthreshold'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='rdctl-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='skip-l1dfl-vmentry'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='mds-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature name='pschange-mc-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <pages unit='KiB' size='4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <pages unit='KiB' size='2048'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <pages unit='KiB' size='1048576'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <power_management>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <suspend_mem/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <suspend_disk/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <suspend_hybrid/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </power_management>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <iommu support='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <migration_features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <live/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <uri_transports>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <uri_transport>tcp</uri_transport>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <uri_transport>rdma</uri_transport>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </uri_transports>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </migration_features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <topology>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <cells num='1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <cell id='0'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:          <memory unit='KiB'>7864320</memory>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:          <pages unit='KiB' size='2048'>0</pages>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:          <distances>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:            <sibling id='0' value='10'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:          </distances>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:          <cpus num='8'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:          </cpus>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        </cell>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </cells>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </topology>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <cache>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </cache>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <secmodel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model>selinux</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <doi>0</doi>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </secmodel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <secmodel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model>dac</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <doi>0</doi>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </secmodel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </host>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <guest>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <os_type>hvm</os_type>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <arch name='i686'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <wordsize>32</wordsize>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <domain type='qemu'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <domain type='kvm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </arch>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <pae/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <nonpae/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <acpi default='on' toggle='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <apic default='on' toggle='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <cpuselection/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <deviceboot/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <externalSnapshot/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </guest>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <guest>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <os_type>hvm</os_type>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <arch name='x86_64'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <wordsize>64</wordsize>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <domain type='qemu'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <domain type='kvm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </arch>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <acpi default='on' toggle='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <apic default='on' toggle='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <cpuselection/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <deviceboot/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <externalSnapshot/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </guest>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 
Nov 29 01:41:25 np0005539504 nova_compute[186207]: </capabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.628 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.645 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 01:41:25 np0005539504 nova_compute[186207]: <domainCapabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <domain>kvm</domain>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <arch>i686</arch>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <vcpu max='240'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <iothreads supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <os supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <enum name='firmware'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <loader supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>rom</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pflash</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='readonly'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>yes</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>no</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='secure'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>no</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </loader>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </os>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>on</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>off</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='maximumMigratable'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>on</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>off</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='succor'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='custom' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-128'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-256'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-512'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='KnightsMill'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SierraForest'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='athlon'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='athlon-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='core2duo'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='core2duo-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='coreduo'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='coreduo-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='n270'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='n270-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='phenom'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='phenom-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <memoryBacking supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <enum name='sourceType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>file</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>anonymous</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>memfd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </memoryBacking>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <devices>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <disk supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='diskDevice'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>disk</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>cdrom</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>floppy</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>lun</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='bus'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ide</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>fdc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>scsi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>sata</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </disk>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <graphics supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vnc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>egl-headless</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dbus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </graphics>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <video supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='modelType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vga</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>cirrus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>none</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>bochs</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ramfb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </video>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <hostdev supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='mode'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>subsystem</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='startupPolicy'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>default</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>mandatory</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>requisite</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>optional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='subsysType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pci</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>scsi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='capsType'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='pciBackend'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </hostdev>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <rng supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>random</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>egd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>builtin</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </rng>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <filesystem supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='driverType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>path</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>handle</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtiofs</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </filesystem>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <tpm supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tpm-tis</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tpm-crb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>emulator</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>external</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendVersion'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>2.0</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </tpm>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <redirdev supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='bus'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </redirdev>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <channel supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pty</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>unix</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </channel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <crypto supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>qemu</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>builtin</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </crypto>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <interface supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>default</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>passt</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </interface>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <panic supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>isa</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>hyperv</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </panic>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <console supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>null</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pty</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dev</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>file</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pipe</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>stdio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>udp</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tcp</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>unix</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>qemu-vdagent</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dbus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </console>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </devices>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <gic supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <genid supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <backup supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <async-teardown supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <ps2 supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <sev supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <sgx supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <hyperv supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='features'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>relaxed</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vapic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>spinlocks</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vpindex</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>runtime</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>synic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>stimer</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>reset</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vendor_id</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>frequencies</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>reenlightenment</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tlbflush</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ipi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>avic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>emsr_bitmap</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>xmm_input</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <defaults>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </defaults>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </hyperv>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <launchSecurity supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='sectype'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tdx</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </launchSecurity>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: </domainCapabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.651 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 01:41:25 np0005539504 nova_compute[186207]: <domainCapabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <domain>kvm</domain>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <arch>i686</arch>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <vcpu max='4096'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <iothreads supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <os supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <enum name='firmware'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <loader supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>rom</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pflash</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='readonly'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>yes</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>no</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='secure'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>no</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </loader>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </os>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>on</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>off</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='maximumMigratable'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>on</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>off</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='succor'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='custom' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-128'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-256'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-512'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='KnightsMill'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SierraForest'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='athlon'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='athlon-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='core2duo'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='core2duo-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='coreduo'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='coreduo-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='n270'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='n270-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='phenom'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='phenom-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <memoryBacking supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <enum name='sourceType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>file</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>anonymous</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>memfd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </memoryBacking>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <devices>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <disk supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='diskDevice'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>disk</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>cdrom</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>floppy</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>lun</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='bus'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>fdc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>scsi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>sata</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </disk>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <graphics supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vnc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>egl-headless</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dbus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </graphics>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <video supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='modelType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vga</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>cirrus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>none</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>bochs</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ramfb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </video>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <hostdev supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='mode'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>subsystem</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='startupPolicy'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>default</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>mandatory</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>requisite</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>optional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='subsysType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pci</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>scsi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='capsType'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='pciBackend'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </hostdev>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <rng supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>random</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>egd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>builtin</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </rng>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <filesystem supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='driverType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>path</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>handle</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtiofs</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </filesystem>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <tpm supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tpm-tis</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tpm-crb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>emulator</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>external</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendVersion'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>2.0</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </tpm>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <redirdev supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='bus'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </redirdev>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <channel supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pty</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>unix</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </channel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <crypto supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>qemu</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>builtin</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </crypto>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <interface supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>default</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>passt</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </interface>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <panic supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>isa</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>hyperv</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </panic>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <console supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>null</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pty</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dev</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>file</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pipe</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>stdio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>udp</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tcp</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>unix</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>qemu-vdagent</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dbus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </console>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </devices>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <gic supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <genid supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <backup supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <async-teardown supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <ps2 supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <sev supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <sgx supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <hyperv supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='features'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>relaxed</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vapic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>spinlocks</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vpindex</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>runtime</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>synic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>stimer</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>reset</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vendor_id</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>frequencies</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>reenlightenment</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tlbflush</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ipi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>avic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>emsr_bitmap</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>xmm_input</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <defaults>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </defaults>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </hyperv>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <launchSecurity supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='sectype'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tdx</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </launchSecurity>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: </domainCapabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.683 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
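The `<domainCapabilities>` XML dumped above is what nova's `_get_domain_capabilities` retrieves from libvirt (via `getDomainCapabilities()` in the libvirt-python bindings) for each machine type. The `<mode name='custom'>` section pairs each `usable='no'` CPU model with a `<blockers>` list of the host-missing features. A minimal sketch of extracting that mapping with the standard library, using a trimmed, hypothetical stand-in for the full dump (element names follow libvirt's schema):

```python
# Sketch: list CPU models marked usable='no' in a libvirt
# <domainCapabilities> dump, with the features blocking each one.
# CAPS_XML is a trimmed, made-up stand-in for the logged XML above.
import xml.etree.ElementTree as ET

CAPS_XML = """
<domainCapabilities>
  <cpu>
    <mode name='custom' supported='yes'>
      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
      <blockers model='Broadwell'>
        <feature name='erms'/>
        <feature name='rtm'/>
      </blockers>
    </mode>
  </cpu>
</domainCapabilities>
"""

def unusable_models(caps_xml: str) -> dict:
    """Map each usable='no' model name to the features blocking it."""
    root = ET.fromstring(caps_xml)
    custom = root.find(".//cpu/mode[@name='custom']")
    # <blockers model='X'> holds the host-missing features for model X.
    blockers = {
        b.get("model"): [f.get("name") for f in b.findall("feature")]
        for b in custom.findall("blockers")
    }
    return {
        m.text: blockers.get(m.text, [])
        for m in custom.findall("model")
        if m.get("usable") == "no"
    }

print(unusable_models(CAPS_XML))  # → {'Broadwell': ['erms', 'rtm']}
```

Against the full dump above, the same walk would report every Broadwell/Cascadelake/Cooperlake variant as blocked (mostly on `erms`, `invpcid`, `pcid`, and the AVX-512 set), consistent with the EPYC-Rome host model reported under `host-model`.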
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.686 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 01:41:25 np0005539504 nova_compute[186207]: <domainCapabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <domain>kvm</domain>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <arch>x86_64</arch>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <vcpu max='240'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <iothreads supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <os supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <enum name='firmware'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <loader supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>rom</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pflash</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='readonly'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>yes</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>no</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='secure'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>no</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </loader>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </os>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>on</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>off</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='maximumMigratable'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>on</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>off</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='succor'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='custom' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-128'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-256'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-512'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='KnightsMill'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SierraForest'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='athlon'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='athlon-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='core2duo'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='core2duo-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='coreduo'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='coreduo-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='n270'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='n270-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='phenom'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='phenom-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <memoryBacking supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <enum name='sourceType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>file</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>anonymous</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>memfd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </memoryBacking>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <devices>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <disk supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='diskDevice'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>disk</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>cdrom</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>floppy</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>lun</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='bus'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ide</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>fdc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>scsi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>sata</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </disk>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <graphics supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vnc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>egl-headless</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dbus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </graphics>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <video supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='modelType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vga</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>cirrus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>none</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>bochs</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ramfb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </video>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <hostdev supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='mode'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>subsystem</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='startupPolicy'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>default</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>mandatory</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>requisite</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>optional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='subsysType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pci</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>scsi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='capsType'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='pciBackend'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </hostdev>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <rng supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>random</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>egd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>builtin</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </rng>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <filesystem supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='driverType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>path</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>handle</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtiofs</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </filesystem>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <tpm supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tpm-tis</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tpm-crb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>emulator</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>external</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendVersion'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>2.0</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </tpm>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <redirdev supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='bus'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </redirdev>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <channel supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pty</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>unix</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </channel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <crypto supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>qemu</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>builtin</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </crypto>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <interface supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>default</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>passt</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </interface>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <panic supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>isa</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>hyperv</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </panic>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <console supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>null</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pty</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dev</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>file</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pipe</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>stdio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>udp</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tcp</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>unix</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>qemu-vdagent</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dbus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </console>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </devices>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <gic supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <genid supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <backup supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <async-teardown supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <ps2 supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <sev supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <sgx supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <hyperv supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='features'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>relaxed</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vapic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>spinlocks</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vpindex</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>runtime</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>synic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>stimer</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>reset</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vendor_id</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>frequencies</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>reenlightenment</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tlbflush</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ipi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>avic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>emsr_bitmap</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>xmm_input</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <defaults>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </defaults>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </hyperv>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <launchSecurity supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='sectype'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tdx</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </launchSecurity>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: </domainCapabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.760 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 01:41:25 np0005539504 nova_compute[186207]: <domainCapabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <domain>kvm</domain>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <arch>x86_64</arch>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <vcpu max='4096'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <iothreads supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <os supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <enum name='firmware'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>efi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <loader supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>rom</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pflash</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='readonly'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>yes</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>no</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='secure'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>yes</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>no</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </loader>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </os>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>on</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>off</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='maximumMigratable'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>on</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>off</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='succor'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <mode name='custom' supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Denverton-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='EPYC-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-128'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-256'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx10-512'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Haswell-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='KnightsMill'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SierraForest'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='athlon'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='athlon-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='core2duo'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='core2duo-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='coreduo'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='coreduo-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='n270'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='n270-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='phenom'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <blockers model='phenom-v1'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </blockers>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </mode>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <memoryBacking supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <enum name='sourceType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>file</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>anonymous</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <value>memfd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </memoryBacking>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <devices>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <disk supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='diskDevice'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>disk</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>cdrom</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>floppy</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>lun</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='bus'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>fdc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>scsi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>sata</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </disk>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <graphics supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vnc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>egl-headless</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dbus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </graphics>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <video supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='modelType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vga</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>cirrus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>none</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>bochs</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ramfb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </video>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <hostdev supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='mode'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>subsystem</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='startupPolicy'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>default</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>mandatory</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>requisite</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>optional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='subsysType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pci</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>scsi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='capsType'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='pciBackend'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </hostdev>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <rng supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>random</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>egd</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>builtin</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </rng>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <filesystem supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='driverType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>path</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>handle</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>virtiofs</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </filesystem>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <tpm supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tpm-tis</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tpm-crb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>emulator</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>external</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendVersion'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>2.0</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </tpm>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <redirdev supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='bus'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>usb</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </redirdev>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <channel supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pty</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>unix</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </channel>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <crypto supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>qemu</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>builtin</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </crypto>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <interface supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='backendType'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>default</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>passt</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </interface>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <panic supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='model'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>isa</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>hyperv</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </panic>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <console supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='type'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>null</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vc</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pty</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dev</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>file</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>pipe</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>stdio</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>udp</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tcp</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>unix</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>qemu-vdagent</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>dbus</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </console>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </devices>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <gic supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <genid supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <backup supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <async-teardown supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <ps2 supported='yes'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <sev supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <sgx supported='no'/>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <hyperv supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='features'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>relaxed</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vapic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>spinlocks</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vpindex</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>runtime</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>synic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>stimer</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>reset</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>vendor_id</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>frequencies</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>reenlightenment</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tlbflush</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>ipi</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>avic</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>emsr_bitmap</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>xmm_input</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <defaults>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </defaults>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </hyperv>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    <launchSecurity supported='yes'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      <enum name='sectype'>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:        <value>tdx</value>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:      </enum>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:    </launchSecurity>
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  </features>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: </domainCapabilities>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.831 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.832 186211 INFO nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Secure Boot support detected#033[00m
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.836 186211 INFO nova.virt.libvirt.driver [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.846 186211 DEBUG nova.virt.libvirt.driver [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 01:41:25 np0005539504 nova_compute[186207]:  <model>Nehalem</model>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: </cpu>
Nov 29 01:41:25 np0005539504 nova_compute[186207]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.849 186211 DEBUG nova.virt.libvirt.driver [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.875 186211 INFO nova.virt.node [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Determined node identity 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from /var/lib/nova/compute_id#033[00m
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.888 186211 WARNING nova.compute.manager [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Compute nodes ['1c526389-06f6-4ffd-8e90-a84c6c39f0bc'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Nov 29 01:41:25 np0005539504 nova_compute[186207]: 2025-11-29 06:41:25.914 186211 INFO nova.compute.manager [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.027 186211 WARNING nova.compute.manager [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.027 186211 DEBUG oslo_concurrency.lockutils [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.028 186211 DEBUG oslo_concurrency.lockutils [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.028 186211 DEBUG oslo_concurrency.lockutils [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.028 186211 DEBUG nova.compute.resource_tracker [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:41:26 np0005539504 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 01:41:26 np0005539504 systemd[1]: Started libvirt nodedev daemon.
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.331 186211 WARNING nova.virt.libvirt.driver [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.333 186211 DEBUG nova.compute.resource_tracker [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6208MB free_disk=73.54788970947266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.333 186211 DEBUG oslo_concurrency.lockutils [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.334 186211 DEBUG oslo_concurrency.lockutils [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.354 186211 WARNING nova.compute.resource_tracker [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] No compute node record for compute-1.ctlplane.example.com:1c526389-06f6-4ffd-8e90-a84c6c39f0bc: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 1c526389-06f6-4ffd-8e90-a84c6c39f0bc could not be found.#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.376 186211 INFO nova.compute.resource_tracker [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.424 186211 DEBUG nova.compute.resource_tracker [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.424 186211 DEBUG nova.compute.resource_tracker [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:41:26 np0005539504 python3.9[186912]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 01:41:26 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:41:26 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:41:26 np0005539504 nova_compute[186207]: 2025-11-29 06:41:26.979 186211 INFO nova.scheduler.client.report [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] [req-b53e28d2-fbf1-431f-a103-6869d4f7b63b] Created resource provider record via placement API for resource provider with UUID 1c526389-06f6-4ffd-8e90-a84c6c39f0bc and name compute-1.ctlplane.example.com.#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.016 186211 DEBUG nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 01:41:27 np0005539504 nova_compute[186207]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.016 186211 INFO nova.virt.libvirt.host [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.017 186211 DEBUG nova.compute.provider_tree [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.017 186211 DEBUG nova.virt.libvirt.driver [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.019 186211 DEBUG nova.virt.libvirt.driver [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 01:41:27 np0005539504 nova_compute[186207]:  <arch>x86_64</arch>
Nov 29 01:41:27 np0005539504 nova_compute[186207]:  <model>Nehalem</model>
Nov 29 01:41:27 np0005539504 nova_compute[186207]:  <vendor>AMD</vendor>
Nov 29 01:41:27 np0005539504 nova_compute[186207]:  <topology sockets="8" cores="1" threads="1"/>
Nov 29 01:41:27 np0005539504 nova_compute[186207]: </cpu>
Nov 29 01:41:27 np0005539504 nova_compute[186207]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.106 186211 DEBUG nova.scheduler.client.report [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Updated inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.107 186211 DEBUG nova.compute.provider_tree [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Updating resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.107 186211 DEBUG nova.compute.provider_tree [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.244 186211 DEBUG nova.compute.provider_tree [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Updating resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.261 186211 DEBUG nova.compute.resource_tracker [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.261 186211 DEBUG oslo_concurrency.lockutils [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.262 186211 DEBUG nova.service [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.327 186211 DEBUG nova.service [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.327 186211 DEBUG nova.servicegroup.drivers.db [None req-9464d3fd-f5cb-4106-93d7-35386cbfd2be - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 29 01:41:27 np0005539504 python3.9[187088]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:41:27 np0005539504 systemd[1]: Stopping nova_compute container...
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.896 186211 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.898 186211 DEBUG oslo_concurrency.lockutils [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.898 186211 DEBUG oslo_concurrency.lockutils [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:41:27 np0005539504 nova_compute[186207]: 2025-11-29 06:41:27.898 186211 DEBUG oslo_concurrency.lockutils [None req-17557099-fba7-49e7-9e81-d0a36ad59a68 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:41:28 np0005539504 virtqemud[186569]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 29 01:41:28 np0005539504 virtqemud[186569]: hostname: compute-1
Nov 29 01:41:28 np0005539504 virtqemud[186569]: End of file while reading data: Input/output error
Nov 29 01:41:28 np0005539504 systemd[1]: libpod-0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1.scope: Deactivated successfully.
Nov 29 01:41:28 np0005539504 podman[187092]: 2025-11-29 06:41:28.386874369 +0000 UTC m=+0.753310680 container died 0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 01:41:28 np0005539504 systemd[1]: libpod-0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1.scope: Consumed 3.525s CPU time.
Nov 29 01:41:28 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1-userdata-shm.mount: Deactivated successfully.
Nov 29 01:41:28 np0005539504 systemd[1]: var-lib-containers-storage-overlay-6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6-merged.mount: Deactivated successfully.
Nov 29 01:41:28 np0005539504 podman[187092]: 2025-11-29 06:41:28.457107996 +0000 UTC m=+0.823544287 container cleanup 0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:41:28 np0005539504 podman[187092]: nova_compute
Nov 29 01:41:28 np0005539504 podman[187123]: nova_compute
Nov 29 01:41:28 np0005539504 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 29 01:41:28 np0005539504 systemd[1]: Stopped nova_compute container.
Nov 29 01:41:28 np0005539504 systemd[1]: Starting nova_compute container...
Nov 29 01:41:28 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:41:28 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:28 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:28 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:28 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:28 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ecd587472bcf3abc08a1b98e386ab3e2a722b7ec399d3a5b4d46ec4e9f2b8a6/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:28 np0005539504 podman[187136]: 2025-11-29 06:41:28.629730483 +0000 UTC m=+0.078992351 container init 0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:41:28 np0005539504 podman[187136]: 2025-11-29 06:41:28.636805201 +0000 UTC m=+0.086067049 container start 0d1273dd61402bb215437cb577c50e3c6ba4b3546eee4e2712b353f4a1e817a1 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:41:28 np0005539504 podman[187136]: nova_compute
Nov 29 01:41:28 np0005539504 nova_compute[187152]: + sudo -E kolla_set_configs
Nov 29 01:41:28 np0005539504 systemd[1]: Started nova_compute container.
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Validating config file
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Copying service configuration files
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Deleting /etc/ceph
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Creating directory /etc/ceph
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Writing out command to execute
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:28 np0005539504 nova_compute[187152]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:41:28 np0005539504 nova_compute[187152]: ++ cat /run_command
Nov 29 01:41:28 np0005539504 nova_compute[187152]: + CMD=nova-compute
Nov 29 01:41:28 np0005539504 nova_compute[187152]: + ARGS=
Nov 29 01:41:28 np0005539504 nova_compute[187152]: + sudo kolla_copy_cacerts
Nov 29 01:41:28 np0005539504 nova_compute[187152]: + [[ ! -n '' ]]
Nov 29 01:41:28 np0005539504 nova_compute[187152]: + . kolla_extend_start
Nov 29 01:41:28 np0005539504 nova_compute[187152]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 01:41:28 np0005539504 nova_compute[187152]: Running command: 'nova-compute'
Nov 29 01:41:28 np0005539504 nova_compute[187152]: + umask 0022
Nov 29 01:41:28 np0005539504 nova_compute[187152]: + exec nova-compute
Nov 29 01:41:29 np0005539504 podman[187189]: 2025-11-29 06:41:29.712440995 +0000 UTC m=+0.055564288 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:41:30 np0005539504 nova_compute[187152]: 2025-11-29 06:41:30.762 187156 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:30 np0005539504 nova_compute[187152]: 2025-11-29 06:41:30.763 187156 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:30 np0005539504 nova_compute[187152]: 2025-11-29 06:41:30.763 187156 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:30 np0005539504 nova_compute[187152]: 2025-11-29 06:41:30.763 187156 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 29 01:41:30 np0005539504 nova_compute[187152]: 2025-11-29 06:41:30.902 187156 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:41:30 np0005539504 nova_compute[187152]: 2025-11-29 06:41:30.926 187156 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:41:30 np0005539504 nova_compute[187152]: 2025-11-29 06:41:30.926 187156 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.350 187156 INFO nova.virt.driver [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.487 187156 INFO nova.compute.provider_config [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.678 187156 DEBUG oslo_concurrency.lockutils [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.679 187156 DEBUG oslo_concurrency.lockutils [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.679 187156 DEBUG oslo_concurrency.lockutils [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.679 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.679 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.679 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.680 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.680 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.680 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.680 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.680 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.680 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.680 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.681 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.681 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.681 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.681 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.681 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.681 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.682 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.682 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.682 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.682 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.682 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.682 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.682 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.683 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.683 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.683 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.683 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.683 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.683 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.683 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.684 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.684 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.684 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.684 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.684 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.684 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.684 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.685 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.685 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.685 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.685 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.685 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.686 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.686 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.686 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.686 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.686 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.686 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.686 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.687 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.687 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.687 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.687 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.687 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.687 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.687 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.688 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.688 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.688 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.688 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.688 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.688 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.688 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.689 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.689 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.689 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.689 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.689 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.689 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.689 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.690 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.690 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.690 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.690 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.690 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.690 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.690 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.691 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.691 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.691 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.691 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.691 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.691 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.691 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.692 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.692 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.692 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.692 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.692 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.692 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.692 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.693 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.693 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.693 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.693 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.693 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.693 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.693 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.694 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.694 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.694 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.694 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.694 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.694 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.694 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.695 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.695 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.695 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.695 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.695 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.695 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.696 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.696 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.696 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.696 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.696 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.696 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.696 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.697 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.697 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.697 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.697 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.697 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.697 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.697 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.698 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.698 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.698 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.698 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.698 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.698 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.698 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.698 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.699 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.699 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.699 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.699 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.699 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.699 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.699 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.700 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.700 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.700 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.700 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.700 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.700 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.701 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.701 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.701 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.701 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.701 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.701 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.702 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.702 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.702 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.702 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.702 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.702 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.703 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.703 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.703 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.703 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.703 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.703 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.703 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.704 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.704 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.704 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.704 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.704 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.704 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.705 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.705 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.705 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.705 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.705 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.705 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.706 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.706 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.706 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.706 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.706 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.706 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.707 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.707 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.707 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.707 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.707 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.707 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.707 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.708 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.708 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.708 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.708 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.708 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.708 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.708 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.709 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.709 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.709 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.709 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.709 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.709 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.710 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.710 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.710 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.710 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.710 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.710 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.710 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.711 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.711 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.711 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.711 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.711 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.711 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.711 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.712 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.712 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.712 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.712 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.712 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.712 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.712 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.713 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.713 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.713 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.713 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.713 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.713 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.713 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.714 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.714 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.714 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.714 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.714 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.714 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.714 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.715 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.715 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.715 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.715 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.715 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.715 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.716 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.716 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.716 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.716 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.716 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.716 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.717 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.717 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.717 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.717 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.717 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.717 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.717 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.718 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.718 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.718 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.718 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.718 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.718 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.719 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.719 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.719 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.719 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.719 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.719 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.719 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.720 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.720 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.720 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.720 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.720 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.720 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.720 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.721 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.721 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.721 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.721 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.721 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.721 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.722 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.722 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.722 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.722 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.722 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.722 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.722 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.723 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.723 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.723 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.723 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.723 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.723 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.723 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.723 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.724 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.724 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.724 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.724 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.724 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.724 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.725 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.725 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.725 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.725 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.725 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.725 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.725 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.725 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.726 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.726 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.726 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.726 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.726 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.726 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.726 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.727 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.727 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.727 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.727 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.727 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.727 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.728 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.728 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.728 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.728 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.728 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.729 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.729 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.729 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.729 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.729 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.729 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.729 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.730 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.730 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.730 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.730 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.730 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.730 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.731 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.731 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.731 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.731 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.731 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.731 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.731 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.732 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.732 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.732 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.732 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.732 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.733 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.733 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.733 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.733 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.733 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.733 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.733 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.734 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.734 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.734 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.734 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.734 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.734 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.735 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.735 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.735 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.735 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.735 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.735 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.736 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.736 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.736 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.736 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.736 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.736 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.736 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.737 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.737 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.737 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.737 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.737 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.738 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.738 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.738 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.738 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.738 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.738 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.739 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.739 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.739 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.739 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.739 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.740 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.740 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.740 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.740 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.740 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.740 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.740 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.741 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.741 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.741 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.741 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.741 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.741 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.741 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.742 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.742 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.742 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.742 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.742 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.742 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.742 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.743 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.743 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.743 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.743 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.743 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.743 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.743 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.743 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.744 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.744 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.744 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.744 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.744 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.744 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.744 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.745 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.745 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.745 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.745 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.745 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.745 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.746 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.746 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.746 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.746 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.746 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.746 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.746 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.747 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.747 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.747 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.747 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.747 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.748 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.748 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.748 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.748 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.748 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.748 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.748 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.749 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.749 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.749 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.749 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.749 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.749 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.750 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.750 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.750 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.750 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.750 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.750 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.750 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.751 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.751 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.751 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.751 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.751 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.751 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.752 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.752 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.752 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.752 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.752 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.752 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.752 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.752 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.753 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.753 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.753 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.753 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.753 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.753 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.754 187156 WARNING oslo_config.cfg [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 01:41:31 np0005539504 nova_compute[187152]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 01:41:31 np0005539504 nova_compute[187152]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 01:41:31 np0005539504 nova_compute[187152]: and ``live_migration_inbound_addr`` respectively.
Nov 29 01:41:31 np0005539504 nova_compute[187152]: ).  Its value may be silently ignored in the future.#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.754 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.754 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.754 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.754 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.754 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.755 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.755 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.755 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.755 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.755 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.755 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.756 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.756 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.756 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.756 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.756 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.756 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.756 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.757 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.757 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.757 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.757 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.757 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.757 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.757 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.758 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.758 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.758 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.758 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.758 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.758 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.759 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.759 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.759 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.759 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.759 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.759 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.759 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.760 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.760 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.760 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.760 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.760 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.760 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.761 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.761 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.761 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.761 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.761 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.761 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.761 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.762 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.762 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.762 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.762 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.762 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.763 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.763 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.763 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.763 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.763 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.763 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.763 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.764 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.764 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.764 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.764 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.764 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.764 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.764 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.765 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.765 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.765 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.765 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.765 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.765 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.765 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.766 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.766 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.766 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.766 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.766 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.766 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.767 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.767 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.767 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.767 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.767 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.767 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.767 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.768 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.768 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.768 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.768 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.768 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.768 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.768 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.769 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.769 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.769 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.769 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.769 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.769 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.769 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.770 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.770 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.770 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.770 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.770 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.770 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.771 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.771 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.771 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.771 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.771 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.771 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.771 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.772 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.772 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.772 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.772 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.772 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.772 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.772 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.773 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.773 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.773 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.773 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.773 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.773 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.773 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.774 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.774 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.774 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.774 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.774 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.774 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.774 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.775 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.775 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.775 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.775 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.775 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.775 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.776 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.776 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.776 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.776 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.776 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.776 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.776 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.777 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.777 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.777 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.777 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.777 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.777 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.777 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.778 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.778 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.778 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.778 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.778 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.778 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.778 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.779 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.779 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.779 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.779 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.779 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.779 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.779 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.780 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.780 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.780 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.780 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.780 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.780 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.780 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.781 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.781 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.781 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.781 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.781 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.781 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.782 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.782 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.782 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.782 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.782 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.782 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.782 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.782 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.783 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.783 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.783 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.783 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.783 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.783 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.784 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.784 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.784 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.784 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.784 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.784 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.784 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.784 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.785 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.785 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.785 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.785 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.785 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.785 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.785 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.786 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.786 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.786 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.786 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.786 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.786 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.786 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.787 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.787 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.787 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.787 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.787 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.787 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.787 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.788 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.788 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.788 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.788 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.788 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.788 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.788 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.789 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.789 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.789 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.789 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.789 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.789 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.789 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.790 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.790 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.790 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.790 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.790 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.790 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.791 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.791 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.791 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.791 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.792 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.792 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.792 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.792 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.792 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.792 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.792 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.793 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.793 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.793 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.793 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.793 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.793 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.794 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.794 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.794 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.794 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.794 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.794 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.794 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.795 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.795 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.795 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.795 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.795 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.795 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.795 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.796 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.796 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.796 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.796 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.796 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.796 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.796 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.797 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.797 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.797 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.797 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.797 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.797 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.797 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.798 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.798 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.798 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.798 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.798 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.798 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.799 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.799 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.799 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.799 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.799 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.799 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.799 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.800 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.800 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.800 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.800 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.800 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.800 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.800 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.801 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.801 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.801 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.801 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.801 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.801 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.801 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.802 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.802 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.802 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.802 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.802 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.802 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.803 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.803 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.803 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.803 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.803 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.803 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.803 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.804 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.804 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.804 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.804 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.804 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.804 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.804 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.805 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.805 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.805 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.805 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.805 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.805 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.805 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.806 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.806 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.806 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.806 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.806 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.806 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.806 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.807 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.807 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.807 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.807 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.807 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.807 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.807 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.807 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.808 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.808 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.808 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.808 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.808 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.808 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.808 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.809 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.809 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.809 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.809 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.809 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.809 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.810 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.810 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.810 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.810 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.810 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.810 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.811 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.811 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.811 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.811 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.811 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.811 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.812 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.812 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.812 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.812 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.812 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.812 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.813 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.813 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.813 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.813 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.813 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.813 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.813 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.814 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.814 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.814 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.814 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.814 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.814 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.814 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.815 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.815 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.815 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.815 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.815 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.815 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.816 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.816 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.816 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.816 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.816 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.816 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.816 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.817 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.817 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.817 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.817 187156 DEBUG oslo_service.service [None req-14ad68c3-eeda-41d7-b3bc-d072a77cc9e1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:41:31 np0005539504 nova_compute[187152]: 2025-11-29 06:41:31.818 187156 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 29 01:41:32 np0005539504 python3.9[187339]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.050 187156 INFO nova.virt.node [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Determined node identity 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from /var/lib/nova/compute_id#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.050 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.051 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.051 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.052 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.068 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fc7e728e3a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.072 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fc7e728e3a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.073 187156 INFO nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.085 187156 DEBUG nova.virt.libvirt.volume.mount [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.087 187156 INFO nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <host>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <uuid>73921493-fa29-46fa-8f9d-6eab83a1506e</uuid>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <arch>x86_64</arch>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model>EPYC-Rome-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <vendor>AMD</vendor>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <microcode version='16777317'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <signature family='23' model='49' stepping='0'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='x2apic'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='tsc-deadline'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='osxsave'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='hypervisor'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='tsc_adjust'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='spec-ctrl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='stibp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='arch-capabilities'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='cmp_legacy'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='topoext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='virt-ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='lbrv'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='tsc-scale'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='vmcb-clean'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='pause-filter'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='pfthreshold'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='svme-addr-chk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='rdctl-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='skip-l1dfl-vmentry'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='mds-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature name='pschange-mc-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <pages unit='KiB' size='4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <pages unit='KiB' size='2048'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <pages unit='KiB' size='1048576'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <power_management>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <suspend_mem/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <suspend_disk/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <suspend_hybrid/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </power_management>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <iommu support='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <migration_features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <live/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <uri_transports>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <uri_transport>tcp</uri_transport>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <uri_transport>rdma</uri_transport>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </uri_transports>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </migration_features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <topology>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <cells num='1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <cell id='0'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:          <memory unit='KiB'>7864320</memory>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:          <pages unit='KiB' size='2048'>0</pages>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:          <distances>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:            <sibling id='0' value='10'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:          </distances>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:          <cpus num='8'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:          </cpus>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        </cell>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </cells>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </topology>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <cache>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </cache>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <secmodel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model>selinux</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <doi>0</doi>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </secmodel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <secmodel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model>dac</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <doi>0</doi>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </secmodel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </host>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <guest>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <os_type>hvm</os_type>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <arch name='i686'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <wordsize>32</wordsize>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <domain type='qemu'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <domain type='kvm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </arch>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <pae/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <nonpae/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <acpi default='on' toggle='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <apic default='on' toggle='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <cpuselection/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <deviceboot/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <externalSnapshot/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </guest>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <guest>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <os_type>hvm</os_type>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <arch name='x86_64'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <wordsize>64</wordsize>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <domain type='qemu'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <domain type='kvm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </arch>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <acpi default='on' toggle='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <apic default='on' toggle='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <cpuselection/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <deviceboot/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <externalSnapshot/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </guest>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 
Nov 29 01:41:32 np0005539504 nova_compute[187152]: </capabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.096 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.104 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 01:41:32 np0005539504 nova_compute[187152]: <domainCapabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <domain>kvm</domain>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <arch>i686</arch>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <vcpu max='4096'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <iothreads supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <os supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <enum name='firmware'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <loader supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>rom</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pflash</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='readonly'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>yes</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>no</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='secure'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>no</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </loader>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>on</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>off</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='maximumMigratable'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>on</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>off</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <vendor>AMD</vendor>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='succor'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='custom' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='auto-ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='auto-ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-128'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-256'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-512'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='KnightsMill'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512er'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512pf'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512er'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512pf'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tbm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tbm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SierraForest'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cmpccxadd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cmpccxadd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='athlon'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='athlon-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='core2duo'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='core2duo-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='coreduo'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='coreduo-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='n270'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='n270-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='phenom'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='phenom-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <memoryBacking supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <enum name='sourceType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>file</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>anonymous</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>memfd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </memoryBacking>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <disk supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='diskDevice'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>disk</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>cdrom</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>floppy</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>lun</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='bus'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>fdc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>scsi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>sata</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-non-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <graphics supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vnc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>egl-headless</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dbus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <video supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='modelType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vga</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>cirrus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>none</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>bochs</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ramfb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <hostdev supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='mode'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>subsystem</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='startupPolicy'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>default</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>mandatory</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>requisite</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>optional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='subsysType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pci</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>scsi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='capsType'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='pciBackend'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </hostdev>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <rng supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-non-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>random</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>egd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>builtin</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <filesystem supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='driverType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>path</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>handle</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtiofs</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </filesystem>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <tpm supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tpm-tis</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tpm-crb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>emulator</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>external</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendVersion'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>2.0</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </tpm>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <redirdev supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='bus'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </redirdev>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <channel supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pty</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>unix</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </channel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <crypto supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>qemu</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>builtin</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </crypto>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <interface supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>default</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>passt</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <panic supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>isa</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>hyperv</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </panic>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <console supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>null</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pty</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dev</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>file</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pipe</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>stdio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>udp</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tcp</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>unix</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>qemu-vdagent</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dbus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </console>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <gic supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <genid supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <backup supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <async-teardown supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <ps2 supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <sev supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <sgx supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <hyperv supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='features'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>relaxed</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vapic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>spinlocks</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vpindex</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>runtime</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>synic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>stimer</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>reset</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vendor_id</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>frequencies</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>reenlightenment</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tlbflush</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ipi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>avic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>emsr_bitmap</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>xmm_input</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <defaults>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </defaults>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </hyperv>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <launchSecurity supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='sectype'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tdx</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </launchSecurity>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: </domainCapabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.111 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 01:41:32 np0005539504 nova_compute[187152]: <domainCapabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <domain>kvm</domain>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <arch>i686</arch>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <vcpu max='240'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <iothreads supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <os supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <enum name='firmware'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <loader supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>rom</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pflash</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='readonly'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>yes</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>no</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='secure'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>no</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </loader>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>on</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>off</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='maximumMigratable'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>on</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>off</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <vendor>AMD</vendor>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='succor'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='custom' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='auto-ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='auto-ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-128'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-256'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-512'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='KnightsMill'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512er'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512pf'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512er'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512pf'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tbm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tbm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SierraForest'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cmpccxadd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cmpccxadd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='athlon'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='athlon-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='core2duo'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='core2duo-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='coreduo'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='coreduo-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='n270'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='n270-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='phenom'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='phenom-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <memoryBacking supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <enum name='sourceType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>file</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>anonymous</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>memfd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </memoryBacking>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <disk supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='diskDevice'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>disk</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>cdrom</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>floppy</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>lun</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='bus'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ide</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>fdc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>scsi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>sata</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-non-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <graphics supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vnc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>egl-headless</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dbus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <video supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='modelType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vga</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>cirrus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>none</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>bochs</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ramfb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <hostdev supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='mode'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>subsystem</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='startupPolicy'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>default</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>mandatory</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>requisite</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>optional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='subsysType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pci</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>scsi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='capsType'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='pciBackend'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </hostdev>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <rng supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-non-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>random</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>egd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>builtin</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <filesystem supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='driverType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>path</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>handle</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtiofs</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </filesystem>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <tpm supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tpm-tis</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tpm-crb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>emulator</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>external</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendVersion'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>2.0</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </tpm>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <redirdev supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='bus'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </redirdev>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <channel supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pty</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>unix</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </channel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <crypto supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>qemu</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>builtin</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </crypto>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <interface supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>default</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>passt</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <panic supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>isa</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>hyperv</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </panic>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <console supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>null</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pty</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dev</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>file</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pipe</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>stdio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>udp</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tcp</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>unix</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>qemu-vdagent</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dbus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </console>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <gic supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <genid supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <backup supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <async-teardown supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <ps2 supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <sev supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <sgx supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <hyperv supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='features'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>relaxed</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vapic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>spinlocks</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vpindex</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>runtime</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>synic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>stimer</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>reset</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vendor_id</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>frequencies</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>reenlightenment</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tlbflush</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ipi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>avic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>emsr_bitmap</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>xmm_input</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <defaults>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </defaults>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </hyperv>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <launchSecurity supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='sectype'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tdx</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </launchSecurity>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: </domainCapabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.162 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.166 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 01:41:32 np0005539504 nova_compute[187152]: <domainCapabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <domain>kvm</domain>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <arch>x86_64</arch>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <vcpu max='4096'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <iothreads supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <os supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <enum name='firmware'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>efi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <loader supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>rom</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pflash</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='readonly'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>yes</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>no</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='secure'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>yes</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>no</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </loader>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>on</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>off</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='maximumMigratable'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>on</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>off</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <vendor>AMD</vendor>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='succor'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='custom' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='auto-ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='auto-ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 systemd[1]: Started libpod-conmon-fa7d96ac31f925d5aa2456f311d13906127c23116664da29943af114b13a2106.scope.
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-128'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-256'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-512'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f48e69025457eb338709f4cff51b676fe6e5881accb5e439ed807e3ca455ec9/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:32 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f48e69025457eb338709f4cff51b676fe6e5881accb5e439ed807e3ca455ec9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:32 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f48e69025457eb338709f4cff51b676fe6e5881accb5e439ed807e3ca455ec9/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='KnightsMill'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512er'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512pf'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512er'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512pf'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tbm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tbm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SierraForest'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cmpccxadd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cmpccxadd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='athlon'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='athlon-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='core2duo'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='core2duo-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='coreduo'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='coreduo-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='n270'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='n270-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='phenom'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='phenom-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <memoryBacking supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <enum name='sourceType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>file</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>anonymous</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>memfd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </memoryBacking>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <disk supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='diskDevice'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>disk</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>cdrom</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>floppy</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>lun</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='bus'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>fdc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>scsi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>sata</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-non-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <graphics supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vnc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>egl-headless</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dbus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <video supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='modelType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vga</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>cirrus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>none</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>bochs</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ramfb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <hostdev supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='mode'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>subsystem</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='startupPolicy'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>default</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>mandatory</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>requisite</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>optional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='subsysType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pci</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>scsi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='capsType'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='pciBackend'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </hostdev>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <rng supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-non-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>random</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>egd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>builtin</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <filesystem supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='driverType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>path</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>handle</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtiofs</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </filesystem>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <tpm supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tpm-tis</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tpm-crb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>emulator</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>external</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendVersion'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>2.0</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </tpm>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <redirdev supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='bus'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </redirdev>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <channel supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pty</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>unix</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </channel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <crypto supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>qemu</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>builtin</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </crypto>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <interface supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>default</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>passt</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <panic supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>isa</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>hyperv</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </panic>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <console supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>null</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pty</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dev</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>file</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pipe</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>stdio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>udp</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tcp</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>unix</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>qemu-vdagent</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dbus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </console>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <gic supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <genid supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <backup supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <async-teardown supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <ps2 supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <sev supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <sgx supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <hyperv supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='features'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>relaxed</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vapic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>spinlocks</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vpindex</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>runtime</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>synic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>stimer</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>reset</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vendor_id</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>frequencies</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>reenlightenment</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tlbflush</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ipi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>avic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>emsr_bitmap</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>xmm_input</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <defaults>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </defaults>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </hyperv>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <launchSecurity supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='sectype'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tdx</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </launchSecurity>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: </domainCapabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.239 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 01:41:32 np0005539504 nova_compute[187152]: <domainCapabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <domain>kvm</domain>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <arch>x86_64</arch>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <vcpu max='240'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <iothreads supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <os supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <enum name='firmware'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <loader supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>rom</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pflash</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='readonly'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>yes</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>no</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='secure'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>no</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </loader>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>on</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>off</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='maximumMigratable'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>on</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>off</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <vendor>AMD</vendor>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='succor'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <mode name='custom' supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Denverton-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='auto-ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='auto-ibrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amd-psfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='stibp-always-on'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='EPYC-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 podman[187382]: 2025-11-29 06:41:32.335196662 +0000 UTC m=+0.162920460 container init fa7d96ac31f925d5aa2456f311d13906127c23116664da29943af114b13a2106 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-128'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-256'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx10-512'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='prefetchiti'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Haswell-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 podman[187382]: 2025-11-29 06:41:32.343778419 +0000 UTC m=+0.171502187 container start fa7d96ac31f925d5aa2456f311d13906127c23116664da29943af114b13a2106 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='KnightsMill'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512er'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512pf'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512er'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512pf'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tbm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fma4'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tbm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xop'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 python3.9[187339]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='amx-tile'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-bf16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-fp16'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bitalg'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrc'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fzrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='la57'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='taa-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xfd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SierraForest'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cmpccxadd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ifma'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cmpccxadd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fbsdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='fsrs'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ibrs-all'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mcdt-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pbrsb-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='psdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='serialize'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vaes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='hle'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='rtm'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512bw'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512cd'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512dq'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512f'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='avx512vl'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='invpcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pcid'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='pku'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='mpx'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='core-capability'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='split-lock-detect'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='cldemote'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='erms'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='gfni'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdir64b'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='movdiri'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='xsaves'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='athlon'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='athlon-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='core2duo'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='core2duo-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='coreduo'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='coreduo-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='n270'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='n270-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='ss'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='phenom'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <blockers model='phenom-v1'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnow'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <feature name='3dnowext'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </blockers>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </mode>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <memoryBacking supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <enum name='sourceType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>file</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>anonymous</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <value>memfd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </memoryBacking>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <disk supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='diskDevice'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>disk</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>cdrom</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>floppy</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>lun</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='bus'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ide</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>fdc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>scsi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>sata</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-non-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <graphics supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vnc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>egl-headless</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dbus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <video supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='modelType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vga</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>cirrus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>none</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>bochs</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ramfb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <hostdev supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='mode'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>subsystem</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='startupPolicy'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>default</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>mandatory</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>requisite</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>optional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='subsysType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pci</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>scsi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='capsType'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='pciBackend'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </hostdev>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <rng supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtio-non-transitional</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>random</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>egd</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>builtin</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <filesystem supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='driverType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>path</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>handle</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>virtiofs</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </filesystem>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <tpm supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tpm-tis</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tpm-crb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>emulator</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>external</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendVersion'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>2.0</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </tpm>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <redirdev supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='bus'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>usb</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </redirdev>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <channel supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pty</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>unix</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </channel>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <crypto supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>qemu</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendModel'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>builtin</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </crypto>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <interface supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='backendType'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>default</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>passt</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <panic supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='model'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>isa</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>hyperv</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </panic>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <console supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='type'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>null</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vc</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pty</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dev</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>file</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>pipe</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>stdio</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>udp</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tcp</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>unix</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>qemu-vdagent</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>dbus</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </console>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <gic supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <genid supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <backup supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <async-teardown supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <ps2 supported='yes'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <sev supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <sgx supported='no'/>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <hyperv supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='features'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>relaxed</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vapic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>spinlocks</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vpindex</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>runtime</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>synic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>stimer</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>reset</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>vendor_id</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>frequencies</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>reenlightenment</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tlbflush</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>ipi</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>avic</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>emsr_bitmap</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>xmm_input</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <defaults>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </defaults>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </hyperv>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    <launchSecurity supported='yes'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      <enum name='sectype'>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:        <value>tdx</value>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:      </enum>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:    </launchSecurity>
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: </domainCapabilities>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.321 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.322 187156 INFO nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Secure Boot support detected#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.326 187156 INFO nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.326 187156 INFO nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.338 187156 DEBUG nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 01:41:32 np0005539504 nova_compute[187152]:  <model>Nehalem</model>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: </cpu>
Nov 29 01:41:32 np0005539504 nova_compute[187152]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.342 187156 DEBUG nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.369 187156 INFO nova.virt.node [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Determined node identity 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from /var/lib/nova/compute_id#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.405 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Verified node 1c526389-06f6-4ffd-8e90-a84c6c39f0bc matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Applying nova statedir ownership
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 29 01:41:32 np0005539504 nova_compute_init[187404]: INFO:nova_statedir:Nova statedir ownership complete
Nov 29 01:41:32 np0005539504 systemd[1]: libpod-fa7d96ac31f925d5aa2456f311d13906127c23116664da29943af114b13a2106.scope: Deactivated successfully.
Nov 29 01:41:32 np0005539504 podman[187405]: 2025-11-29 06:41:32.420867208 +0000 UTC m=+0.043179288 container died fa7d96ac31f925d5aa2456f311d13906127c23116664da29943af114b13a2106 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:41:32 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fa7d96ac31f925d5aa2456f311d13906127c23116664da29943af114b13a2106-userdata-shm.mount: Deactivated successfully.
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.488 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 29 01:41:32 np0005539504 systemd[1]: var-lib-containers-storage-overlay-4f48e69025457eb338709f4cff51b676fe6e5881accb5e439ed807e3ca455ec9-merged.mount: Deactivated successfully.
Nov 29 01:41:32 np0005539504 podman[187414]: 2025-11-29 06:41:32.500295889 +0000 UTC m=+0.063772205 container cleanup fa7d96ac31f925d5aa2456f311d13906127c23116664da29943af114b13a2106 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 01:41:32 np0005539504 systemd[1]: libpod-conmon-fa7d96ac31f925d5aa2456f311d13906127c23116664da29943af114b13a2106.scope: Deactivated successfully.
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.862 187156 DEBUG oslo_concurrency.lockutils [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.864 187156 DEBUG oslo_concurrency.lockutils [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.864 187156 DEBUG oslo_concurrency.lockutils [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:41:32 np0005539504 nova_compute[187152]: 2025-11-29 06:41:32.864 187156 DEBUG nova.compute.resource_tracker [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.057 187156 WARNING nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.059 187156 DEBUG nova.compute.resource_tracker [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6198MB free_disk=73.54742050170898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.059 187156 DEBUG oslo_concurrency.lockutils [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.059 187156 DEBUG oslo_concurrency.lockutils [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.543 187156 DEBUG nova.compute.resource_tracker [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.543 187156 DEBUG nova.compute.resource_tracker [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.588 187156 DEBUG nova.scheduler.client.report [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.809 187156 DEBUG nova.scheduler.client.report [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.810 187156 DEBUG nova.compute.provider_tree [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.835 187156 DEBUG nova.scheduler.client.report [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.884 187156 DEBUG nova.scheduler.client.report [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.940 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 01:41:33 np0005539504 nova_compute[187152]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.940 187156 INFO nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.941 187156 DEBUG nova.compute.provider_tree [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.941 187156 DEBUG nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.944 187156 DEBUG nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 01:41:33 np0005539504 nova_compute[187152]:  <arch>x86_64</arch>
Nov 29 01:41:33 np0005539504 nova_compute[187152]:  <model>Nehalem</model>
Nov 29 01:41:33 np0005539504 nova_compute[187152]:  <vendor>AMD</vendor>
Nov 29 01:41:33 np0005539504 nova_compute[187152]:  <topology sockets="8" cores="1" threads="1"/>
Nov 29 01:41:33 np0005539504 nova_compute[187152]: </cpu>
Nov 29 01:41:33 np0005539504 nova_compute[187152]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.964 187156 DEBUG nova.scheduler.client.report [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.988 187156 DEBUG nova.compute.resource_tracker [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.988 187156 DEBUG oslo_concurrency.lockutils [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:41:33 np0005539504 nova_compute[187152]: 2025-11-29 06:41:33.988 187156 DEBUG nova.service [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 29 01:41:34 np0005539504 nova_compute[187152]: 2025-11-29 06:41:34.048 187156 DEBUG nova.service [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 29 01:41:34 np0005539504 nova_compute[187152]: 2025-11-29 06:41:34.048 187156 DEBUG nova.servicegroup.drivers.db [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 29 01:41:34 np0005539504 systemd[1]: session-24.scope: Deactivated successfully.
Nov 29 01:41:34 np0005539504 systemd[1]: session-24.scope: Consumed 2min 4.085s CPU time.
Nov 29 01:41:34 np0005539504 systemd-logind[783]: Session 24 logged out. Waiting for processes to exit.
Nov 29 01:41:34 np0005539504 systemd-logind[783]: Removed session 24.
Nov 29 01:41:40 np0005539504 systemd-logind[783]: New session 26 of user zuul.
Nov 29 01:41:40 np0005539504 systemd[1]: Started Session 26 of User zuul.
Nov 29 01:41:41 np0005539504 python3.9[187623]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:41:42 np0005539504 podman[187779]: 2025-11-29 06:41:42.979298218 +0000 UTC m=+0.115706155 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:41:43 np0005539504 python3.9[187780]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:43 np0005539504 systemd[1]: Reloading.
Nov 29 01:41:43 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:43 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:44 np0005539504 python3.9[187992]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:41:44 np0005539504 network[188009]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:41:44 np0005539504 network[188010]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:41:44 np0005539504 network[188011]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:41:50 np0005539504 python3.9[188287]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:50 np0005539504 podman[188289]: 2025-11-29 06:41:50.628364246 +0000 UTC m=+0.084832356 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 01:41:51 np0005539504 python3.9[188460]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:51 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:41:51 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:41:52 np0005539504 python3.9[188613]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:53 np0005539504 python3.9[188765]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:54 np0005539504 python3.9[188917]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:41:55 np0005539504 python3.9[189071]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:55 np0005539504 systemd[1]: Reloading.
Nov 29 01:41:55 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:55 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:56 np0005539504 python3.9[189258]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:57 np0005539504 python3.9[189411]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:58 np0005539504 python3.9[189561]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:59 np0005539504 python3.9[189715]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:00 np0005539504 podman[189810]: 2025-11-29 06:42:00.015878969 +0000 UTC m=+0.065982215 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:42:00 np0005539504 python3.9[189847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398518.929173-364-37012824344016/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:01 np0005539504 python3.9[190007]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 29 01:42:02 np0005539504 python3.9[190159]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 29 01:42:03 np0005539504 python3.9[190312]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:42:04 np0005539504 python3.9[190472]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:42:06 np0005539504 python3.9[190630]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:06 np0005539504 python3.9[190751]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764398525.6712236-568-188744616364316/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:07 np0005539504 python3.9[190901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:07 np0005539504 python3.9[191022]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764398526.9631193-568-261841323440549/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:08 np0005539504 python3.9[191172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:09 np0005539504 python3.9[191293]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764398528.1768105-568-244553984237007/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:10 np0005539504 python3.9[191443]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:10 np0005539504 python3.9[191595]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:11 np0005539504 python3.9[191747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:12 np0005539504 python3.9[191868]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398531.1832874-745-254884871936245/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:12 np0005539504 python3.9[192018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:13 np0005539504 podman[192068]: 2025-11-29 06:42:13.401711904 +0000 UTC m=+0.126252386 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 01:42:13 np0005539504 python3.9[192105]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:14 np0005539504 python3.9[192271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:14 np0005539504 python3.9[192392]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398533.6897488-745-157096359983454/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:15 np0005539504 python3.9[192542]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:16 np0005539504 python3.9[192663]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398535.1423233-745-106441219547547/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:16 np0005539504 python3.9[192815]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:17 np0005539504 python3.9[192936]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398536.427419-745-176164571437555/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:18 np0005539504 nova_compute[187152]: 2025-11-29 06:42:18.050 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:18 np0005539504 nova_compute[187152]: 2025-11-29 06:42:18.074 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:18 np0005539504 python3.9[193086]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:19 np0005539504 python3.9[193207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398537.6745741-745-235872554205572/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:19 np0005539504 auditd[700]: Audit daemon rotating log files
Nov 29 01:42:19 np0005539504 python3.9[193357]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:20 np0005539504 python3.9[193478]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398539.1683354-745-141017614043492/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:20 np0005539504 podman[193602]: 2025-11-29 06:42:20.77675913 +0000 UTC m=+0.066653394 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:42:20 np0005539504 python3.9[193644]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:21 np0005539504 python3.9[193769]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398540.41267-745-205929906028549/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:22 np0005539504 python3.9[193919]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:22 np0005539504 python3.9[194040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398541.684847-745-279390281104052/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:42:22.895 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:42:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:42:22.895 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:42:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:42:22.896 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:42:23 np0005539504 python3.9[194190]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:24 np0005539504 python3.9[194311]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398542.8786628-745-131045346280526/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:24 np0005539504 python3.9[194461]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:25 np0005539504 python3.9[194582]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398544.3068864-745-248119390597793/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:26 np0005539504 python3.9[194732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:26 np0005539504 python3.9[194808]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:27 np0005539504 python3.9[194958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:27 np0005539504 python3.9[195034]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:28 np0005539504 python3.9[195184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:28 np0005539504 python3.9[195260]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:29 np0005539504 python3.9[195412]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:30 np0005539504 podman[195536]: 2025-11-29 06:42:30.338562506 +0000 UTC m=+0.066144409 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:42:30 np0005539504 python3.9[195577]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.940 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.940 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.940 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.957 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.958 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.958 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.958 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.958 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.959 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.959 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.959 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.959 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.992 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.993 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.993 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:42:30 np0005539504 nova_compute[187152]: 2025-11-29 06:42:30.994 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.180 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.181 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6157MB free_disk=73.54832458496094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.181 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.182 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:42:31 np0005539504 python3.9[195735]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.276 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.277 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.310 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.326 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.328 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:42:31 np0005539504 nova_compute[187152]: 2025-11-29 06:42:31.328 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:42:32 np0005539504 python3.9[195887]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:42:33 np0005539504 systemd[1]: Reloading.
Nov 29 01:42:33 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:33 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:42:33 np0005539504 systemd[1]: Listening on Podman API Socket.
Nov 29 01:42:34 np0005539504 python3.9[196079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:34 np0005539504 python3.9[196202]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.8952208-1411-51448117125832/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:35 np0005539504 python3.9[196278]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:35 np0005539504 python3.9[196401]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.8952208-1411-51448117125832/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:37 np0005539504 python3.9[196553]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 29 01:42:38 np0005539504 python3.9[196705]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:42:39 np0005539504 python3[196857]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:42:39 np0005539504 podman[196892]: 2025-11-29 06:42:39.652142648 +0000 UTC m=+0.057387134 container create 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 01:42:39 np0005539504 podman[196892]: 2025-11-29 06:42:39.618959216 +0000 UTC m=+0.024203692 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 29 01:42:39 np0005539504 python3[196857]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 29 01:42:40 np0005539504 python3.9[197082]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:41 np0005539504 python3.9[197236]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:42 np0005539504 python3.9[197387]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398561.4084191-1603-79723167152144/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:42 np0005539504 python3.9[197463]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:42:42 np0005539504 systemd[1]: Reloading.
Nov 29 01:42:43 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:43 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:42:43 np0005539504 podman[197574]: 2025-11-29 06:42:43.635158581 +0000 UTC m=+0.138538646 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:42:43 np0005539504 python3.9[197575]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:42:43 np0005539504 systemd[1]: Reloading.
Nov 29 01:42:43 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:43 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:42:44 np0005539504 systemd[1]: Starting ceilometer_agent_compute container...
Nov 29 01:42:44 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:42:44 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f25f99c0cade2657b5998da2010abb6dd9fdda286b784dbe7b8faf4e811216/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:44 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f25f99c0cade2657b5998da2010abb6dd9fdda286b784dbe7b8faf4e811216/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:44 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f25f99c0cade2657b5998da2010abb6dd9fdda286b784dbe7b8faf4e811216/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:44 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f25f99c0cade2657b5998da2010abb6dd9fdda286b784dbe7b8faf4e811216/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:44 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.
Nov 29 01:42:44 np0005539504 podman[197640]: 2025-11-29 06:42:44.602000604 +0000 UTC m=+0.448302354 container init 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: + sudo -E kolla_set_configs
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: sudo: unable to send audit message: Operation not permitted
Nov 29 01:42:44 np0005539504 podman[197640]: 2025-11-29 06:42:44.634036775 +0000 UTC m=+0.480338525 container start 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:42:44 np0005539504 podman[197640]: ceilometer_agent_compute
Nov 29 01:42:44 np0005539504 systemd[1]: Started ceilometer_agent_compute container.
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Validating config file
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Copying service configuration files
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: INFO:__main__:Writing out command to execute
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: ++ cat /run_command
Nov 29 01:42:44 np0005539504 podman[197663]: 2025-11-29 06:42:44.695325213 +0000 UTC m=+0.052534184 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: + ARGS=
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: + sudo kolla_copy_cacerts
Nov 29 01:42:44 np0005539504 systemd[1]: 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-1d55be7b569b9051.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:42:44 np0005539504 systemd[1]: 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-1d55be7b569b9051.service: Failed with result 'exit-code'.
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: sudo: unable to send audit message: Operation not permitted
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: + [[ ! -n '' ]]
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: + . kolla_extend_start
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: + umask 0022
Nov 29 01:42:44 np0005539504 ceilometer_agent_compute[197654]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.593 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.594 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.594 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.594 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.594 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.594 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.594 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.594 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.594 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.594 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.595 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.596 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.597 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.598 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.599 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.601 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.602 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.603 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.604 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.605 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.606 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.607 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.608 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.609 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.609 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.609 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.609 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.609 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.628 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.629 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.630 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.740 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.825 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.825 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.825 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.825 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.826 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.826 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.826 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.826 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.826 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.826 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.826 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.827 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.827 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.827 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.827 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.827 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.827 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.827 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.828 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.829 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.830 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.830 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.830 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.830 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.830 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.830 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.830 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.830 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.830 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.831 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.832 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.833 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.834 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.835 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.836 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.836 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.836 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.836 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.836 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.836 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.836 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.836 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.836 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.837 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.838 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.838 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.838 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.838 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.838 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.838 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.838 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.838 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.839 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.840 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.841 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.842 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.843 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.843 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.844 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.845 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.846 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.847 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.847 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.847 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.850 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.857 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.861 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.863 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.864 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.865 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 python3.9[197839]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:45.868 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539504 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:46.115 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:46.217 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:46.217 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:46.217 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197654]: 2025-11-29 06:42:46.228 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 29 01:42:46 np0005539504 virtqemud[186569]: End of file while reading data: Input/output error
Nov 29 01:42:46 np0005539504 virtqemud[186569]: End of file while reading data: Input/output error
Nov 29 01:42:46 np0005539504 systemd[1]: libpod-7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.scope: Deactivated successfully.
Nov 29 01:42:46 np0005539504 systemd[1]: libpod-7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.scope: Consumed 1.425s CPU time.
Nov 29 01:42:46 np0005539504 conmon[197654]: conmon 7ce28180c360137fda4c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.scope/container/memory.events
Nov 29 01:42:46 np0005539504 podman[197849]: 2025-11-29 06:42:46.393474017 +0000 UTC m=+0.469934195 container died 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 01:42:46 np0005539504 systemd[1]: 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-1d55be7b569b9051.timer: Deactivated successfully.
Nov 29 01:42:46 np0005539504 systemd[1]: Stopped /usr/bin/podman healthcheck run 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.
Nov 29 01:42:46 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-userdata-shm.mount: Deactivated successfully.
Nov 29 01:42:46 np0005539504 systemd[1]: var-lib-containers-storage-overlay-46f25f99c0cade2657b5998da2010abb6dd9fdda286b784dbe7b8faf4e811216-merged.mount: Deactivated successfully.
Nov 29 01:42:46 np0005539504 podman[197849]: 2025-11-29 06:42:46.444842467 +0000 UTC m=+0.521302635 container cleanup 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:42:46 np0005539504 podman[197849]: ceilometer_agent_compute
Nov 29 01:42:46 np0005539504 podman[197879]: ceilometer_agent_compute
Nov 29 01:42:46 np0005539504 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 29 01:42:46 np0005539504 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 29 01:42:46 np0005539504 systemd[1]: Starting ceilometer_agent_compute container...
Nov 29 01:42:46 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:42:46 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f25f99c0cade2657b5998da2010abb6dd9fdda286b784dbe7b8faf4e811216/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:46 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f25f99c0cade2657b5998da2010abb6dd9fdda286b784dbe7b8faf4e811216/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:46 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f25f99c0cade2657b5998da2010abb6dd9fdda286b784dbe7b8faf4e811216/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:46 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f25f99c0cade2657b5998da2010abb6dd9fdda286b784dbe7b8faf4e811216/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:46 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.
Nov 29 01:42:46 np0005539504 podman[197892]: 2025-11-29 06:42:46.686244048 +0000 UTC m=+0.135837823 container init 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: + sudo -E kolla_set_configs
Nov 29 01:42:46 np0005539504 podman[197892]: 2025-11-29 06:42:46.718597028 +0000 UTC m=+0.168190803 container start 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: sudo: unable to send audit message: Operation not permitted
Nov 29 01:42:46 np0005539504 podman[197892]: ceilometer_agent_compute
Nov 29 01:42:46 np0005539504 systemd[1]: Started ceilometer_agent_compute container.
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Validating config file
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Copying service configuration files
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 29 01:42:46 np0005539504 podman[197915]: 2025-11-29 06:42:46.785417654 +0000 UTC m=+0.054238999 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: INFO:__main__:Writing out command to execute
Nov 29 01:42:46 np0005539504 systemd[1]: 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-37f642357cbcdbd.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:42:46 np0005539504 systemd[1]: 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-37f642357cbcdbd.service: Failed with result 'exit-code'.
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: ++ cat /run_command
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: + ARGS=
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: + sudo kolla_copy_cacerts
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: sudo: unable to send audit message: Operation not permitted
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: + [[ ! -n '' ]]
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: + . kolla_extend_start
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: + umask 0022
Nov 29 01:42:46 np0005539504 ceilometer_agent_compute[197907]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 29 01:42:47 np0005539504 python3.9[198089]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.717 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.717 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.717 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.717 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.718 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.718 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.718 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.718 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.718 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.718 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.718 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.718 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.718 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.719 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.720 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.721 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.722 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.722 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.722 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.722 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.722 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.722 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.722 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.722 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.723 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.724 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.725 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.726 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.726 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.726 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.726 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.726 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.726 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.726 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.726 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.726 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.727 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.727 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.727 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.727 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.727 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.727 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.727 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.727 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.728 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.728 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.728 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.728 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.728 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.728 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.728 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.728 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.728 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.729 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.730 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.731 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.732 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.732 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.732 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.732 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.732 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.732 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.732 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.732 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.732 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.733 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.734 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.734 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.734 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.734 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.734 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.754 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.756 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.757 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.770 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.907 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.907 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.907 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.907 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.907 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.908 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.909 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.910 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.911 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.912 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.913 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.913 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.913 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.913 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.914 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.914 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.914 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.914 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.914 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.914 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.915 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.916 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.917 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.918 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.918 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.918 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.918 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.918 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.918 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.918 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.918 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.918 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.919 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.919 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.919 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.919 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.919 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.919 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.919 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.919 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.920 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.921 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.922 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.923 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.924 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.925 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.926 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.927 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.928 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.929 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.929 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.929 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.929 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.929 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.934 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.944 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.953 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.953 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.953 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.953 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.954 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.954 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.954 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.954 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:42:47.956 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539504 python3.9[198218]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398567.209866-1699-261636329477236/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:49 np0005539504 python3.9[198370]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 29 01:42:50 np0005539504 python3.9[198524]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:42:51 np0005539504 podman[198648]: 2025-11-29 06:42:51.17588169 +0000 UTC m=+0.082033366 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 01:42:51 np0005539504 python3[198693]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:42:51 np0005539504 podman[198732]: 2025-11-29 06:42:51.668489464 +0000 UTC m=+0.027781478 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 29 01:42:52 np0005539504 podman[198732]: 2025-11-29 06:42:52.032676205 +0000 UTC m=+0.391968209 container create 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible)
Nov 29 01:42:52 np0005539504 python3[198693]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 29 01:42:55 np0005539504 python3.9[198924]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:56 np0005539504 python3.9[199078]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:57 np0005539504 python3.9[199229]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398576.7930653-1858-224055541606952/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:58 np0005539504 python3.9[199305]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:42:58 np0005539504 systemd[1]: Reloading.
Nov 29 01:42:58 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:42:58 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:58 np0005539504 python3.9[199416]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:43:00 np0005539504 systemd[1]: Reloading.
Nov 29 01:43:00 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:00 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:00 np0005539504 systemd[1]: Starting node_exporter container...
Nov 29 01:43:00 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:43:00 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c87299ad6479bbe90a013d48d2e50f8091751d808f5383cd2545ce1820c6c8c7/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:00 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c87299ad6479bbe90a013d48d2e50f8091751d808f5383cd2545ce1820c6c8c7/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:00 np0005539504 podman[199471]: 2025-11-29 06:43:00.460258237 +0000 UTC m=+0.059929212 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:43:00 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c.
Nov 29 01:43:00 np0005539504 podman[199458]: 2025-11-29 06:43:00.475182659 +0000 UTC m=+0.146547601 container init 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.490Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.490Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.490Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=arp
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=bcache
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=bonding
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=cpu
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=edac
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.491Z caller=node_exporter.go:117 level=info collector=filefd
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=netclass
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=netdev
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=netstat
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=nfs
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=nvme
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=softnet
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=systemd
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=xfs
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.492Z caller=node_exporter.go:117 level=info collector=zfs
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.493Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 29 01:43:00 np0005539504 node_exporter[199474]: ts=2025-11-29T06:43:00.493Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 29 01:43:00 np0005539504 podman[199458]: 2025-11-29 06:43:00.500770747 +0000 UTC m=+0.172135599 container start 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:43:00 np0005539504 podman[199458]: node_exporter
Nov 29 01:43:00 np0005539504 systemd[1]: Started node_exporter container.
Nov 29 01:43:00 np0005539504 podman[199501]: 2025-11-29 06:43:00.567604854 +0000 UTC m=+0.054463696 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:43:01 np0005539504 python3.9[199677]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:43:01 np0005539504 systemd[1]: Stopping node_exporter container...
Nov 29 01:43:01 np0005539504 systemd[1]: libpod-060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c.scope: Deactivated successfully.
Nov 29 01:43:01 np0005539504 podman[199681]: 2025-11-29 06:43:01.424777108 +0000 UTC m=+0.057656161 container died 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:43:01 np0005539504 systemd[1]: 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c-55a4cab14ad7641c.timer: Deactivated successfully.
Nov 29 01:43:01 np0005539504 systemd[1]: Stopped /usr/bin/podman healthcheck run 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c.
Nov 29 01:43:01 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:01 np0005539504 systemd[1]: var-lib-containers-storage-overlay-c87299ad6479bbe90a013d48d2e50f8091751d808f5383cd2545ce1820c6c8c7-merged.mount: Deactivated successfully.
Nov 29 01:43:01 np0005539504 podman[199681]: 2025-11-29 06:43:01.464774554 +0000 UTC m=+0.097653617 container cleanup 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:43:01 np0005539504 podman[199681]: node_exporter
Nov 29 01:43:01 np0005539504 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 29 01:43:01 np0005539504 podman[199707]: node_exporter
Nov 29 01:43:01 np0005539504 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 29 01:43:01 np0005539504 systemd[1]: Stopped node_exporter container.
Nov 29 01:43:01 np0005539504 systemd[1]: Starting node_exporter container...
Nov 29 01:43:01 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:43:01 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c87299ad6479bbe90a013d48d2e50f8091751d808f5383cd2545ce1820c6c8c7/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:01 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c87299ad6479bbe90a013d48d2e50f8091751d808f5383cd2545ce1820c6c8c7/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:01 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c.
Nov 29 01:43:01 np0005539504 podman[199720]: 2025-11-29 06:43:01.680720329 +0000 UTC m=+0.124419726 container init 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.695Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.695Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.695Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.696Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.696Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.696Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.696Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.696Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.696Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=arp
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=bcache
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=bonding
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=cpu
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=edac
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=filefd
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=netclass
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=netdev
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=netstat
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=nfs
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=nvme
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=softnet
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=systemd
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=xfs
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=node_exporter.go:117 level=info collector=zfs
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.697Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 29 01:43:01 np0005539504 node_exporter[199736]: ts=2025-11-29T06:43:01.698Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 29 01:43:01 np0005539504 podman[199720]: 2025-11-29 06:43:01.713676696 +0000 UTC m=+0.157376113 container start 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:43:01 np0005539504 podman[199720]: node_exporter
Nov 29 01:43:01 np0005539504 systemd[1]: Started node_exporter container.
Nov 29 01:43:01 np0005539504 podman[199746]: 2025-11-29 06:43:01.790204213 +0000 UTC m=+0.058259147 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:43:02 np0005539504 python3.9[199922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:43:03 np0005539504 python3.9[200045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398581.9431927-1954-194375633980649/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:43:03 np0005539504 python3.9[200197]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 29 01:43:04 np0005539504 python3.9[200349]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:43:05 np0005539504 python3[200501]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:43:07 np0005539504 podman[200513]: 2025-11-29 06:43:07.44723963 +0000 UTC m=+1.529229264 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 29 01:43:07 np0005539504 podman[200608]: 2025-11-29 06:43:07.574774129 +0000 UTC m=+0.021113719 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 29 01:43:08 np0005539504 podman[200608]: 2025-11-29 06:43:08.097286027 +0000 UTC m=+0.543625577 container create 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:43:08 np0005539504 python3[200501]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 29 01:43:09 np0005539504 python3.9[200798]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:09 np0005539504 python3.9[200952]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:43:10 np0005539504 python3.9[201103]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398590.0176153-2113-136616456300727/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:43:11 np0005539504 python3.9[201179]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:43:11 np0005539504 systemd[1]: Reloading.
Nov 29 01:43:11 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:11 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:12 np0005539504 python3.9[201291]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:43:12 np0005539504 systemd[1]: Reloading.
Nov 29 01:43:12 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:12 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:12 np0005539504 systemd[1]: Starting podman_exporter container...
Nov 29 01:43:12 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:43:12 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/369a7daeb6eb2c17c739ba1c4b65d0aa037d0b6cb2d66e39ded0732255c8ca60/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/369a7daeb6eb2c17c739ba1c4b65d0aa037d0b6cb2d66e39ded0732255c8ca60/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f.
Nov 29 01:43:12 np0005539504 podman[201332]: 2025-11-29 06:43:12.998298179 +0000 UTC m=+0.169994571 container init 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:43:13 np0005539504 podman_exporter[201348]: ts=2025-11-29T06:43:13.017Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 29 01:43:13 np0005539504 podman_exporter[201348]: ts=2025-11-29T06:43:13.017Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 29 01:43:13 np0005539504 podman_exporter[201348]: ts=2025-11-29T06:43:13.017Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 29 01:43:13 np0005539504 podman_exporter[201348]: ts=2025-11-29T06:43:13.017Z caller=handler.go:105 level=info collector=container
Nov 29 01:43:13 np0005539504 podman[201332]: 2025-11-29 06:43:13.033536416 +0000 UTC m=+0.205232788 container start 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:43:13 np0005539504 podman[201332]: podman_exporter
Nov 29 01:43:13 np0005539504 systemd[1]: Starting Podman API Service...
Nov 29 01:43:13 np0005539504 systemd[1]: Started podman_exporter container.
Nov 29 01:43:13 np0005539504 systemd[1]: Started Podman API Service.
Nov 29 01:43:13 np0005539504 podman[201359]: time="2025-11-29T06:43:13Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 29 01:43:13 np0005539504 podman[201359]: time="2025-11-29T06:43:13Z" level=info msg="Setting parallel job count to 25"
Nov 29 01:43:13 np0005539504 podman[201359]: time="2025-11-29T06:43:13Z" level=info msg="Using sqlite as database backend"
Nov 29 01:43:13 np0005539504 podman[201359]: time="2025-11-29T06:43:13Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 29 01:43:13 np0005539504 podman[201359]: time="2025-11-29T06:43:13Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 29 01:43:13 np0005539504 podman[201359]: time="2025-11-29T06:43:13Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 29 01:43:13 np0005539504 podman[201359]: @ - - [29/Nov/2025:06:43:13 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 29 01:43:13 np0005539504 podman[201359]: time="2025-11-29T06:43:13Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 29 01:43:13 np0005539504 podman[201357]: 2025-11-29 06:43:13.173287563 +0000 UTC m=+0.128661020 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:43:13 np0005539504 systemd[1]: 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f-161b10bbf8e64d90.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:43:13 np0005539504 systemd[1]: 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f-161b10bbf8e64d90.service: Failed with result 'exit-code'.
Nov 29 01:43:13 np0005539504 podman[201359]: @ - - [29/Nov/2025:06:43:13 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19568 "" "Go-http-client/1.1"
Nov 29 01:43:13 np0005539504 podman_exporter[201348]: ts=2025-11-29T06:43:13.190Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 29 01:43:13 np0005539504 podman_exporter[201348]: ts=2025-11-29T06:43:13.191Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 29 01:43:13 np0005539504 podman_exporter[201348]: ts=2025-11-29T06:43:13.192Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 29 01:43:13 np0005539504 podman[201545]: 2025-11-29 06:43:13.882488751 +0000 UTC m=+0.157668081 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:43:14 np0005539504 python3.9[201546]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:43:14 np0005539504 systemd[1]: Stopping podman_exporter container...
Nov 29 01:43:14 np0005539504 podman[201359]: @ - - [29/Nov/2025:06:43:13 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3437 "" "Go-http-client/1.1"
Nov 29 01:43:14 np0005539504 systemd[1]: libpod-1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f.scope: Deactivated successfully.
Nov 29 01:43:14 np0005539504 podman[201575]: 2025-11-29 06:43:14.138284667 +0000 UTC m=+0.056689204 container died 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:43:14 np0005539504 systemd[1]: 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f-161b10bbf8e64d90.timer: Deactivated successfully.
Nov 29 01:43:14 np0005539504 systemd[1]: Stopped /usr/bin/podman healthcheck run 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f.
Nov 29 01:43:14 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:14 np0005539504 systemd[1]: var-lib-containers-storage-overlay-369a7daeb6eb2c17c739ba1c4b65d0aa037d0b6cb2d66e39ded0732255c8ca60-merged.mount: Deactivated successfully.
Nov 29 01:43:14 np0005539504 podman[201575]: 2025-11-29 06:43:14.472735609 +0000 UTC m=+0.391140116 container cleanup 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:43:14 np0005539504 podman[201575]: podman_exporter
Nov 29 01:43:14 np0005539504 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 29 01:43:14 np0005539504 podman[201604]: podman_exporter
Nov 29 01:43:14 np0005539504 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 29 01:43:14 np0005539504 systemd[1]: Stopped podman_exporter container.
Nov 29 01:43:14 np0005539504 systemd[1]: Starting podman_exporter container...
Nov 29 01:43:14 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:43:14 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/369a7daeb6eb2c17c739ba1c4b65d0aa037d0b6cb2d66e39ded0732255c8ca60/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:14 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/369a7daeb6eb2c17c739ba1c4b65d0aa037d0b6cb2d66e39ded0732255c8ca60/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:14 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f.
Nov 29 01:43:14 np0005539504 podman[201618]: 2025-11-29 06:43:14.702337181 +0000 UTC m=+0.121078595 container init 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:43:14 np0005539504 podman_exporter[201634]: ts=2025-11-29T06:43:14.717Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 29 01:43:14 np0005539504 podman_exporter[201634]: ts=2025-11-29T06:43:14.717Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 29 01:43:14 np0005539504 podman_exporter[201634]: ts=2025-11-29T06:43:14.717Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 29 01:43:14 np0005539504 podman_exporter[201634]: ts=2025-11-29T06:43:14.717Z caller=handler.go:105 level=info collector=container
Nov 29 01:43:14 np0005539504 podman[201359]: @ - - [29/Nov/2025:06:43:14 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 29 01:43:14 np0005539504 podman[201359]: time="2025-11-29T06:43:14Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 29 01:43:14 np0005539504 podman[201618]: 2025-11-29 06:43:14.734283541 +0000 UTC m=+0.153024865 container start 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:43:14 np0005539504 podman[201618]: podman_exporter
Nov 29 01:43:14 np0005539504 systemd[1]: Started podman_exporter container.
Nov 29 01:43:14 np0005539504 podman[201359]: @ - - [29/Nov/2025:06:43:14 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19570 "" "Go-http-client/1.1"
Nov 29 01:43:14 np0005539504 podman_exporter[201634]: ts=2025-11-29T06:43:14.754Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 29 01:43:14 np0005539504 podman_exporter[201634]: ts=2025-11-29T06:43:14.755Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 29 01:43:14 np0005539504 podman_exporter[201634]: ts=2025-11-29T06:43:14.755Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 29 01:43:14 np0005539504 podman[201644]: 2025-11-29 06:43:14.808301201 +0000 UTC m=+0.057547368 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:43:16 np0005539504 python3.9[201819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:43:16 np0005539504 python3.9[201942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398595.5520098-2209-99217991423075/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:43:17 np0005539504 podman[202066]: 2025-11-29 06:43:17.479508636 +0000 UTC m=+0.081548584 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:43:17 np0005539504 systemd[1]: 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-37f642357cbcdbd.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:43:17 np0005539504 systemd[1]: 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-37f642357cbcdbd.service: Failed with result 'exit-code'.
Nov 29 01:43:17 np0005539504 python3.9[202108]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 29 01:43:18 np0005539504 python3.9[202263]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:43:19 np0005539504 python3[202415]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:43:22 np0005539504 podman[202472]: 2025-11-29 06:43:22.132445242 +0000 UTC m=+0.894796417 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:43:22 np0005539504 podman[202428]: 2025-11-29 06:43:22.574154915 +0000 UTC m=+3.130837573 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 29 01:43:22 np0005539504 podman[202541]: 2025-11-29 06:43:22.734215463 +0000 UTC m=+0.055506012 container create 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Nov 29 01:43:22 np0005539504 podman[202541]: 2025-11-29 06:43:22.702622409 +0000 UTC m=+0.023912998 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 29 01:43:22 np0005539504 python3[202415]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 29 01:43:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:43:22.896 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:43:22.897 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:43:22.897 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:26 np0005539504 python3.9[202733]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:27 np0005539504 python3.9[202887]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:43:28 np0005539504 python3.9[203038]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398607.5318217-2368-156013240547912/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:43:28 np0005539504 python3.9[203114]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:43:28 np0005539504 systemd[1]: Reloading.
Nov 29 01:43:28 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:28 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:29 np0005539504 python3.9[203225]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:43:29 np0005539504 systemd[1]: Reloading.
Nov 29 01:43:29 np0005539504 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:29 np0005539504 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:30 np0005539504 systemd[1]: Starting openstack_network_exporter container...
Nov 29 01:43:30 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:43:30 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/268bedc7eb7fe313316d5b6aac6e81803f12635c7aaebefb5b35d8a4c3695865/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:30 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/268bedc7eb7fe313316d5b6aac6e81803f12635c7aaebefb5b35d8a4c3695865/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:30 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/268bedc7eb7fe313316d5b6aac6e81803f12635c7aaebefb5b35d8a4c3695865/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:30 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4.
Nov 29 01:43:30 np0005539504 podman[203265]: 2025-11-29 06:43:30.335235175 +0000 UTC m=+0.145412035 container init 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *bridge.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *coverage.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *datapath.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *iface.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *memory.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *ovnnorthd.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *ovn.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *ovsdbserver.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *pmd_perf.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *pmd_rxq.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: INFO    06:43:30 main.go:48: registering *vswitch.Collector
Nov 29 01:43:30 np0005539504 openstack_network_exporter[203280]: NOTICE  06:43:30 main.go:76: listening on https://:9105/metrics
Nov 29 01:43:30 np0005539504 podman[203265]: 2025-11-29 06:43:30.365618107 +0000 UTC m=+0.175794967 container start 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Nov 29 01:43:30 np0005539504 podman[203265]: openstack_network_exporter
Nov 29 01:43:30 np0005539504 systemd[1]: Started openstack_network_exporter container.
Nov 29 01:43:30 np0005539504 podman[203288]: 2025-11-29 06:43:30.46963448 +0000 UTC m=+0.087929715 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_id=edpm, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 01:43:30 np0005539504 podman[203337]: 2025-11-29 06:43:30.549085949 +0000 UTC m=+0.054532547 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.320 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.322 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.345 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.346 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.346 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:31 np0005539504 python3.9[203485]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:43:31 np0005539504 systemd[1]: Stopping openstack_network_exporter container...
Nov 29 01:43:31 np0005539504 systemd[1]: libpod-892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4.scope: Deactivated successfully.
Nov 29 01:43:31 np0005539504 podman[203489]: 2025-11-29 06:43:31.905989606 +0000 UTC m=+0.070090432 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:43:31 np0005539504 podman[203490]: 2025-11-29 06:43:31.9057308 +0000 UTC m=+0.069504408 container died 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7)
Nov 29 01:43:31 np0005539504 systemd[1]: 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4-15ffd391a73a578e.timer: Deactivated successfully.
Nov 29 01:43:31 np0005539504 systemd[1]: Stopped /usr/bin/podman healthcheck run 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4.
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:43:31 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:31 np0005539504 systemd[1]: var-lib-containers-storage-overlay-268bedc7eb7fe313316d5b6aac6e81803f12635c7aaebefb5b35d8a4c3695865-merged.mount: Deactivated successfully.
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.956 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.956 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.957 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.957 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.991 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.992 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.992 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:31 np0005539504 nova_compute[187152]: 2025-11-29 06:43:31.992 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.174 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.175 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5906MB free_disk=73.3689956665039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.175 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.175 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.304 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.305 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.330 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.352 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.353 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:43:32 np0005539504 nova_compute[187152]: 2025-11-29 06:43:32.353 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:33 np0005539504 nova_compute[187152]: 2025-11-29 06:43:33.335 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:33 np0005539504 nova_compute[187152]: 2025-11-29 06:43:33.335 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:33 np0005539504 podman[203490]: 2025-11-29 06:43:33.499075965 +0000 UTC m=+1.662849603 container cleanup 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, release=1755695350)
Nov 29 01:43:33 np0005539504 podman[203490]: openstack_network_exporter
Nov 29 01:43:33 np0005539504 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 29 01:43:33 np0005539504 podman[203540]: openstack_network_exporter
Nov 29 01:43:33 np0005539504 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 29 01:43:33 np0005539504 systemd[1]: Stopped openstack_network_exporter container.
Nov 29 01:43:33 np0005539504 systemd[1]: Starting openstack_network_exporter container...
Nov 29 01:43:33 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:43:33 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/268bedc7eb7fe313316d5b6aac6e81803f12635c7aaebefb5b35d8a4c3695865/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:33 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/268bedc7eb7fe313316d5b6aac6e81803f12635c7aaebefb5b35d8a4c3695865/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:33 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/268bedc7eb7fe313316d5b6aac6e81803f12635c7aaebefb5b35d8a4c3695865/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:33 np0005539504 systemd[1]: Started /usr/bin/podman healthcheck run 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4.
Nov 29 01:43:33 np0005539504 podman[203552]: 2025-11-29 06:43:33.748572854 +0000 UTC m=+0.128072234 container init 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc.)
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *bridge.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *coverage.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *datapath.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *iface.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *memory.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *ovnnorthd.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *ovn.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *ovsdbserver.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *pmd_perf.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *pmd_rxq.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: INFO    06:43:33 main.go:48: registering *vswitch.Collector
Nov 29 01:43:33 np0005539504 openstack_network_exporter[203567]: NOTICE  06:43:33 main.go:76: listening on https://:9105/metrics
Nov 29 01:43:33 np0005539504 podman[203552]: 2025-11-29 06:43:33.792819259 +0000 UTC m=+0.172318649 container start 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 01:43:33 np0005539504 podman[203552]: openstack_network_exporter
Nov 29 01:43:33 np0005539504 systemd[1]: Started openstack_network_exporter container.
Nov 29 01:43:33 np0005539504 podman[203577]: 2025-11-29 06:43:33.875941981 +0000 UTC m=+0.072075732 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 01:43:36 np0005539504 python3.9[203750]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:43:44 np0005539504 podman[203777]: 2025-11-29 06:43:44.732258277 +0000 UTC m=+0.080118547 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller)
Nov 29 01:43:45 np0005539504 podman[203802]: 2025-11-29 06:43:45.75170355 +0000 UTC m=+0.083803849 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:43:47 np0005539504 podman[203825]: 2025-11-29 06:43:47.726799615 +0000 UTC m=+0.068019010 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 29 01:43:47 np0005539504 systemd[1]: 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-37f642357cbcdbd.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:43:47 np0005539504 systemd[1]: 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5-37f642357cbcdbd.service: Failed with result 'exit-code'.
Nov 29 01:43:52 np0005539504 podman[203844]: 2025-11-29 06:43:52.74655969 +0000 UTC m=+0.085073132 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:44:00 np0005539504 podman[203864]: 2025-11-29 06:44:00.744771694 +0000 UTC m=+0.079537802 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 01:44:02 np0005539504 podman[203884]: 2025-11-29 06:44:02.730533881 +0000 UTC m=+0.070231707 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:44:04 np0005539504 podman[204036]: 2025-11-29 06:44:04.018880786 +0000 UTC m=+0.068174514 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Nov 29 01:44:04 np0005539504 python3.9[204037]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 29 01:44:04 np0005539504 python3.9[204223]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:04 np0005539504 systemd[1]: Started libpod-conmon-35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927.scope.
Nov 29 01:44:05 np0005539504 podman[204224]: 2025-11-29 06:44:05.009403083 +0000 UTC m=+0.109911843 container exec 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:44:05 np0005539504 podman[204224]: 2025-11-29 06:44:05.040609856 +0000 UTC m=+0.141118586 container exec_died 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 01:44:05 np0005539504 systemd[1]: libpod-conmon-35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927.scope: Deactivated successfully.
Nov 29 01:44:05 np0005539504 python3.9[204404]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:05 np0005539504 systemd[1]: Started libpod-conmon-35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927.scope.
Nov 29 01:44:05 np0005539504 podman[204405]: 2025-11-29 06:44:05.945822577 +0000 UTC m=+0.084363915 container exec 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:44:05 np0005539504 podman[204405]: 2025-11-29 06:44:05.977616585 +0000 UTC m=+0.116157913 container exec_died 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 29 01:44:06 np0005539504 systemd[1]: libpod-conmon-35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927.scope: Deactivated successfully.
Nov 29 01:44:06 np0005539504 python3.9[204589]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:07 np0005539504 python3.9[204741]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 29 01:44:08 np0005539504 python3.9[204906]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:08 np0005539504 systemd[1]: Started libpod-conmon-d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7.scope.
Nov 29 01:44:08 np0005539504 podman[204907]: 2025-11-29 06:44:08.368290339 +0000 UTC m=+0.090186222 container exec d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 29 01:44:08 np0005539504 podman[204907]: 2025-11-29 06:44:08.402051766 +0000 UTC m=+0.123947639 container exec_died d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:44:08 np0005539504 systemd[1]: libpod-conmon-d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7.scope: Deactivated successfully.
Nov 29 01:44:09 np0005539504 python3.9[205089]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:09 np0005539504 systemd[1]: Started libpod-conmon-d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7.scope.
Nov 29 01:44:09 np0005539504 podman[205090]: 2025-11-29 06:44:09.170134523 +0000 UTC m=+0.073064047 container exec d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:44:09 np0005539504 podman[205108]: 2025-11-29 06:44:09.23261196 +0000 UTC m=+0.051390176 container exec_died d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 01:44:09 np0005539504 podman[205090]: 2025-11-29 06:44:09.238337875 +0000 UTC m=+0.141267389 container exec_died d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:44:09 np0005539504 systemd[1]: libpod-conmon-d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7.scope: Deactivated successfully.
Nov 29 01:44:09 np0005539504 python3.9[205272]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:10 np0005539504 python3.9[205424]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 29 01:44:11 np0005539504 python3.9[205589]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:11 np0005539504 systemd[1]: Started libpod-conmon-494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d.scope.
Nov 29 01:44:11 np0005539504 podman[205590]: 2025-11-29 06:44:11.564116161 +0000 UTC m=+0.089191557 container exec 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:44:11 np0005539504 podman[205610]: 2025-11-29 06:44:11.67155606 +0000 UTC m=+0.091364752 container exec_died 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:44:11 np0005539504 podman[205590]: 2025-11-29 06:44:11.676106087 +0000 UTC m=+0.201181373 container exec_died 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 01:44:11 np0005539504 systemd[1]: libpod-conmon-494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d.scope: Deactivated successfully.
Nov 29 01:44:12 np0005539504 python3.9[205772]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:12 np0005539504 systemd[1]: Started libpod-conmon-494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d.scope.
Nov 29 01:44:12 np0005539504 podman[205773]: 2025-11-29 06:44:12.535978325 +0000 UTC m=+0.093201139 container exec 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 01:44:12 np0005539504 podman[205773]: 2025-11-29 06:44:12.566630504 +0000 UTC m=+0.123853298 container exec_died 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 01:44:12 np0005539504 systemd[1]: libpod-conmon-494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d.scope: Deactivated successfully.
Nov 29 01:44:13 np0005539504 python3.9[205955]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:14 np0005539504 python3.9[206107]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 29 01:44:14 np0005539504 python3.9[206272]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:14 np0005539504 systemd[1]: Started libpod-conmon-7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.scope.
Nov 29 01:44:14 np0005539504 podman[206273]: 2025-11-29 06:44:14.909165135 +0000 UTC m=+0.074822733 container exec 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:44:14 np0005539504 podman[206273]: 2025-11-29 06:44:14.938798058 +0000 UTC m=+0.104455656 container exec_died 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:44:14 np0005539504 systemd[1]: libpod-conmon-7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.scope: Deactivated successfully.
Nov 29 01:44:15 np0005539504 podman[206291]: 2025-11-29 06:44:15.016881302 +0000 UTC m=+0.104773244 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:44:15 np0005539504 python3.9[206482]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:15 np0005539504 systemd[1]: Started libpod-conmon-7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.scope.
Nov 29 01:44:15 np0005539504 podman[206483]: 2025-11-29 06:44:15.729457158 +0000 UTC m=+0.089028174 container exec 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:44:15 np0005539504 podman[206483]: 2025-11-29 06:44:15.759190273 +0000 UTC m=+0.118761269 container exec_died 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:44:15 np0005539504 systemd[1]: libpod-conmon-7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5.scope: Deactivated successfully.
Nov 29 01:44:15 np0005539504 podman[206515]: 2025-11-29 06:44:15.857205873 +0000 UTC m=+0.048518664 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:44:16 np0005539504 python3.9[206690]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:17 np0005539504 python3.9[206842]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 29 01:44:17 np0005539504 podman[207007]: 2025-11-29 06:44:17.910421503 +0000 UTC m=+0.085794051 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:44:18 np0005539504 python3.9[207008]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:18 np0005539504 systemd[1]: Started libpod-conmon-060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c.scope.
Nov 29 01:44:18 np0005539504 podman[207028]: 2025-11-29 06:44:18.15158664 +0000 UTC m=+0.136415087 container exec 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:44:18 np0005539504 podman[207028]: 2025-11-29 06:44:18.187713248 +0000 UTC m=+0.172541655 container exec_died 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:44:18 np0005539504 systemd[1]: libpod-conmon-060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c.scope: Deactivated successfully.
Nov 29 01:44:18 np0005539504 python3.9[207211]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:19 np0005539504 systemd[1]: Started libpod-conmon-060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c.scope.
Nov 29 01:44:19 np0005539504 podman[207212]: 2025-11-29 06:44:19.030148564 +0000 UTC m=+0.085150935 container exec 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:44:19 np0005539504 podman[207212]: 2025-11-29 06:44:19.065131613 +0000 UTC m=+0.120134014 container exec_died 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:44:19 np0005539504 systemd[1]: libpod-conmon-060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c.scope: Deactivated successfully.
Nov 29 01:44:19 np0005539504 python3.9[207395]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:20 np0005539504 python3.9[207547]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 29 01:44:21 np0005539504 python3.9[207712]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:21 np0005539504 systemd[1]: Started libpod-conmon-1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f.scope.
Nov 29 01:44:21 np0005539504 podman[207713]: 2025-11-29 06:44:21.436596894 +0000 UTC m=+0.078932671 container exec 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:44:21 np0005539504 podman[207713]: 2025-11-29 06:44:21.46987793 +0000 UTC m=+0.112213687 container exec_died 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:44:21 np0005539504 systemd[1]: libpod-conmon-1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f.scope: Deactivated successfully.
Nov 29 01:44:22 np0005539504 python3.9[207896]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:22 np0005539504 systemd[1]: Started libpod-conmon-1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f.scope.
Nov 29 01:44:22 np0005539504 podman[207897]: 2025-11-29 06:44:22.461160832 +0000 UTC m=+0.245481251 container exec 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:44:22 np0005539504 podman[207897]: 2025-11-29 06:44:22.495707534 +0000 UTC m=+0.280027923 container exec_died 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:44:22 np0005539504 systemd[1]: libpod-conmon-1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f.scope: Deactivated successfully.
Nov 29 01:44:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:44:22.898 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:44:22.900 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:44:22.900 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:23 np0005539504 podman[208050]: 2025-11-29 06:44:23.175892876 +0000 UTC m=+0.071906930 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 01:44:23 np0005539504 python3.9[208096]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:24 np0005539504 python3.9[208251]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 29 01:44:25 np0005539504 python3.9[208416]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:25 np0005539504 systemd[1]: Started libpod-conmon-892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4.scope.
Nov 29 01:44:25 np0005539504 podman[208417]: 2025-11-29 06:44:25.184948693 +0000 UTC m=+0.081771507 container exec 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm)
Nov 29 01:44:25 np0005539504 podman[208417]: 2025-11-29 06:44:25.222005562 +0000 UTC m=+0.118828346 container exec_died 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, version=9.6, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter)
Nov 29 01:44:25 np0005539504 systemd[1]: libpod-conmon-892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4.scope: Deactivated successfully.
Nov 29 01:44:26 np0005539504 python3.9[208600]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:26 np0005539504 systemd[1]: Started libpod-conmon-892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4.scope.
Nov 29 01:44:26 np0005539504 podman[208601]: 2025-11-29 06:44:26.176592715 +0000 UTC m=+0.083679227 container exec 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, 
release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter)
Nov 29 01:44:26 np0005539504 podman[208601]: 2025-11-29 06:44:26.211035305 +0000 UTC m=+0.118121807 container exec_died 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Nov 29 01:44:26 np0005539504 systemd[1]: libpod-conmon-892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4.scope: Deactivated successfully.
Nov 29 01:44:26 np0005539504 python3.9[208782]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:30 np0005539504 nova_compute[187152]: 2025-11-29 06:44:30.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:30 np0005539504 nova_compute[187152]: 2025-11-29 06:44:30.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:30 np0005539504 nova_compute[187152]: 2025-11-29 06:44:30.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:31 np0005539504 podman[208809]: 2025-11-29 06:44:31.770608086 +0000 UTC m=+0.102572967 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:44:31 np0005539504 nova_compute[187152]: 2025-11-29 06:44:31.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:32 np0005539504 nova_compute[187152]: 2025-11-29 06:44:32.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:32 np0005539504 nova_compute[187152]: 2025-11-29 06:44:32.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:44:33 np0005539504 podman[208831]: 2025-11-29 06:44:33.615483616 +0000 UTC m=+0.067284915 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:44:33 np0005539504 nova_compute[187152]: 2025-11-29 06:44:33.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:33 np0005539504 nova_compute[187152]: 2025-11-29 06:44:33.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:44:33 np0005539504 nova_compute[187152]: 2025-11-29 06:44:33.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.313 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.314 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.314 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.411 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.412 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.412 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.412 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.602 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.604 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5994MB free_disk=73.37850570678711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.605 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:34 np0005539504 nova_compute[187152]: 2025-11-29 06:44:34.605 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:34 np0005539504 podman[208855]: 2025-11-29 06:44:34.735819648 +0000 UTC m=+0.080397129 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Nov 29 01:44:35 np0005539504 nova_compute[187152]: 2025-11-29 06:44:35.872 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:44:35 np0005539504 nova_compute[187152]: 2025-11-29 06:44:35.874 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:44:35 np0005539504 nova_compute[187152]: 2025-11-29 06:44:35.955 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:44:37 np0005539504 nova_compute[187152]: 2025-11-29 06:44:37.233 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:44:37 np0005539504 nova_compute[187152]: 2025-11-29 06:44:37.236 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:44:37 np0005539504 nova_compute[187152]: 2025-11-29 06:44:37.237 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:37 np0005539504 nova_compute[187152]: 2025-11-29 06:44:37.860 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:45 np0005539504 podman[208879]: 2025-11-29 06:44:45.569548424 +0000 UTC m=+0.093515113 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:44:46 np0005539504 podman[208906]: 2025-11-29 06:44:46.788457963 +0000 UTC m=+0.132668688 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:44:47.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539504 podman[208931]: 2025-11-29 06:44:48.723834464 +0000 UTC m=+0.066674339 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 29 01:44:53 np0005539504 podman[208951]: 2025-11-29 06:44:53.739025226 +0000 UTC m=+0.076363900 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:44:59 np0005539504 python3.9[209098]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:59 np0005539504 python3.9[209250]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:00 np0005539504 python3.9[209373]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398699.3723128-3211-271936876969728/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:01 np0005539504 python3.9[209525]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:02 np0005539504 podman[209649]: 2025-11-29 06:45:02.04205752 +0000 UTC m=+0.085516388 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 01:45:02 np0005539504 python3.9[209694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:02 np0005539504 python3.9[209775]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:03 np0005539504 python3.9[209927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:03 np0005539504 podman[209953]: 2025-11-29 06:45:03.729659298 +0000 UTC m=+0.058918590 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:45:03 np0005539504 python3.9[210030]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.y73tbalf recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:04 np0005539504 python3.9[210182]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:05 np0005539504 podman[210232]: 2025-11-29 06:45:05.182513538 +0000 UTC m=+0.094803068 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal)
Nov 29 01:45:05 np0005539504 python3.9[210281]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:06 np0005539504 python3.9[210433]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:45:07 np0005539504 python3[210586]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:45:08 np0005539504 python3.9[210738]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:08 np0005539504 python3.9[210816]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:09 np0005539504 python3.9[210968]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:09 np0005539504 python3.9[211046]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:10 np0005539504 python3.9[211198]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:11 np0005539504 python3.9[211276]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:11 np0005539504 python3.9[211428]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:12 np0005539504 python3.9[211506]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:13 np0005539504 python3.9[211658]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:13 np0005539504 python3.9[211783]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398712.462801-3586-39748925412188/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:14 np0005539504 python3.9[211935]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:15 np0005539504 python3.9[212087]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:45:15 np0005539504 podman[212161]: 2025-11-29 06:45:15.814149646 +0000 UTC m=+0.131155298 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 01:45:16 np0005539504 python3.9[212268]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:17 np0005539504 podman[212392]: 2025-11-29 06:45:17.047736261 +0000 UTC m=+0.066150474 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:45:17 np0005539504 python3.9[212438]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:45:18 np0005539504 python3.9[212599]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:45:18 np0005539504 python3.9[212753]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:45:19 np0005539504 podman[212781]: 2025-11-29 06:45:19.077801295 +0000 UTC m=+0.069026303 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:45:19 np0005539504 python3.9[212928]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:20 np0005539504 systemd[1]: session-26.scope: Deactivated successfully.
Nov 29 01:45:20 np0005539504 systemd[1]: session-26.scope: Consumed 1min 47.106s CPU time.
Nov 29 01:45:20 np0005539504 systemd-logind[783]: Session 26 logged out. Waiting for processes to exit.
Nov 29 01:45:20 np0005539504 systemd-logind[783]: Removed session 26.
Nov 29 01:45:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:45:22.899 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:45:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:45:22.901 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:45:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:45:22.902 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:45:24 np0005539504 podman[212953]: 2025-11-29 06:45:24.754689551 +0000 UTC m=+0.086504504 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:45:31 np0005539504 nova_compute[187152]: 2025-11-29 06:45:31.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:31 np0005539504 nova_compute[187152]: 2025-11-29 06:45:31.940 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:32 np0005539504 podman[212973]: 2025-11-29 06:45:32.80288988 +0000 UTC m=+0.109427054 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 01:45:32 np0005539504 nova_compute[187152]: 2025-11-29 06:45:32.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:32 np0005539504 nova_compute[187152]: 2025-11-29 06:45:32.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:34 np0005539504 podman[212993]: 2025-11-29 06:45:34.731283413 +0000 UTC m=+0.063784549 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:45:34 np0005539504 nova_compute[187152]: 2025-11-29 06:45:34.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:34 np0005539504 nova_compute[187152]: 2025-11-29 06:45:34.973 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:34 np0005539504 nova_compute[187152]: 2025-11-29 06:45:34.973 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:34 np0005539504 nova_compute[187152]: 2025-11-29 06:45:34.973 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:45:35 np0005539504 podman[213017]: 2025-11-29 06:45:35.7318841 +0000 UTC m=+0.070522191 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, distribution-scope=public, config_id=edpm, vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 01:45:35 np0005539504 nova_compute[187152]: 2025-11-29 06:45:35.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:35 np0005539504 nova_compute[187152]: 2025-11-29 06:45:35.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:45:35 np0005539504 nova_compute[187152]: 2025-11-29 06:45:35.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:45:35 np0005539504 nova_compute[187152]: 2025-11-29 06:45:35.958 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:45:35 np0005539504 nova_compute[187152]: 2025-11-29 06:45:35.958 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:35 np0005539504 nova_compute[187152]: 2025-11-29 06:45:35.998 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:45:35 np0005539504 nova_compute[187152]: 2025-11-29 06:45:35.998 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:45:35 np0005539504 nova_compute[187152]: 2025-11-29 06:45:35.998 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:45:35 np0005539504 nova_compute[187152]: 2025-11-29 06:45:35.998 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.178 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.179 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6028MB free_disk=73.37752532958984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.180 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.180 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.250 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.250 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.277 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.292 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.295 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:45:36 np0005539504 nova_compute[187152]: 2025-11-29 06:45:36.295 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:45:37 np0005539504 nova_compute[187152]: 2025-11-29 06:45:37.274 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:46 np0005539504 podman[213040]: 2025-11-29 06:45:46.778159384 +0000 UTC m=+0.106519337 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:45:47 np0005539504 podman[213066]: 2025-11-29 06:45:47.727923853 +0000 UTC m=+0.065902616 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:45:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:45:47.989 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:45:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:45:47.991 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:45:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:45:47.992 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:45:49 np0005539504 podman[213092]: 2025-11-29 06:45:49.787009656 +0000 UTC m=+0.064880030 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:45:55 np0005539504 podman[213114]: 2025-11-29 06:45:55.730759667 +0000 UTC m=+0.066425521 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Nov 29 01:46:03 np0005539504 podman[213135]: 2025-11-29 06:46:03.712439516 +0000 UTC m=+0.055482084 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 01:46:05 np0005539504 podman[213156]: 2025-11-29 06:46:05.723323772 +0000 UTC m=+0.061305211 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:46:06 np0005539504 podman[213182]: 2025-11-29 06:46:06.725084871 +0000 UTC m=+0.066039710 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 01:46:17 np0005539504 podman[213204]: 2025-11-29 06:46:17.778472556 +0000 UTC m=+0.112193890 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:46:17 np0005539504 podman[213232]: 2025-11-29 06:46:17.841539325 +0000 UTC m=+0.058638679 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:46:20 np0005539504 podman[213257]: 2025-11-29 06:46:20.723419218 +0000 UTC m=+0.070545613 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:46:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:46:22.901 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:46:22.902 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:46:22.902 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:26 np0005539504 podman[213277]: 2025-11-29 06:46:26.731445291 +0000 UTC m=+0.069030502 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:46:30 np0005539504 nova_compute[187152]: 2025-11-29 06:46:30.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:30 np0005539504 nova_compute[187152]: 2025-11-29 06:46:30.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 01:46:30 np0005539504 nova_compute[187152]: 2025-11-29 06:46:30.973 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 01:46:30 np0005539504 nova_compute[187152]: 2025-11-29 06:46:30.974 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:30 np0005539504 nova_compute[187152]: 2025-11-29 06:46:30.975 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 01:46:30 np0005539504 nova_compute[187152]: 2025-11-29 06:46:30.991 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:32 np0005539504 nova_compute[187152]: 2025-11-29 06:46:32.009 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:32 np0005539504 nova_compute[187152]: 2025-11-29 06:46:32.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:33 np0005539504 podman[213299]: 2025-11-29 06:46:33.903532985 +0000 UTC m=+0.062100058 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 01:46:33 np0005539504 nova_compute[187152]: 2025-11-29 06:46:33.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:33 np0005539504 nova_compute[187152]: 2025-11-29 06:46:33.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:34 np0005539504 nova_compute[187152]: 2025-11-29 06:46:34.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:35 np0005539504 nova_compute[187152]: 2025-11-29 06:46:35.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:36 np0005539504 nova_compute[187152]: 2025-11-29 06:46:36.203 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:36 np0005539504 nova_compute[187152]: 2025-11-29 06:46:36.203 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:36 np0005539504 nova_compute[187152]: 2025-11-29 06:46:36.203 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:36 np0005539504 nova_compute[187152]: 2025-11-29 06:46:36.204 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:46:36 np0005539504 nova_compute[187152]: 2025-11-29 06:46:36.387 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:46:36 np0005539504 nova_compute[187152]: 2025-11-29 06:46:36.389 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6067MB free_disk=73.37866592407227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:46:36 np0005539504 nova_compute[187152]: 2025-11-29 06:46:36.390 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:36 np0005539504 nova_compute[187152]: 2025-11-29 06:46:36.390 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:36 np0005539504 podman[213318]: 2025-11-29 06:46:36.726649539 +0000 UTC m=+0.065493720 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:46:37 np0005539504 podman[213343]: 2025-11-29 06:46:37.708182315 +0000 UTC m=+0.054429220 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 29 01:46:38 np0005539504 nova_compute[187152]: 2025-11-29 06:46:38.591 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:46:38 np0005539504 nova_compute[187152]: 2025-11-29 06:46:38.591 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:46:38 np0005539504 nova_compute[187152]: 2025-11-29 06:46:38.653 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:46:38 np0005539504 nova_compute[187152]: 2025-11-29 06:46:38.770 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:46:38 np0005539504 nova_compute[187152]: 2025-11-29 06:46:38.770 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:46:38 np0005539504 nova_compute[187152]: 2025-11-29 06:46:38.797 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:46:38 np0005539504 nova_compute[187152]: 2025-11-29 06:46:38.834 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:46:38 np0005539504 nova_compute[187152]: 2025-11-29 06:46:38.862 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:46:40 np0005539504 nova_compute[187152]: 2025-11-29 06:46:40.590 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:46:40 np0005539504 nova_compute[187152]: 2025-11-29 06:46:40.592 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:46:40 np0005539504 nova_compute[187152]: 2025-11-29 06:46:40.592 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:41 np0005539504 nova_compute[187152]: 2025-11-29 06:46:41.593 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:41 np0005539504 nova_compute[187152]: 2025-11-29 06:46:41.594 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:46:41 np0005539504 nova_compute[187152]: 2025-11-29 06:46:41.594 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:46:46 np0005539504 nova_compute[187152]: 2025-11-29 06:46:46.885 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:46:46 np0005539504 nova_compute[187152]: 2025-11-29 06:46:46.885 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:46 np0005539504 nova_compute[187152]: 2025-11-29 06:46:46.887 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:46 np0005539504 nova_compute[187152]: 2025-11-29 06:46:46.888 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.948 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.949 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:46:47.951 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539504 podman[213366]: 2025-11-29 06:46:48.709757387 +0000 UTC m=+0.053207617 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:46:48 np0005539504 podman[213367]: 2025-11-29 06:46:48.79781706 +0000 UTC m=+0.134213799 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:46:51 np0005539504 podman[213414]: 2025-11-29 06:46:51.750167916 +0000 UTC m=+0.081272830 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:46:57 np0005539504 podman[213434]: 2025-11-29 06:46:57.71517917 +0000 UTC m=+0.059488609 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:47:04 np0005539504 podman[213456]: 2025-11-29 06:47:04.72110703 +0000 UTC m=+0.064897504 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:47:07 np0005539504 podman[213477]: 2025-11-29 06:47:07.719478869 +0000 UTC m=+0.061311098 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:47:07 np0005539504 podman[213502]: 2025-11-29 06:47:07.810363589 +0000 UTC m=+0.063651292 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 01:47:19 np0005539504 podman[213524]: 2025-11-29 06:47:19.716387971 +0000 UTC m=+0.061997526 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:47:19 np0005539504 podman[213525]: 2025-11-29 06:47:19.766047161 +0000 UTC m=+0.104483281 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 01:47:22 np0005539504 podman[213577]: 2025-11-29 06:47:22.369748431 +0000 UTC m=+0.077174868 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:47:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:47:22.902 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:47:22.902 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:47:22.902 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:28 np0005539504 podman[213600]: 2025-11-29 06:47:28.723190889 +0000 UTC m=+0.062149149 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:47:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:47:30.685 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:47:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:47:30.687 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:47:31 np0005539504 nova_compute[187152]: 2025-11-29 06:47:31.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:34 np0005539504 nova_compute[187152]: 2025-11-29 06:47:34.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:34 np0005539504 nova_compute[187152]: 2025-11-29 06:47:34.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:34 np0005539504 nova_compute[187152]: 2025-11-29 06:47:34.982 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:34 np0005539504 nova_compute[187152]: 2025-11-29 06:47:34.982 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:35 np0005539504 podman[213623]: 2025-11-29 06:47:35.762876009 +0000 UTC m=+0.097980103 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:47:36 np0005539504 nova_compute[187152]: 2025-11-29 06:47:36.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:36 np0005539504 nova_compute[187152]: 2025-11-29 06:47:36.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:36 np0005539504 nova_compute[187152]: 2025-11-29 06:47:36.972 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:36 np0005539504 nova_compute[187152]: 2025-11-29 06:47:36.972 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:36 np0005539504 nova_compute[187152]: 2025-11-29 06:47:36.973 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:36 np0005539504 nova_compute[187152]: 2025-11-29 06:47:36.973 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.116 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.117 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6090MB free_disk=73.37817764282227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.117 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.117 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.173 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.174 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.189 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.203 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.205 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:47:37 np0005539504 nova_compute[187152]: 2025-11-29 06:47:37.205 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:47:37.691 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:47:38 np0005539504 nova_compute[187152]: 2025-11-29 06:47:38.205 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:38 np0005539504 nova_compute[187152]: 2025-11-29 06:47:38.206 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:47:38 np0005539504 nova_compute[187152]: 2025-11-29 06:47:38.206 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:47:38 np0005539504 nova_compute[187152]: 2025-11-29 06:47:38.256 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:47:38 np0005539504 nova_compute[187152]: 2025-11-29 06:47:38.256 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:38 np0005539504 nova_compute[187152]: 2025-11-29 06:47:38.256 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:47:38 np0005539504 podman[213643]: 2025-11-29 06:47:38.723010806 +0000 UTC m=+0.061676107 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, 
version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Nov 29 01:47:38 np0005539504 podman[213642]: 2025-11-29 06:47:38.729451831 +0000 UTC m=+0.073635882 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:47:38 np0005539504 nova_compute[187152]: 2025-11-29 06:47:38.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:40 np0005539504 nova_compute[187152]: 2025-11-29 06:47:40.408 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "136733f8-a6e6-4c0c-ad22-5f133d73e12e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:40 np0005539504 nova_compute[187152]: 2025-11-29 06:47:40.408 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "136733f8-a6e6-4c0c-ad22-5f133d73e12e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:40 np0005539504 nova_compute[187152]: 2025-11-29 06:47:40.446 187156 DEBUG nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.387 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.388 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.396 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.396 187156 INFO nova.compute.claims [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.539 187156 DEBUG nova.compute.provider_tree [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.619 187156 DEBUG nova.scheduler.client.report [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.645 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.646 187156 DEBUG nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.706 187156 DEBUG nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.706 187156 DEBUG nova.network.neutron [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.841 187156 INFO nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:47:41 np0005539504 nova_compute[187152]: 2025-11-29 06:47:41.929 187156 DEBUG nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.259 187156 DEBUG nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.261 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.261 187156 INFO nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Creating image(s)#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.262 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "/var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.262 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "/var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.263 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "/var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.263 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.264 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.665 187156 DEBUG nova.network.neutron [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 01:47:42 np0005539504 nova_compute[187152]: 2025-11-29 06:47:42.666 187156 DEBUG nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.215 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.279 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.280 187156 DEBUG nova.virt.images [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] 5d270706-931c-4fd1-846d-ba6ddeac2a79 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.282 187156 DEBUG nova.privsep.utils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.283 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.696 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.701 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.754 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.756 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.492s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:44 np0005539504 nova_compute[187152]: 2025-11-29 06:47:44.769 187156 INFO oslo.privsep.daemon [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpzth_bny0/privsep.sock']#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.536 187156 INFO oslo.privsep.daemon [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.372 213702 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.377 213702 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.379 213702 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.379 213702 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213702#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.622 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.676 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.677 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.678 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.688 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.740 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.742 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.775 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.777 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.778 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.839 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.841 187156 DEBUG nova.virt.disk.api [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Checking if we can resize image /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.842 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.927 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.928 187156 DEBUG nova.virt.disk.api [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Cannot resize image /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.929 187156 DEBUG nova.objects.instance [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lazy-loading 'migration_context' on Instance uuid 136733f8-a6e6-4c0c-ad22-5f133d73e12e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.949 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.950 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Ensure instance console log exists: /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.950 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.951 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.951 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.953 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.958 187156 WARNING nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.966 187156 DEBUG nova.virt.libvirt.host [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.967 187156 DEBUG nova.virt.libvirt.host [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.971 187156 DEBUG nova.virt.libvirt.host [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.973 187156 DEBUG nova.virt.libvirt.host [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.976 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.976 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.977 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.977 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.977 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.977 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.978 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.978 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.978 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.978 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.979 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.979 187156 DEBUG nova.virt.hardware [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.985 187156 DEBUG nova.privsep.utils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 01:47:45 np0005539504 nova_compute[187152]: 2025-11-29 06:47:45.987 187156 DEBUG nova.objects.instance [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lazy-loading 'pci_devices' on Instance uuid 136733f8-a6e6-4c0c-ad22-5f133d73e12e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:47:46 np0005539504 nova_compute[187152]: 2025-11-29 06:47:46.006 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <uuid>136733f8-a6e6-4c0c-ad22-5f133d73e12e</uuid>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <name>instance-00000001</name>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-1663002252</nova:name>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:47:45</nova:creationTime>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:        <nova:user uuid="83ba3691dbca4a50a4da77e6c535a80c">tempest-DeleteServersAdminTestJSON-462261432-project-member</nova:user>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:        <nova:project uuid="7a3f0e9267434a37a5deca60e29262ec">tempest-DeleteServersAdminTestJSON-462261432</nova:project>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <entry name="serial">136733f8-a6e6-4c0c-ad22-5f133d73e12e</entry>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <entry name="uuid">136733f8-a6e6-4c0c-ad22-5f133d73e12e</entry>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk.config"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/console.log" append="off"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:47:46 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:47:46 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:47:46 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:47:46 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 01:47:46 np0005539504 nova_compute[187152]: 2025-11-29 06:47:46.063 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:47:46 np0005539504 nova_compute[187152]: 2025-11-29 06:47:46.064 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:47:46 np0005539504 nova_compute[187152]: 2025-11-29 06:47:46.064 187156 INFO nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Using config drive
Nov 29 01:47:46 np0005539504 nova_compute[187152]: 2025-11-29 06:47:46.586 187156 INFO nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Creating config drive at /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk.config
Nov 29 01:47:46 np0005539504 nova_compute[187152]: 2025-11-29 06:47:46.592 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9jzbdvzt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:47:46 np0005539504 nova_compute[187152]: 2025-11-29 06:47:46.724 187156 DEBUG oslo_concurrency.processutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9jzbdvzt" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:47:46 np0005539504 systemd-machined[153423]: New machine qemu-1-instance-00000001.
Nov 29 01:47:46 np0005539504 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.086 187156 DEBUG nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.088 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.089 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398867.0858166, 136733f8-a6e6-4c0c-ad22-5f133d73e12e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.090 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] VM Resumed (Lifecycle Event)
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.096 187156 INFO nova.virt.libvirt.driver [-] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Instance spawned successfully.
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.096 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.383 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.387 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.588 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.589 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.589 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.590 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.590 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.590 187156 DEBUG nova.virt.libvirt.driver [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.594 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.594 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398867.087012, 136733f8-a6e6-4c0c-ad22-5f133d73e12e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.595 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] VM Started (Lifecycle Event)
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.634 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.639 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.867 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.894 187156 INFO nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Took 5.63 seconds to spawn the instance on the hypervisor.
Nov 29 01:47:47 np0005539504 nova_compute[187152]: 2025-11-29 06:47:47.896 187156 DEBUG nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.111 187156 INFO nova.compute.manager [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Took 7.33 seconds to build instance.
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.158 187156 DEBUG oslo_concurrency.lockutils [None req-061a1f9d-8f8e-4b9e-928f-cbd44dbe60ca 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "136733f8-a6e6-4c0c-ad22-5f133d73e12e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.654 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Acquiring lock "136733f8-a6e6-4c0c-ad22-5f133d73e12e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.655 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Lock "136733f8-a6e6-4c0c-ad22-5f133d73e12e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.656 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Acquiring lock "136733f8-a6e6-4c0c-ad22-5f133d73e12e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.656 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Lock "136733f8-a6e6-4c0c-ad22-5f133d73e12e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.656 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Lock "136733f8-a6e6-4c0c-ad22-5f133d73e12e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.667 187156 INFO nova.compute.manager [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Terminating instance
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.677 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Acquiring lock "refresh_cache-136733f8-a6e6-4c0c-ad22-5f133d73e12e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.678 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Acquired lock "refresh_cache-136733f8-a6e6-4c0c-ad22-5f133d73e12e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:47:48 np0005539504 nova_compute[187152]: 2025-11-29 06:47:48.678 187156 DEBUG nova.network.neutron [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.162 187156 DEBUG nova.network.neutron [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.510 187156 DEBUG nova.network.neutron [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.533 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Releasing lock "refresh_cache-136733f8-a6e6-4c0c-ad22-5f133d73e12e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.534 187156 DEBUG nova.compute.manager [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 01:47:49 np0005539504 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 29 01:47:49 np0005539504 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 2.774s CPU time.
Nov 29 01:47:49 np0005539504 systemd-machined[153423]: Machine qemu-1-instance-00000001 terminated.
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.791 187156 INFO nova.virt.libvirt.driver [-] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Instance destroyed successfully.
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.792 187156 DEBUG nova.objects.instance [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Lazy-loading 'resources' on Instance uuid 136733f8-a6e6-4c0c-ad22-5f133d73e12e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.832 187156 INFO nova.virt.libvirt.driver [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Deleting instance files /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e_del
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.833 187156 INFO nova.virt.libvirt.driver [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Deletion of /var/lib/nova/instances/136733f8-a6e6-4c0c-ad22-5f133d73e12e_del complete
Nov 29 01:47:49 np0005539504 podman[213749]: 2025-11-29 06:47:49.875981154 +0000 UTC m=+0.079162173 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.930 187156 DEBUG nova.virt.libvirt.host [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.931 187156 INFO nova.virt.libvirt.host [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] UEFI support detected
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.933 187156 INFO nova.compute.manager [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.933 187156 DEBUG oslo.service.loopingcall [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.934 187156 DEBUG nova.compute.manager [-] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 01:47:49 np0005539504 nova_compute[187152]: 2025-11-29 06:47:49.934 187156 DEBUG nova.network.neutron [-] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 01:47:50 np0005539504 podman[213778]: 2025-11-29 06:47:50.008262028 +0000 UTC m=+0.101407616 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:47:50 np0005539504 nova_compute[187152]: 2025-11-29 06:47:50.622 187156 DEBUG nova.network.neutron [-] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:47:50 np0005539504 nova_compute[187152]: 2025-11-29 06:47:50.634 187156 DEBUG nova.network.neutron [-] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:47:50 np0005539504 nova_compute[187152]: 2025-11-29 06:47:50.646 187156 INFO nova.compute.manager [-] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Took 0.71 seconds to deallocate network for instance.#033[00m
Nov 29 01:47:50 np0005539504 nova_compute[187152]: 2025-11-29 06:47:50.855 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:50 np0005539504 nova_compute[187152]: 2025-11-29 06:47:50.856 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:50 np0005539504 nova_compute[187152]: 2025-11-29 06:47:50.926 187156 DEBUG nova.compute.provider_tree [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:47:50 np0005539504 nova_compute[187152]: 2025-11-29 06:47:50.998 187156 ERROR nova.scheduler.client.report [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] [req-f6899366-7e8b-43f6-bcab-69854495aef6] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 1c526389-06f6-4ffd-8e90-a84c6c39f0bc.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-f6899366-7e8b-43f6-bcab-69854495aef6"}]}#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.027 187156 DEBUG nova.scheduler.client.report [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.049 187156 DEBUG nova.scheduler.client.report [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.049 187156 DEBUG nova.compute.provider_tree [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.074 187156 DEBUG nova.scheduler.client.report [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.094 187156 DEBUG nova.scheduler.client.report [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.132 187156 DEBUG nova.compute.provider_tree [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.190 187156 DEBUG nova.scheduler.client.report [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Updated inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.192 187156 DEBUG nova.compute.provider_tree [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Updating resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.193 187156 DEBUG nova.compute.provider_tree [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.218 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.248 187156 INFO nova.scheduler.client.report [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Deleted allocations for instance 136733f8-a6e6-4c0c-ad22-5f133d73e12e#033[00m
Nov 29 01:47:51 np0005539504 nova_compute[187152]: 2025-11-29 06:47:51.331 187156 DEBUG oslo_concurrency.lockutils [None req-0e176d7f-128a-4037-9198-bca36598f07c f8e5070ea3f9402bb54ec3bfd256213f f5c31fdbfab04eedb9e90d2a53ff1c33 - - default default] Lock "136733f8-a6e6-4c0c-ad22-5f133d73e12e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:52 np0005539504 podman[213805]: 2025-11-29 06:47:52.7271187 +0000 UTC m=+0.065948514 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:47:53 np0005539504 nova_compute[187152]: 2025-11-29 06:47:53.975 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "8da7b4bd-21cc-42aa-956f-d63624cfe491" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:53 np0005539504 nova_compute[187152]: 2025-11-29 06:47:53.976 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "8da7b4bd-21cc-42aa-956f-d63624cfe491" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.002 187156 DEBUG nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.104 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.105 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.111 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.112 187156 INFO nova.compute.claims [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.234 187156 DEBUG nova.compute.provider_tree [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.253 187156 DEBUG nova.scheduler.client.report [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.280 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.281 187156 DEBUG nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.339 187156 DEBUG nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.340 187156 DEBUG nova.network.neutron [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.355 187156 INFO nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.376 187156 DEBUG nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.528 187156 DEBUG nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.530 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.531 187156 INFO nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Creating image(s)#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.532 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "/var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.532 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "/var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.533 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "/var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.550 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.604 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.605 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.606 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.616 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.682 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.683 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.716 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.717 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.718 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.775 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.777 187156 DEBUG nova.virt.disk.api [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Checking if we can resize image /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.777 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.866 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.868 187156 DEBUG nova.virt.disk.api [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Cannot resize image /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.868 187156 DEBUG nova.objects.instance [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lazy-loading 'migration_context' on Instance uuid 8da7b4bd-21cc-42aa-956f-d63624cfe491 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.887 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.888 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Ensure instance console log exists: /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.889 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.890 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:54 np0005539504 nova_compute[187152]: 2025-11-29 06:47:54.891 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.017 187156 DEBUG nova.network.neutron [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.017 187156 DEBUG nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.020 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.027 187156 WARNING nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.033 187156 DEBUG nova.virt.libvirt.host [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.034 187156 DEBUG nova.virt.libvirt.host [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.037 187156 DEBUG nova.virt.libvirt.host [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.038 187156 DEBUG nova.virt.libvirt.host [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.041 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.041 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.042 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.043 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.043 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.043 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.044 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.044 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.045 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.045 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.046 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.046 187156 DEBUG nova.virt.hardware [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.053 187156 DEBUG nova.objects.instance [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lazy-loading 'pci_devices' on Instance uuid 8da7b4bd-21cc-42aa-956f-d63624cfe491 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.070 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <uuid>8da7b4bd-21cc-42aa-956f-d63624cfe491</uuid>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <name>instance-00000003</name>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-426118704</nova:name>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:47:55</nova:creationTime>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:        <nova:user uuid="83ba3691dbca4a50a4da77e6c535a80c">tempest-DeleteServersAdminTestJSON-462261432-project-member</nova:user>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:        <nova:project uuid="7a3f0e9267434a37a5deca60e29262ec">tempest-DeleteServersAdminTestJSON-462261432</nova:project>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <entry name="serial">8da7b4bd-21cc-42aa-956f-d63624cfe491</entry>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <entry name="uuid">8da7b4bd-21cc-42aa-956f-d63624cfe491</entry>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk.config"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/console.log" append="off"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:47:55 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:47:55 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:47:55 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:47:55 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.129 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.129 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.130 187156 INFO nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Using config drive#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.311 187156 INFO nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Creating config drive at /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk.config#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.318 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpif7ms37w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.443 187156 DEBUG oslo_concurrency.processutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpif7ms37w" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:55 np0005539504 systemd-machined[153423]: New machine qemu-2-instance-00000003.
Nov 29 01:47:55 np0005539504 systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.785 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398875.7851057, 8da7b4bd-21cc-42aa-956f-d63624cfe491 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.786 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.789 187156 DEBUG nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.790 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.796 187156 INFO nova.virt.libvirt.driver [-] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Instance spawned successfully.#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.796 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.809 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.818 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.821 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.821 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.822 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.822 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.823 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.823 187156 DEBUG nova.virt.libvirt.driver [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.851 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.851 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398875.7852945, 8da7b4bd-21cc-42aa-956f-d63624cfe491 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.852 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] VM Started (Lifecycle Event)#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.883 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.887 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.907 187156 INFO nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Took 1.38 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.907 187156 DEBUG nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:47:55 np0005539504 nova_compute[187152]: 2025-11-29 06:47:55.910 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:47:56 np0005539504 nova_compute[187152]: 2025-11-29 06:47:56.022 187156 INFO nova.compute.manager [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Took 1.96 seconds to build instance.#033[00m
Nov 29 01:47:56 np0005539504 nova_compute[187152]: 2025-11-29 06:47:56.054 187156 DEBUG oslo_concurrency.lockutils [None req-bf77b70f-2f98-4798-89b6-771a839a3d5a 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "8da7b4bd-21cc-42aa-956f-d63624cfe491" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:57 np0005539504 nova_compute[187152]: 2025-11-29 06:47:57.795 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "8da7b4bd-21cc-42aa-956f-d63624cfe491" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:57 np0005539504 nova_compute[187152]: 2025-11-29 06:47:57.796 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "8da7b4bd-21cc-42aa-956f-d63624cfe491" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:57 np0005539504 nova_compute[187152]: 2025-11-29 06:47:57.796 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "8da7b4bd-21cc-42aa-956f-d63624cfe491-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:57 np0005539504 nova_compute[187152]: 2025-11-29 06:47:57.797 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "8da7b4bd-21cc-42aa-956f-d63624cfe491-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:57 np0005539504 nova_compute[187152]: 2025-11-29 06:47:57.797 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "8da7b4bd-21cc-42aa-956f-d63624cfe491-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:57 np0005539504 nova_compute[187152]: 2025-11-29 06:47:57.808 187156 INFO nova.compute.manager [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Terminating instance#033[00m
Nov 29 01:47:57 np0005539504 nova_compute[187152]: 2025-11-29 06:47:57.820 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "refresh_cache-8da7b4bd-21cc-42aa-956f-d63624cfe491" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:47:57 np0005539504 nova_compute[187152]: 2025-11-29 06:47:57.820 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquired lock "refresh_cache-8da7b4bd-21cc-42aa-956f-d63624cfe491" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:47:57 np0005539504 nova_compute[187152]: 2025-11-29 06:47:57.820 187156 DEBUG nova.network.neutron [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:47:58 np0005539504 nova_compute[187152]: 2025-11-29 06:47:58.014 187156 DEBUG nova.network.neutron [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:47:58 np0005539504 nova_compute[187152]: 2025-11-29 06:47:58.707 187156 DEBUG nova.network.neutron [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:47:58 np0005539504 nova_compute[187152]: 2025-11-29 06:47:58.722 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Releasing lock "refresh_cache-8da7b4bd-21cc-42aa-956f-d63624cfe491" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:47:58 np0005539504 nova_compute[187152]: 2025-11-29 06:47:58.722 187156 DEBUG nova.compute.manager [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:47:58 np0005539504 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 29 01:47:58 np0005539504 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Consumed 3.234s CPU time.
Nov 29 01:47:58 np0005539504 systemd-machined[153423]: Machine qemu-2-instance-00000003 terminated.
Nov 29 01:47:58 np0005539504 podman[213870]: 2025-11-29 06:47:58.901938394 +0000 UTC m=+0.104736307 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 01:47:58 np0005539504 nova_compute[187152]: 2025-11-29 06:47:58.980 187156 INFO nova.virt.libvirt.driver [-] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Instance destroyed successfully.#033[00m
Nov 29 01:47:58 np0005539504 nova_compute[187152]: 2025-11-29 06:47:58.980 187156 DEBUG nova.objects.instance [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lazy-loading 'resources' on Instance uuid 8da7b4bd-21cc-42aa-956f-d63624cfe491 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.002 187156 INFO nova.virt.libvirt.driver [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Deleting instance files /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491_del#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.004 187156 INFO nova.virt.libvirt.driver [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Deletion of /var/lib/nova/instances/8da7b4bd-21cc-42aa-956f-d63624cfe491_del complete#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.115 187156 INFO nova.compute.manager [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.116 187156 DEBUG oslo.service.loopingcall [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.117 187156 DEBUG nova.compute.manager [-] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.117 187156 DEBUG nova.network.neutron [-] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.371 187156 DEBUG nova.network.neutron [-] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.390 187156 DEBUG nova.network.neutron [-] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.420 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "4a10d3d0-27cb-4116-9924-cf8baaec591d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.420 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.441 187156 INFO nova.compute.manager [-] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Took 0.32 seconds to deallocate network for instance.#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.464 187156 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.628 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.629 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.638 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.639 187156 INFO nova.compute.claims [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.689 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.809 187156 DEBUG nova.compute.provider_tree [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.835 187156 DEBUG nova.scheduler.client.report [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.872 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.873 187156 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.876 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.962 187156 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.962 187156 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:47:59 np0005539504 nova_compute[187152]: 2025-11-29 06:47:59.992 187156 DEBUG nova.compute.provider_tree [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.016 187156 DEBUG nova.scheduler.client.report [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.024 187156 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.060 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.066 187156 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.131 187156 INFO nova.scheduler.client.report [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Deleted allocations for instance 8da7b4bd-21cc-42aa-956f-d63624cfe491#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.247 187156 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.248 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.248 187156 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Creating image(s)#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.249 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "/var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.249 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "/var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.250 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "/var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.264 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.292 187156 DEBUG oslo_concurrency.lockutils [None req-05b57cef-3d54-4361-88d2-52009a125f6c 83ba3691dbca4a50a4da77e6c535a80c 7a3f0e9267434a37a5deca60e29262ec - - default default] Lock "8da7b4bd-21cc-42aa-956f-d63624cfe491" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.496s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.326 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.327 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.328 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.339 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.357 187156 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Automatically allocating a network for project 6d2e7db012114f9eb8e8e1b0123c9974. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.399 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.400 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.432 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.434 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.435 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.495 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.496 187156 DEBUG nova.virt.disk.api [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Checking if we can resize image /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.497 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.582 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.583 187156 DEBUG nova.virt.disk.api [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Cannot resize image /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.584 187156 DEBUG nova.objects.instance [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'migration_context' on Instance uuid 4a10d3d0-27cb-4116-9924-cf8baaec591d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.596 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.597 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Ensure instance console log exists: /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.597 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.597 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:00 np0005539504 nova_compute[187152]: 2025-11-29 06:48:00.598 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:04 np0005539504 nova_compute[187152]: 2025-11-29 06:48:04.790 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764398869.7886055, 136733f8-a6e6-4c0c-ad22-5f133d73e12e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:04 np0005539504 nova_compute[187152]: 2025-11-29 06:48:04.791 187156 INFO nova.compute.manager [-] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:48:04 np0005539504 nova_compute[187152]: 2025-11-29 06:48:04.841 187156 DEBUG nova.compute.manager [None req-7ce1bc31-2360-4093-bfe2-149b4856f9a1 - - - - - -] [instance: 136733f8-a6e6-4c0c-ad22-5f133d73e12e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:06 np0005539504 podman[213914]: 2025-11-29 06:48:06.713698136 +0000 UTC m=+0.056012914 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 01:48:09 np0005539504 podman[213936]: 2025-11-29 06:48:09.714467068 +0000 UTC m=+0.055968813 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:48:09 np0005539504 podman[213937]: 2025-11-29 06:48:09.727091151 +0000 UTC m=+0.064643358 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, version=9.6, io.openshift.expose-services=, release=1755695350)
Nov 29 01:48:09 np0005539504 nova_compute[187152]: 2025-11-29 06:48:09.964 187156 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Automatically allocated network: {'id': '425e933e-ca72-466c-8d2b-499c7ba67318', 'name': 'auto_allocated_network', 'tenant_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['89838152-b4b2-434b-a7d9-d3f897cb4399', 'a56d2d79-817f-461e-9014-0136415cc45e'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-11-29T06:48:00Z', 'updated_at': '2025-11-29T06:48:09Z', 'revision_number': 4, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Nov 29 01:48:09 np0005539504 nova_compute[187152]: 2025-11-29 06:48:09.985 187156 WARNING oslo_policy.policy [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 29 01:48:09 np0005539504 nova_compute[187152]: 2025-11-29 06:48:09.987 187156 WARNING oslo_policy.policy [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 29 01:48:09 np0005539504 nova_compute[187152]: 2025-11-29 06:48:09.990 187156 DEBUG nova.policy [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:48:11 np0005539504 nova_compute[187152]: 2025-11-29 06:48:11.172 187156 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Successfully created port: 52732bd8-180c-4935-84b3-9f7f3e46c276 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:48:12 np0005539504 nova_compute[187152]: 2025-11-29 06:48:12.146 187156 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Successfully updated port: 52732bd8-180c-4935-84b3-9f7f3e46c276 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:48:12 np0005539504 nova_compute[187152]: 2025-11-29 06:48:12.171 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "refresh_cache-4a10d3d0-27cb-4116-9924-cf8baaec591d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:48:12 np0005539504 nova_compute[187152]: 2025-11-29 06:48:12.171 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquired lock "refresh_cache-4a10d3d0-27cb-4116-9924-cf8baaec591d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:48:12 np0005539504 nova_compute[187152]: 2025-11-29 06:48:12.171 187156 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:48:12 np0005539504 nova_compute[187152]: 2025-11-29 06:48:12.500 187156 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:48:12 np0005539504 nova_compute[187152]: 2025-11-29 06:48:12.703 187156 DEBUG nova.compute.manager [req-f6bd1f9c-b0b0-4581-a552-2a10df7d09bc req-0237087c-2620-4aeb-a8c3-c6fa95358572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Received event network-changed-52732bd8-180c-4935-84b3-9f7f3e46c276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:48:12 np0005539504 nova_compute[187152]: 2025-11-29 06:48:12.703 187156 DEBUG nova.compute.manager [req-f6bd1f9c-b0b0-4581-a552-2a10df7d09bc req-0237087c-2620-4aeb-a8c3-c6fa95358572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Refreshing instance network info cache due to event network-changed-52732bd8-180c-4935-84b3-9f7f3e46c276. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:48:12 np0005539504 nova_compute[187152]: 2025-11-29 06:48:12.704 187156 DEBUG oslo_concurrency.lockutils [req-f6bd1f9c-b0b0-4581-a552-2a10df7d09bc req-0237087c-2620-4aeb-a8c3-c6fa95358572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-4a10d3d0-27cb-4116-9924-cf8baaec591d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:48:13 np0005539504 nova_compute[187152]: 2025-11-29 06:48:13.978 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764398878.975922, 8da7b4bd-21cc-42aa-956f-d63624cfe491 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:13 np0005539504 nova_compute[187152]: 2025-11-29 06:48:13.979 187156 INFO nova.compute.manager [-] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.004 187156 DEBUG nova.compute.manager [None req-c14c3292-e9d5-45c9-ab27-4f3829d65ab3 - - - - - -] [instance: 8da7b4bd-21cc-42aa-956f-d63624cfe491] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.262 187156 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Updating instance_info_cache with network_info: [{"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.300 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Releasing lock "refresh_cache-4a10d3d0-27cb-4116-9924-cf8baaec591d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.300 187156 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Instance network_info: |[{"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.301 187156 DEBUG oslo_concurrency.lockutils [req-f6bd1f9c-b0b0-4581-a552-2a10df7d09bc req-0237087c-2620-4aeb-a8c3-c6fa95358572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-4a10d3d0-27cb-4116-9924-cf8baaec591d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.302 187156 DEBUG nova.network.neutron [req-f6bd1f9c-b0b0-4581-a552-2a10df7d09bc req-0237087c-2620-4aeb-a8c3-c6fa95358572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Refreshing network info cache for port 52732bd8-180c-4935-84b3-9f7f3e46c276 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.305 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Start _get_guest_xml network_info=[{"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.310 187156 WARNING nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.327 187156 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.328 187156 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.338 187156 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.339 187156 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.340 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.340 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.341 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.341 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.341 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.342 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.342 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.342 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.342 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.342 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.343 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.343 187156 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.347 187156 DEBUG nova.virt.libvirt.vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-526752650-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-526752650-2',id=5,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d2e7db012114f9eb8e8e1b0123c9974',ramdisk_id='',reservation_id='r-o9m0h1rn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-224859463',owner_user_name='tempest-AutoAllocateNetworkTest-224859463-project-member'},tags=
TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:48:00Z,user_data=None,user_id='7a31c969c2f744a9810fc9890dd7acb2',uuid=4a10d3d0-27cb-4116-9924-cf8baaec591d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.347 187156 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converting VIF {"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.349 187156 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:e1:81,bridge_name='br-int',has_traffic_filtering=True,id=52732bd8-180c-4935-84b3-9f7f3e46c276,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52732bd8-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.351 187156 DEBUG nova.objects.instance [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4a10d3d0-27cb-4116-9924-cf8baaec591d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.370 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <uuid>4a10d3d0-27cb-4116-9924-cf8baaec591d</uuid>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <name>instance-00000005</name>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <nova:name>tempest-tempest.common.compute-instance-526752650-2</nova:name>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:48:14</nova:creationTime>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:        <nova:user uuid="7a31c969c2f744a9810fc9890dd7acb2">tempest-AutoAllocateNetworkTest-224859463-project-member</nova:user>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:        <nova:project uuid="6d2e7db012114f9eb8e8e1b0123c9974">tempest-AutoAllocateNetworkTest-224859463</nova:project>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:        <nova:port uuid="52732bd8-180c-4935-84b3-9f7f3e46c276">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="fdfe:381f:8400::306" ipVersion="6"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.1.0.7" ipVersion="4"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <entry name="serial">4a10d3d0-27cb-4116-9924-cf8baaec591d</entry>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <entry name="uuid">4a10d3d0-27cb-4116-9924-cf8baaec591d</entry>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.config"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:c2:e1:81"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <target dev="tap52732bd8-18"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/console.log" append="off"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:48:14 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:48:14 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:48:14 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:48:14 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.371 187156 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Preparing to wait for external event network-vif-plugged-52732bd8-180c-4935-84b3-9f7f3e46c276 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.371 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.371 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.372 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.372 187156 DEBUG nova.virt.libvirt.vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-526752650-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-526752650-2',id=5,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d2e7db012114f9eb8e8e1b0123c9974',ramdisk_id='',reservation_id='r-o9m0h1rn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-224859463',owner_user_name='tempest-AutoAllocateNetworkTest-224859463-project-memb
er'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:48:00Z,user_data=None,user_id='7a31c969c2f744a9810fc9890dd7acb2',uuid=4a10d3d0-27cb-4116-9924-cf8baaec591d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.373 187156 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converting VIF {"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.373 187156 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:e1:81,bridge_name='br-int',has_traffic_filtering=True,id=52732bd8-180c-4935-84b3-9f7f3e46c276,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52732bd8-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.374 187156 DEBUG os_vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:e1:81,bridge_name='br-int',has_traffic_filtering=True,id=52732bd8-180c-4935-84b3-9f7f3e46c276,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52732bd8-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.414 187156 DEBUG ovsdbapp.backend.ovs_idl [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.415 187156 DEBUG ovsdbapp.backend.ovs_idl [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.415 187156 DEBUG ovsdbapp.backend.ovs_idl [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.416 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.416 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLOUT] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.416 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.417 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.418 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.420 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.429 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.429 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.429 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 01:48:14 np0005539504 nova_compute[187152]: 2025-11-29 06:48:14.430 187156 INFO oslo.privsep.daemon [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpihrpskya/privsep.sock']
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.157 187156 INFO oslo.privsep.daemon [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Spawned new privsep daemon via rootwrap
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.001 213984 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.008 213984 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.012 213984 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.012 213984 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213984
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.526 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.527 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52732bd8-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.528 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap52732bd8-18, col_values=(('external_ids', {'iface-id': '52732bd8-180c-4935-84b3-9f7f3e46c276', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:e1:81', 'vm-uuid': '4a10d3d0-27cb-4116-9924-cf8baaec591d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.531 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:15 np0005539504 NetworkManager[55210]: <info>  [1764398895.5328] manager: (tap52732bd8-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.539 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.541 187156 INFO os_vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:e1:81,bridge_name='br-int',has_traffic_filtering=True,id=52732bd8-180c-4935-84b3-9f7f3e46c276,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52732bd8-18')
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.625 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.626 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.626 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No VIF found with MAC fa:16:3e:c2:e1:81, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 01:48:15 np0005539504 nova_compute[187152]: 2025-11-29 06:48:15.627 187156 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Using config drive
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.187 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.188 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.202 187156 DEBUG nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.310 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.311 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.319 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.319 187156 INFO nova.compute.claims [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Claim successful on node compute-1.ctlplane.example.com
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.345 187156 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Creating config drive at /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.config
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.355 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppapyw9xe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.485 187156 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppapyw9xe" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.500 187156 DEBUG nova.compute.provider_tree [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.504 187156 DEBUG nova.network.neutron [req-f6bd1f9c-b0b0-4581-a552-2a10df7d09bc req-0237087c-2620-4aeb-a8c3-c6fa95358572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Updated VIF entry in instance network info cache for port 52732bd8-180c-4935-84b3-9f7f3e46c276. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.504 187156 DEBUG nova.network.neutron [req-f6bd1f9c-b0b0-4581-a552-2a10df7d09bc req-0237087c-2620-4aeb-a8c3-c6fa95358572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Updating instance_info_cache with network_info: [{"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 01:48:16 np0005539504 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 29 01:48:16 np0005539504 kernel: tap52732bd8-18: entered promiscuous mode
Nov 29 01:48:16 np0005539504 NetworkManager[55210]: <info>  [1764398896.5419] manager: (tap52732bd8-18): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Nov 29 01:48:16 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:16Z|00027|binding|INFO|Claiming lport 52732bd8-180c-4935-84b3-9f7f3e46c276 for this chassis.
Nov 29 01:48:16 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:16Z|00028|binding|INFO|52732bd8-180c-4935-84b3-9f7f3e46c276: Claiming fa:16:3e:c2:e1:81 10.1.0.7 fdfe:381f:8400::306
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.543 187156 DEBUG nova.scheduler.client.report [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.548 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.555 187156 DEBUG oslo_concurrency.lockutils [req-f6bd1f9c-b0b0-4581-a552-2a10df7d09bc req-0237087c-2620-4aeb-a8c3-c6fa95358572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-4a10d3d0-27cb-4116-9924-cf8baaec591d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:48:16 np0005539504 systemd-udevd[214011]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.585 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.586 187156 DEBUG nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 01:48:16 np0005539504 NetworkManager[55210]: <info>  [1764398896.5951] device (tap52732bd8-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:48:16 np0005539504 NetworkManager[55210]: <info>  [1764398896.5976] device (tap52732bd8-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:48:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:16.599 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:e1:81 10.1.0.7 fdfe:381f:8400::306'], port_security=['fa:16:3e:c2:e1:81 10.1.0.7 fdfe:381f:8400::306'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.7/26 fdfe:381f:8400::306/64', 'neutron:device_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-425e933e-ca72-466c-8d2b-499c7ba67318', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5cacaa01-dff2-46af-9e49-4a741508795b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=236265de-856a-468e-8ed3-00d3e824203d, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=52732bd8-180c-4935-84b3-9f7f3e46c276) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:48:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:16.600 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 52732bd8-180c-4935-84b3-9f7f3e46c276 in datapath 425e933e-ca72-466c-8d2b-499c7ba67318 bound to our chassis
Nov 29 01:48:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:16.603 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 425e933e-ca72-466c-8d2b-499c7ba67318
Nov 29 01:48:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:16.604 104164 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpgy3jkn59/privsep.sock']
Nov 29 01:48:16 np0005539504 systemd-machined[153423]: New machine qemu-3-instance-00000005.
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.614 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:16 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:16Z|00029|binding|INFO|Setting lport 52732bd8-180c-4935-84b3-9f7f3e46c276 ovn-installed in OVS
Nov 29 01:48:16 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:16Z|00030|binding|INFO|Setting lport 52732bd8-180c-4935-84b3-9f7f3e46c276 up in Southbound
Nov 29 01:48:16 np0005539504 systemd[1]: Started Virtual Machine qemu-3-instance-00000005.
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.623 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.645 187156 DEBUG nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.646 187156 DEBUG nova.network.neutron [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.671 187156 INFO nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.688 187156 DEBUG nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.810 187156 DEBUG nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.812 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.812 187156 INFO nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Creating image(s)
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.813 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.813 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.814 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.834 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.851 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398896.8424804, 4a10d3d0-27cb-4116-9924-cf8baaec591d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.851 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] VM Started (Lifecycle Event)
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.875 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.882 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398896.8435433, 4a10d3d0-27cb-4116-9924-cf8baaec591d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.883 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] VM Paused (Lifecycle Event)
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.896 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.897 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.898 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.909 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.926 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.930 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.949 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.963 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.964 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.998 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.999 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:16 np0005539504 nova_compute[187152]: 2025-11-29 06:48:16.999 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.061 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.062 187156 DEBUG nova.virt.disk.api [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Checking if we can resize image /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.062 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.093 187156 DEBUG nova.policy [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.149 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.150 187156 DEBUG nova.virt.disk.api [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Cannot resize image /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.150 187156 DEBUG nova.objects.instance [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lazy-loading 'migration_context' on Instance uuid af865d23-0f24-47aa-aeab-1c12d04b5a1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.366 104164 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.367 104164 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpgy3jkn59/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.203 214051 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.210 214051 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.213 214051 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.213 214051 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214051#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.370 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[46ce69e3-3103-4dd7-b208-8059c88cbded]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.787 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.788 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Ensure instance console log exists: /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.789 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.790 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:17 np0005539504 nova_compute[187152]: 2025-11-29 06:48:17.790 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.981 214051 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.981 214051 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:17.981 214051 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:18 np0005539504 nova_compute[187152]: 2025-11-29 06:48:18.332 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:18.717 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5daa317e-512e-46b6-b034-4e95b3ac19cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:18.719 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap425e933e-c1 in ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:48:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:18.721 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap425e933e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:48:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:18.721 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab5e211-3265-46a0-8982-bf71b0cb7a8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:18.724 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fb3b43fb-c3d1-4ef5-b5a7-2817e4d02b4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:18.748 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[3f963354-5f19-4cc6-8fcc-75847d2953cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:18.761 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0364f67e-4d72-45f8-9c7e-f4542f6b0c14]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:18.763 104164 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp497arics/privsep.sock']#033[00m
Nov 29 01:48:18 np0005539504 nova_compute[187152]: 2025-11-29 06:48:18.803 187156 DEBUG nova.network.neutron [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Successfully created port: 60d45f94-ad4f-48ba-a0a9-6b5406aa616c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:48:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:19.813 104164 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 29 01:48:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:19.814 104164 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp497arics/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 29 01:48:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:19.489 214065 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 29 01:48:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:19.567 214065 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 29 01:48:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:19.569 214065 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 29 01:48:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:19.569 214065 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214065#033[00m
Nov 29 01:48:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:19.818 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1988d1ce-358e-4e17-9da2-c2d88c4f4983]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:20 np0005539504 nova_compute[187152]: 2025-11-29 06:48:20.533 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:20.575 214065 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:20.576 214065 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:20.576 214065 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:20 np0005539504 nova_compute[187152]: 2025-11-29 06:48:20.721 187156 DEBUG nova.network.neutron [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Successfully updated port: 60d45f94-ad4f-48ba-a0a9-6b5406aa616c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:48:20 np0005539504 nova_compute[187152]: 2025-11-29 06:48:20.750 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:48:20 np0005539504 nova_compute[187152]: 2025-11-29 06:48:20.751 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquired lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:48:20 np0005539504 nova_compute[187152]: 2025-11-29 06:48:20.751 187156 DEBUG nova.network.neutron [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:48:20 np0005539504 podman[214071]: 2025-11-29 06:48:20.799284912 +0000 UTC m=+0.135706489 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:48:20 np0005539504 podman[214070]: 2025-11-29 06:48:20.801957175 +0000 UTC m=+0.139585964 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:48:21 np0005539504 nova_compute[187152]: 2025-11-29 06:48:21.092 187156 DEBUG nova.compute.manager [req-4794636d-4465-4f56-9645-3121d65e0c21 req-fbc08923-2cdd-4869-80cf-26719d44fbaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-changed-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:48:21 np0005539504 nova_compute[187152]: 2025-11-29 06:48:21.092 187156 DEBUG nova.compute.manager [req-4794636d-4465-4f56-9645-3121d65e0c21 req-fbc08923-2cdd-4869-80cf-26719d44fbaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Refreshing instance network info cache due to event network-changed-60d45f94-ad4f-48ba-a0a9-6b5406aa616c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:48:21 np0005539504 nova_compute[187152]: 2025-11-29 06:48:21.093 187156 DEBUG oslo_concurrency.lockutils [req-4794636d-4465-4f56-9645-3121d65e0c21 req-fbc08923-2cdd-4869-80cf-26719d44fbaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:48:21 np0005539504 nova_compute[187152]: 2025-11-29 06:48:21.309 187156 DEBUG nova.network.neutron [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.315 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[8ddfdb56-84a2-4c39-9910-d1f02d8b6820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.348 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e5315b-5312-4b51-b1d0-102784efe014]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 NetworkManager[55210]: <info>  [1764398901.3506] manager: (tap425e933e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Nov 29 01:48:21 np0005539504 systemd-udevd[214121]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.390 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f555b623-b732-472b-9a38-c0041b9f7b10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.394 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[3b48cfc4-1b64-454a-9d94-b2dc6f8dba4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 NetworkManager[55210]: <info>  [1764398901.4298] device (tap425e933e-c0): carrier: link connected
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.439 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[3fdafaf9-186f-44fc-8d30-059370f9d540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.462 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[66846a4d-2a62-4272-b805-cb8401f44b7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap425e933e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:96:d2:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436430, 'reachable_time': 32584, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214139, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.491 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[273e56a2-d19b-46e7-a857-7c2dcff9e078]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe96:d291'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436430, 'tstamp': 436430}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214140, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.506 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f55f0299-b6f3-4991-89c5-45256b341873]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap425e933e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:96:d2:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436430, 'reachable_time': 32584, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214141, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.541 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0fbebb5b-fbc4-4509-ba73-d2c29af0c29f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.594 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[95c608d6-af79-4dbb-bff6-dbee4c125d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.597 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap425e933e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.598 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.599 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap425e933e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:21 np0005539504 NetworkManager[55210]: <info>  [1764398901.6034] manager: (tap425e933e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 29 01:48:21 np0005539504 kernel: tap425e933e-c0: entered promiscuous mode
Nov 29 01:48:21 np0005539504 nova_compute[187152]: 2025-11-29 06:48:21.603 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:21 np0005539504 nova_compute[187152]: 2025-11-29 06:48:21.608 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.609 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap425e933e-c0, col_values=(('external_ids', {'iface-id': 'c143daec-964e-4591-a13b-43e2014d70b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:21 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:21Z|00031|binding|INFO|Releasing lport c143daec-964e-4591-a13b-43e2014d70b5 from this chassis (sb_readonly=0)
Nov 29 01:48:21 np0005539504 nova_compute[187152]: 2025-11-29 06:48:21.638 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.640 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/425e933e-ca72-466c-8d2b-499c7ba67318.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/425e933e-ca72-466c-8d2b-499c7ba67318.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.641 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[031d4e5e-1198-4d40-a9a6-534f8d02f75f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.643 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-425e933e-ca72-466c-8d2b-499c7ba67318
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/425e933e-ca72-466c-8d2b-499c7ba67318.pid.haproxy
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 425e933e-ca72-466c-8d2b-499c7ba67318
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:48:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:21.644 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'env', 'PROCESS_TAG=haproxy-425e933e-ca72-466c-8d2b-499c7ba67318', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/425e933e-ca72-466c-8d2b-499c7ba67318.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.137 187156 DEBUG nova.network.neutron [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.157 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Releasing lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.158 187156 DEBUG nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Instance network_info: |[{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.158 187156 DEBUG oslo_concurrency.lockutils [req-4794636d-4465-4f56-9645-3121d65e0c21 req-fbc08923-2cdd-4869-80cf-26719d44fbaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.158 187156 DEBUG nova.network.neutron [req-4794636d-4465-4f56-9645-3121d65e0c21 req-fbc08923-2cdd-4869-80cf-26719d44fbaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Refreshing network info cache for port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.160 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Start _get_guest_xml network_info=[{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.165 187156 WARNING nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.170 187156 DEBUG nova.virt.libvirt.host [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.170 187156 DEBUG nova.virt.libvirt.host [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.175 187156 DEBUG nova.virt.libvirt.host [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.175 187156 DEBUG nova.virt.libvirt.host [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.176 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.176 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.177 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.177 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.177 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.177 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.178 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.178 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.178 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.178 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.178 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.179 187156 DEBUG nova.virt.hardware [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.183 187156 DEBUG nova.virt.libvirt.vif [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:48:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1791593514',display_name='tempest-LiveMigrationTest-server-1791593514',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1791593514',id=7,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-4vq3oq0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-m
ember'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:48:16Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.183 187156 DEBUG nova.network.os_vif_util [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converting VIF {"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.184 187156 DEBUG nova.network.os_vif_util [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.185 187156 DEBUG nova.objects.instance [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lazy-loading 'pci_devices' on Instance uuid af865d23-0f24-47aa-aeab-1c12d04b5a1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.200 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <uuid>af865d23-0f24-47aa-aeab-1c12d04b5a1e</uuid>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <name>instance-00000007</name>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <nova:name>tempest-LiveMigrationTest-server-1791593514</nova:name>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:48:22</nova:creationTime>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:        <nova:user uuid="a01fd01629a1493bb3fb6df5a2462226">tempest-LiveMigrationTest-440211682-project-member</nova:user>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:        <nova:project uuid="2b6eb92d93c24eaaa0c6a3104a54633a">tempest-LiveMigrationTest-440211682</nova:project>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:        <nova:port uuid="60d45f94-ad4f-48ba-a0a9-6b5406aa616c">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <entry name="serial">af865d23-0f24-47aa-aeab-1c12d04b5a1e</entry>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <entry name="uuid">af865d23-0f24-47aa-aeab-1c12d04b5a1e</entry>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:86:09:58"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <target dev="tap60d45f94-ad"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/console.log" append="off"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:48:22 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:48:22 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:48:22 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:48:22 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.202 187156 DEBUG nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Preparing to wait for external event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.202 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.202 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.202 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.203 187156 DEBUG nova.virt.libvirt.vif [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:48:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1791593514',display_name='tempest-LiveMigrationTest-server-1791593514',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1791593514',id=7,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-4vq3oq0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:48:16Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.203 187156 DEBUG nova.network.os_vif_util [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converting VIF {"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.204 187156 DEBUG nova.network.os_vif_util [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.204 187156 DEBUG os_vif [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.205 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.205 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.205 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.219 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.219 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60d45f94-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.220 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60d45f94-ad, col_values=(('external_ids', {'iface-id': '60d45f94-ad4f-48ba-a0a9-6b5406aa616c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:09:58', 'vm-uuid': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:22 np0005539504 NetworkManager[55210]: <info>  [1764398902.2691] manager: (tap60d45f94-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.268 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.271 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.273 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:22 np0005539504 nova_compute[187152]: 2025-11-29 06:48:22.275 187156 INFO os_vif [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad')#033[00m
Nov 29 01:48:22 np0005539504 podman[214176]: 2025-11-29 06:48:22.364224873 +0000 UTC m=+0.032198496 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:48:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:22.903 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:22.904 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:22.905 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:23 np0005539504 podman[214176]: 2025-11-29 06:48:23.23724436 +0000 UTC m=+0.905217993 container create 0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 01:48:23 np0005539504 nova_compute[187152]: 2025-11-29 06:48:23.391 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:23 np0005539504 systemd[1]: Started libpod-conmon-0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024.scope.
Nov 29 01:48:23 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:48:23 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac2c284a62f6d93bf6fc3f34a1ff10a0a83b78cb77356cd528fd2b36228e8df8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:48:23 np0005539504 podman[214176]: 2025-11-29 06:48:23.583578922 +0000 UTC m=+1.251552535 container init 0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 01:48:23 np0005539504 podman[214176]: 2025-11-29 06:48:23.593477381 +0000 UTC m=+1.261451004 container start 0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 01:48:23 np0005539504 podman[214190]: 2025-11-29 06:48:23.593992095 +0000 UTC m=+0.223675869 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:48:23 np0005539504 nova_compute[187152]: 2025-11-29 06:48:23.596 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:48:23 np0005539504 nova_compute[187152]: 2025-11-29 06:48:23.597 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:48:23 np0005539504 nova_compute[187152]: 2025-11-29 06:48:23.597 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] No VIF found with MAC fa:16:3e:86:09:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:48:23 np0005539504 nova_compute[187152]: 2025-11-29 06:48:23.598 187156 INFO nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Using config drive#033[00m
Nov 29 01:48:23 np0005539504 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214205]: [NOTICE]   (214217) : New worker (214219) forked
Nov 29 01:48:23 np0005539504 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214205]: [NOTICE]   (214217) : Loading success.
Nov 29 01:48:24 np0005539504 nova_compute[187152]: 2025-11-29 06:48:24.849 187156 INFO nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Creating config drive at /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config#033[00m
Nov 29 01:48:24 np0005539504 nova_compute[187152]: 2025-11-29 06:48:24.864 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqz55yopm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.001 187156 DEBUG oslo_concurrency.processutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqz55yopm" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:25 np0005539504 kernel: tap60d45f94-ad: entered promiscuous mode
Nov 29 01:48:25 np0005539504 NetworkManager[55210]: <info>  [1764398905.0774] manager: (tap60d45f94-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Nov 29 01:48:25 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:25Z|00032|binding|INFO|Claiming lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c for this chassis.
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.080 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:25 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:25Z|00033|binding|INFO|60d45f94-ad4f-48ba-a0a9-6b5406aa616c: Claiming fa:16:3e:86:09:58 10.100.0.4
Nov 29 01:48:25 np0005539504 systemd-udevd[214245]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:48:25 np0005539504 systemd-machined[153423]: New machine qemu-4-instance-00000007.
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.123 187156 DEBUG nova.network.neutron [req-4794636d-4465-4f56-9645-3121d65e0c21 req-fbc08923-2cdd-4869-80cf-26719d44fbaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updated VIF entry in instance network info cache for port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.123 187156 DEBUG nova.network.neutron [req-4794636d-4465-4f56-9645-3121d65e0c21 req-fbc08923-2cdd-4869-80cf-26719d44fbaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:48:25 np0005539504 NetworkManager[55210]: <info>  [1764398905.1261] device (tap60d45f94-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:48:25 np0005539504 NetworkManager[55210]: <info>  [1764398905.1270] device (tap60d45f94-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.156 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:09:58 10.100.0.4'], port_security=['fa:16:3e:86:09:58 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=60d45f94-ad4f-48ba-a0a9-6b5406aa616c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.158 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 bound to our chassis#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.159 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.161 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:25 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:25Z|00034|binding|INFO|Setting lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c ovn-installed in OVS
Nov 29 01:48:25 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:25Z|00035|binding|INFO|Setting lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c up in Southbound
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.164 187156 DEBUG oslo_concurrency.lockutils [req-4794636d-4465-4f56-9645-3121d65e0c21 req-fbc08923-2cdd-4869-80cf-26719d44fbaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.165 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:25 np0005539504 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.173 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[efc8abb2-2b64-4d5b-9ef8-89a02a4c69bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.175 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24ee44f0-21 in ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.176 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24ee44f0-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.177 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0a2b5919-fd14-486d-8940-07fa1304b927]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.178 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[acecd256-33a9-4318-aecc-d5d64aaaeb59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.205 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c05e00-d9e5-4dd2-9121-fb6b24846c33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.219 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8fff15-95f7-4657-be79-1f8e4addb98d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.250 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae6e5f5-e245-49f2-b35b-ed8f317887d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 NetworkManager[55210]: <info>  [1764398905.2575] manager: (tap24ee44f0-20): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.257 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[33e8adaa-64f5-45e7-bd41-b54322c662a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.295 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbdaeb3-7141-4522-abdb-65f4b47157c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.299 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f7c706ed-a49f-45bb-bf0e-0cddc8ba747f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 NetworkManager[55210]: <info>  [1764398905.3280] device (tap24ee44f0-20): carrier: link connected
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.342 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[34316a6d-2e3c-40f0-87f2-8e5e3bfc3721]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.368 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f87bfce2-f6c2-4384-9220-67e276eed3e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436820, 'reachable_time': 22715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214280, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.388 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8cdf0a-c4e1-4cc7-a131-bc761d423aec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:940c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436820, 'tstamp': 436820}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214281, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.407 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[023b4da1-715a-46e2-86a7-a1b6113c041c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436820, 'reachable_time': 22715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214282, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.441 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c7c29b-fac6-4d2b-834d-2919b138bfec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.500 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2c13f20c-16dd-4c68-9ae7-40c899f9c143]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.504 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.505 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.506 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24ee44f0-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.542 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:25 np0005539504 kernel: tap24ee44f0-20: entered promiscuous mode
Nov 29 01:48:25 np0005539504 NetworkManager[55210]: <info>  [1764398905.5494] manager: (tap24ee44f0-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.549 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.550 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24ee44f0-20, col_values=(('external_ids', {'iface-id': 'ffbd3b8f-7e45-45d4-84ce-cd74c712f992'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.551 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:25 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:25Z|00036|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.554 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.554 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.563 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[37480b91-7b54-4592-a7ec-4c0db30e0cc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.564 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.566 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:25.567 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'env', 'PROCESS_TAG=haproxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.694 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398905.6937852, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.695 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Started (Lifecycle Event)#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.756 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.761 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398905.6951864, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.762 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.786 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.790 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:48:25 np0005539504 nova_compute[187152]: 2025-11-29 06:48:25.829 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.165 187156 DEBUG nova.compute.manager [req-ade7ac94-6a2a-4806-8990-191f1d7a1a07 req-afa5e091-6d22-4631-82e4-ce0a4a7db0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.166 187156 DEBUG oslo_concurrency.lockutils [req-ade7ac94-6a2a-4806-8990-191f1d7a1a07 req-afa5e091-6d22-4631-82e4-ce0a4a7db0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.167 187156 DEBUG oslo_concurrency.lockutils [req-ade7ac94-6a2a-4806-8990-191f1d7a1a07 req-afa5e091-6d22-4631-82e4-ce0a4a7db0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.167 187156 DEBUG oslo_concurrency.lockutils [req-ade7ac94-6a2a-4806-8990-191f1d7a1a07 req-afa5e091-6d22-4631-82e4-ce0a4a7db0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.168 187156 DEBUG nova.compute.manager [req-ade7ac94-6a2a-4806-8990-191f1d7a1a07 req-afa5e091-6d22-4631-82e4-ce0a4a7db0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Processing event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.169 187156 DEBUG nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.204 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398906.204392, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.205 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Resumed (Lifecycle Event)
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.207 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.210 187156 INFO nova.virt.libvirt.driver [-] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Instance spawned successfully.
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.210 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 01:48:26 np0005539504 podman[214321]: 2025-11-29 06:48:26.225556844 +0000 UTC m=+0.076579103 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:48:26 np0005539504 podman[214321]: 2025-11-29 06:48:26.477511531 +0000 UTC m=+0.328533780 container create 0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.499 187156 DEBUG nova.compute.manager [req-e2a81134-28e8-47a1-b936-9da69d2bbcb0 req-d1df6582-680a-4459-b90f-4769791c917c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Received event network-vif-plugged-52732bd8-180c-4935-84b3-9f7f3e46c276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.501 187156 DEBUG oslo_concurrency.lockutils [req-e2a81134-28e8-47a1-b936-9da69d2bbcb0 req-d1df6582-680a-4459-b90f-4769791c917c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.502 187156 DEBUG oslo_concurrency.lockutils [req-e2a81134-28e8-47a1-b936-9da69d2bbcb0 req-d1df6582-680a-4459-b90f-4769791c917c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.502 187156 DEBUG oslo_concurrency.lockutils [req-e2a81134-28e8-47a1-b936-9da69d2bbcb0 req-d1df6582-680a-4459-b90f-4769791c917c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.502 187156 DEBUG nova.compute.manager [req-e2a81134-28e8-47a1-b936-9da69d2bbcb0 req-d1df6582-680a-4459-b90f-4769791c917c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Processing event network-vif-plugged-52732bd8-180c-4935-84b3-9f7f3e46c276 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.504 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.505 187156 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.513 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.516 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.517 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.517 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.518 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.518 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.519 187156 DEBUG nova.virt.libvirt.driver [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.525 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:48:26 np0005539504 systemd[1]: Started libpod-conmon-0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08.scope.
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.530 187156 INFO nova.virt.libvirt.driver [-] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Instance spawned successfully.
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.531 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 01:48:26 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:48:26 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9fc508bd8a31fea3c3d1c47ca53c97d987b9e12ad15c5cb27f0b4c6e1e0a5345/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.574 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.575 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398906.5133867, 4a10d3d0-27cb-4116-9924-cf8baaec591d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.575 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] VM Resumed (Lifecycle Event)
Nov 29 01:48:26 np0005539504 podman[214321]: 2025-11-29 06:48:26.685616056 +0000 UTC m=+0.536638315 container init 0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:48:26 np0005539504 podman[214321]: 2025-11-29 06:48:26.702027493 +0000 UTC m=+0.553049732 container start 0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:48:26 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214336]: [NOTICE]   (214341) : New worker (214343) forked
Nov 29 01:48:26 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214336]: [NOTICE]   (214341) : Loading success.
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.797 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.805 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.809 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.810 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.811 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.811 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.812 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.814 187156 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.963 187156 INFO nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Took 10.15 seconds to spawn the instance on the hypervisor.
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.964 187156 DEBUG nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:48:26 np0005539504 nova_compute[187152]: 2025-11-29 06:48:26.966 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:48:27 np0005539504 nova_compute[187152]: 2025-11-29 06:48:27.137 187156 INFO nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Took 26.89 seconds to spawn the instance on the hypervisor.
Nov 29 01:48:27 np0005539504 nova_compute[187152]: 2025-11-29 06:48:27.138 187156 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:48:27 np0005539504 nova_compute[187152]: 2025-11-29 06:48:27.268 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:27 np0005539504 nova_compute[187152]: 2025-11-29 06:48:27.437 187156 INFO nova.compute.manager [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Took 11.16 seconds to build instance.
Nov 29 01:48:27 np0005539504 nova_compute[187152]: 2025-11-29 06:48:27.463 187156 DEBUG oslo_concurrency.lockutils [None req-c4fd6bc5-9571-42e7-b988-b57deb95c2cf a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:48:27 np0005539504 nova_compute[187152]: 2025-11-29 06:48:27.473 187156 INFO nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Took 27.89 seconds to build instance.
Nov 29 01:48:27 np0005539504 nova_compute[187152]: 2025-11-29 06:48:27.494 187156 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 28.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.394 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.438 187156 DEBUG nova.compute.manager [req-04d2e0cf-f94d-4a80-9bfe-e7053a7d00ae req-f5084212-ad5a-4cd4-ac06-eaf900040ae4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.439 187156 DEBUG oslo_concurrency.lockutils [req-04d2e0cf-f94d-4a80-9bfe-e7053a7d00ae req-f5084212-ad5a-4cd4-ac06-eaf900040ae4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.439 187156 DEBUG oslo_concurrency.lockutils [req-04d2e0cf-f94d-4a80-9bfe-e7053a7d00ae req-f5084212-ad5a-4cd4-ac06-eaf900040ae4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.440 187156 DEBUG oslo_concurrency.lockutils [req-04d2e0cf-f94d-4a80-9bfe-e7053a7d00ae req-f5084212-ad5a-4cd4-ac06-eaf900040ae4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.440 187156 DEBUG nova.compute.manager [req-04d2e0cf-f94d-4a80-9bfe-e7053a7d00ae req-f5084212-ad5a-4cd4-ac06-eaf900040ae4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.440 187156 WARNING nova.compute.manager [req-04d2e0cf-f94d-4a80-9bfe-e7053a7d00ae req-f5084212-ad5a-4cd4-ac06-eaf900040ae4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state None.
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.626 187156 DEBUG nova.compute.manager [req-4eb0b927-0e14-461b-8233-ce0f68fbee32 req-574b077b-afbe-4ad6-b868-d60a2e91c343 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Received event network-vif-plugged-52732bd8-180c-4935-84b3-9f7f3e46c276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.627 187156 DEBUG oslo_concurrency.lockutils [req-4eb0b927-0e14-461b-8233-ce0f68fbee32 req-574b077b-afbe-4ad6-b868-d60a2e91c343 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.627 187156 DEBUG oslo_concurrency.lockutils [req-4eb0b927-0e14-461b-8233-ce0f68fbee32 req-574b077b-afbe-4ad6-b868-d60a2e91c343 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.628 187156 DEBUG oslo_concurrency.lockutils [req-4eb0b927-0e14-461b-8233-ce0f68fbee32 req-574b077b-afbe-4ad6-b868-d60a2e91c343 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.628 187156 DEBUG nova.compute.manager [req-4eb0b927-0e14-461b-8233-ce0f68fbee32 req-574b077b-afbe-4ad6-b868-d60a2e91c343 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] No waiting events found dispatching network-vif-plugged-52732bd8-180c-4935-84b3-9f7f3e46c276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 01:48:28 np0005539504 nova_compute[187152]: 2025-11-29 06:48:28.628 187156 WARNING nova.compute.manager [req-4eb0b927-0e14-461b-8233-ce0f68fbee32 req-574b077b-afbe-4ad6-b868-d60a2e91c343 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Received unexpected event network-vif-plugged-52732bd8-180c-4935-84b3-9f7f3e46c276 for instance with vm_state active and task_state None.
Nov 29 01:48:29 np0005539504 podman[214353]: 2025-11-29 06:48:29.725094451 +0000 UTC m=+0.068429241 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:48:31 np0005539504 nova_compute[187152]: 2025-11-29 06:48:31.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:48:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:32.197 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:48:32 np0005539504 nova_compute[187152]: 2025-11-29 06:48:32.197 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:32.199 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 01:48:32 np0005539504 nova_compute[187152]: 2025-11-29 06:48:32.271 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:33.202 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:48:33 np0005539504 nova_compute[187152]: 2025-11-29 06:48:33.396 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:34 np0005539504 nova_compute[187152]: 2025-11-29 06:48:34.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:48:34 np0005539504 nova_compute[187152]: 2025-11-29 06:48:34.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:48:35 np0005539504 nova_compute[187152]: 2025-11-29 06:48:35.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:48:36 np0005539504 nova_compute[187152]: 2025-11-29 06:48:36.634 187156 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Check if temp file /var/lib/nova/instances/tmpkcenheiq exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 29 01:48:36 np0005539504 nova_compute[187152]: 2025-11-29 06:48:36.643 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:48:36 np0005539504 nova_compute[187152]: 2025-11-29 06:48:36.720 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:36 np0005539504 nova_compute[187152]: 2025-11-29 06:48:36.721 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:36 np0005539504 nova_compute[187152]: 2025-11-29 06:48:36.783 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:36 np0005539504 nova_compute[187152]: 2025-11-29 06:48:36.785 187156 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkcenheiq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 29 01:48:36 np0005539504 nova_compute[187152]: 2025-11-29 06:48:36.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:37 np0005539504 nova_compute[187152]: 2025-11-29 06:48:37.274 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:37 np0005539504 podman[214380]: 2025-11-29 06:48:37.763124582 +0000 UTC m=+0.074368462 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:48:37 np0005539504 nova_compute[187152]: 2025-11-29 06:48:37.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.398 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.584 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.585 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.585 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.585 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.763 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.833 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.835 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.898 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.905 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.986 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:38 np0005539504 nova_compute[187152]: 2025-11-29 06:48:38.987 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:39 np0005539504 nova_compute[187152]: 2025-11-29 06:48:39.058 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:39 np0005539504 nova_compute[187152]: 2025-11-29 06:48:39.231 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:48:39 np0005539504 nova_compute[187152]: 2025-11-29 06:48:39.234 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5449MB free_disk=73.34215927124023GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:48:39 np0005539504 nova_compute[187152]: 2025-11-29 06:48:39.234 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:39 np0005539504 nova_compute[187152]: 2025-11-29 06:48:39.234 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:39 np0005539504 nova_compute[187152]: 2025-11-29 06:48:39.629 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 4a10d3d0-27cb-4116-9924-cf8baaec591d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:48:39 np0005539504 nova_compute[187152]: 2025-11-29 06:48:39.845 187156 INFO nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance e8267ed6-ce75-49c9-85a6-d08b827f6aea has allocations against this compute host but is not found in the database.#033[00m
Nov 29 01:48:39 np0005539504 nova_compute[187152]: 2025-11-29 06:48:39.846 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:48:39 np0005539504 nova_compute[187152]: 2025-11-29 06:48:39.847 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:48:40 np0005539504 nova_compute[187152]: 2025-11-29 06:48:40.277 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:48:40 np0005539504 nova_compute[187152]: 2025-11-29 06:48:40.402 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:48:40 np0005539504 nova_compute[187152]: 2025-11-29 06:48:40.616 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:40 np0005539504 nova_compute[187152]: 2025-11-29 06:48:40.637 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:48:40 np0005539504 nova_compute[187152]: 2025-11-29 06:48:40.638 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.404s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:41 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:41Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:86:09:58 10.100.0.4
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.245 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.630s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:41 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:41Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:09:58 10.100.0.4
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.248 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:41 np0005539504 podman[214438]: 2025-11-29 06:48:41.267916573 +0000 UTC m=+0.502773420 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.326 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.328 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.328 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:48:41 np0005539504 podman[214437]: 2025-11-29 06:48:41.339738737 +0000 UTC m=+0.094644432 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.367 187156 INFO nova.compute.rpcapi [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.368 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.638 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.640 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:48:41 np0005539504 nova_compute[187152]: 2025-11-29 06:48:41.640 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:48:42 np0005539504 nova_compute[187152]: 2025-11-29 06:48:42.127 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-4a10d3d0-27cb-4116-9924-cf8baaec591d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:48:42 np0005539504 nova_compute[187152]: 2025-11-29 06:48:42.127 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-4a10d3d0-27cb-4116-9924-cf8baaec591d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:48:42 np0005539504 nova_compute[187152]: 2025-11-29 06:48:42.128 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:48:42 np0005539504 nova_compute[187152]: 2025-11-29 06:48:42.128 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4a10d3d0-27cb-4116-9924-cf8baaec591d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:42 np0005539504 nova_compute[187152]: 2025-11-29 06:48:42.277 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:42 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:42Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:e1:81 10.1.0.7
Nov 29 01:48:42 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:42Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:e1:81 10.1.0.7
Nov 29 01:48:43 np0005539504 nova_compute[187152]: 2025-11-29 06:48:43.426 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:47 np0005539504 nova_compute[187152]: 2025-11-29 06:48:47.279 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:48.306 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}42b7a977f68c13f5f34ed9a4a6321013432cd7e1ab36f19fb3a3541c74bf8d1b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 01:48:48 np0005539504 nova_compute[187152]: 2025-11-29 06:48:48.429 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.185 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 29 Nov 2025 06:48:48 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-53ce782f-a958-4cce-835a-6a2fb8d94d13 x-openstack-request-id: req-53ce782f-a958-4cce-835a-6a2fb8d94d13 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.185 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}, {"id": "e29df891-dca5-4a1c-9258-dc512a46956f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.185 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-53ce782f-a958-4cce-835a-6a2fb8d94d13 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.188 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}42b7a977f68c13f5f34ed9a4a6321013432cd7e1ab36f19fb3a3541c74bf8d1b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 01:48:51 np0005539504 systemd-logind[783]: New session 27 of user nova.
Nov 29 01:48:51 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:48:51 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:48:51 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.327 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Sat, 29 Nov 2025 06:48:51 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-87701d1b-3374-4d11-a590-a715177b3cfe x-openstack-request-id: req-87701d1b-3374-4d11-a590-a715177b3cfe _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.327 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.328 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 used request id req-87701d1b-3374-4d11-a590-a715177b3cfe request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.334 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'name': 'tempest-LiveMigrationTest-server-1791593514', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'hostId': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.337 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'name': 'tempest-tempest.common.compute-instance-526752650-2', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000005', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'hostId': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.338 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 01:48:51 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.366 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.write.requests volume: 304 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.368 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 podman[214491]: 2025-11-29 06:48:51.38992423 +0000 UTC m=+0.120993994 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.397 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.write.requests volume: 298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.397 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da17cf02-a531-47b7-9ba7-d182ddfa7817', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 304, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-vda', 'timestamp': '2025-11-29T06:48:51.338788', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76f7aec0-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': '0541fa31b0d913264f19c42de77144fea21e90c0fa1e2b140559aeb89f19a435'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': 
None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-sda', 'timestamp': '2025-11-29T06:48:51.338788', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76f7cdd8-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': '28d59c3698b5d23c182176cfbe37e074f55a1830a0516e11e9b80077de70fdc2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 298, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-vda', 'timestamp': '2025-11-29T06:48:51.338788', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76fc3d5a-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': 'eab607dad8dc189972064a9a027d21dc7df199e225ef7e09ec18ca9eefd4ef44'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-sda', 'timestamp': '2025-11-29T06:48:51.338788', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76fc489a-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': '0f411e01cfe6102bc4ae00de6175d64705291a53284457983d3b58ea2e4664d3'}]}, 'timestamp': '2025-11-29 06:48:51.398216', '_unique_id': 'c02fb41742f84a9f92d137f3f72cde73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.407 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.411 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.411 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1791593514>, <NovaLikeServer: tempest-tempest.common.compute-instance-526752650-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1791593514>, <NovaLikeServer: tempest-tempest.common.compute-instance-526752650-2>]
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 01:48:51 np0005539504 podman[214493]: 2025-11-29 06:48:51.416760076 +0000 UTC m=+0.140732718 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.417 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for af865d23-0f24-47aa-aeab-1c12d04b5a1e / tap60d45f94-ad inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.417 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.419 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4a10d3d0-27cb-4116-9924-cf8baaec591d / tap52732bd8-18 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.419 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ec3223b-307d-46ed-b92a-431a41adacb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.412195', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '76ff4f36-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': '9830f6150c4575a6ceb6432669b02c9f6105a6b7d204528fa25a95f0ace76da0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.412195', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '76ff9a90-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': '7bb9738677f23845ffb47427a9809f052481ca972ffd725829bffe1efef7c534'}]}, 'timestamp': '2025-11-29 06:48:51.419953', '_unique_id': '129f5049a51c4f0896109e7348b634ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.420 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.421 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.421 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.421 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1791593514>, <NovaLikeServer: tempest-tempest.common.compute-instance-526752650-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1791593514>, <NovaLikeServer: tempest-tempest.common.compute-instance-526752650-2>]
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.421 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.441 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/cpu volume: 12760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.457 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/cpu volume: 13360000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b1c46b8-654a-49fb-a6c7-eab0f969a466', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12760000000, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'timestamp': '2025-11-29T06:48:51.422026', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7702ffaa-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.375850207, 'message_signature': 'b3525c200241a3cf826d0e1c5cbb1d601f7a7865ab90b9224d7947e617142b36'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13360000000, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 
'4a10d3d0-27cb-4116-9924-cf8baaec591d', 'timestamp': '2025-11-29T06:48:51.422026', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7705782a-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.392472607, 'message_signature': 'fed4e90d08e284cc9e438cc28e2eec2abfc7f45024119f5a355e40c9c49cf569'}]}, 'timestamp': '2025-11-29 06:48:51.458496', '_unique_id': '99e5b134a18c43008b6ed19a020cd8bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.459 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.460 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.460 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.460 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1791593514>, <NovaLikeServer: tempest-tempest.common.compute-instance-526752650-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1791593514>, <NovaLikeServer: tempest-tempest.common.compute-instance-526752650-2>]
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.461 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.471 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.471 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.480 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.481 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de430134-a114-434f-a529-905ab25f4f34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-vda', 'timestamp': '2025-11-29T06:48:51.461267', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77077cec-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.396103776, 'message_signature': 'c8f383497437aef6c5fc55ca4b5bad670773cc319b27169d3ce4f4e217a38988'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 
'af865d23-0f24-47aa-aeab-1c12d04b5a1e-sda', 'timestamp': '2025-11-29T06:48:51.461267', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7707869c-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.396103776, 'message_signature': '282593da066cf7bb7d2bba8e92e1704ea4f6392fe0967d835b845ce070f2760e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-vda', 'timestamp': '2025-11-29T06:48:51.461267', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7708f036-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.406484306, 'message_signature': '5be70b77e38d1b813641ea29e6b15039eda69620acc0bbf249c693a28693cd2b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-sda', 'timestamp': '2025-11-29T06:48:51.461267', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7708f874-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.406484306, 'message_signature': '43afffb7043dc29f49b94593ea46f2248d020c9279825251882a43cfb73595e4'}]}, 'timestamp': '2025-11-29 06:48:51.481303', '_unique_id': '7017aefc45c444ccbefd5d436821837b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.482 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84a9e295-c882-4363-ba5d-c6ca7abb8ee4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.482738', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '7709396a-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': '07b02b686ebf1217591dc3bff7845494f06bb9c43abab156317e5aa12723f2f6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 
'7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.482738', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '77094216-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': '1e6c091e3bed64d7fdfa2d93e31ff8cfdf66b6d30ddad80da9123386e302e7a0'}]}, 'timestamp': '2025-11-29 06:48:51.483190', '_unique_id': '9a76f181a3924babb79462f6f3c7e834'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.483 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.484 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.484 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.484 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11181657-f404-4526-b7d1-a779028aae4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.484320', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '77097790-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': 'e6a430ba36811c077c32ec098365ec88695ebb9c47e03f41c036c23f7f6b5acc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.484320', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '77098000-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': '4ccaecda41f6fb59c4df97eab3b7cc5fb7ee0e6f410d052928381bf8c11d4f8e'}]}, 'timestamp': '2025-11-29 06:48:51.484772', '_unique_id': '59c3af02363548a0b1056eec31e16725'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.485 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 systemd[214514]: Queued start job for default target Main User Target.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d15e896-3b4a-49bc-b2af-9d9c6e251e86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.485854', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '7709b386-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': '054b166b130a791fc4007033722dcab0630d86db27d0d975a31d6748237700b3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 
'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.485854', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '7709bbd8-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': '203bec5baa42a05519c9873efa9fd43814ed201b54b3172f171591099b1a2e84'}]}, 'timestamp': '2025-11-29 06:48:51.486305', '_unique_id': '48baa5fea07a41f0beee25af53f56fe7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.486 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.487 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.487 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.487 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c64853fe-7ae8-4c34-90b7-9c50618d0f26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.487426', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '7709f044-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': 'f8a6353fcecb2a347f5b7e525b30f157fbbfcd1647a8611f4ba8f966962ba229'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': 
'7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.487426', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '7709f882-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': 'f734c8a5eefe6ac5c7a1854dcb26eff7e57a38b6696a741803aa5b16d206f385'}]}, 'timestamp': '2025-11-29 06:48:51.487858', '_unique_id': '8bedc78f40e741108deec26134a321ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.488 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.489 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.489 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.read.requests volume: 1084 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.489 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c97b7782-61d5-4e96-88be-eeec0abc22de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-vda', 'timestamp': '2025-11-29T06:48:51.488950', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770a2b90-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': '1ee3aab1a89a4ca4f8cb545b872815b12725e5e658bbf7736a8d279549e3a16e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': 
None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-sda', 'timestamp': '2025-11-29T06:48:51.488950', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770a33a6-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': '9013d6611711c5311315cd05a10d08eb59d2e4f3666068e607fd44d753e25488'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1084, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-vda', 'timestamp': '2025-11-29T06:48:51.488950', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770a3be4-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': 'afa25cc35c6ff2f9848a50dcddc0669dee60bb4b5a9abb7c52fae7dba523fd68'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-sda', 'timestamp': '2025-11-29T06:48:51.488950', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770a433c-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': 'd74c0d215734b81acb4c1de05cbaa5bd3ab560f282bf080bc0ffaafb05c3dd10'}]}, 'timestamp': '2025-11-29 06:48:51.489761', '_unique_id': '442b428675e2404d83d000622b4d67b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.490 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.491 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.491 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.read.bytes volume: 30194176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.491 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42c2371f-436d-4c71-a4a4-52065b83158b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31005184, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-vda', 'timestamp': '2025-11-29T06:48:51.490968', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770a7b22-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': '0f6a80de413c5ce772e6a4fa5cc4705c61ed39935a6fa1e974a67ec05da09d56'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 
'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-sda', 'timestamp': '2025-11-29T06:48:51.490968', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770a832e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': 'd67c4d6106a171f91560d217ef68a08aab730a5424b8397c7aaeb331b746f4ce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30194176, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-vda', 'timestamp': '2025-11-29T06:48:51.490968', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770a8c48-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': '5ac762a4757e0945f8c2bacc9b406d9a8674ab77132e9c2f2e68accb258d8b84'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-sda', 'timestamp': '2025-11-29T06:48:51.490968', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770a93b4-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': '8e6d62ada33e4192b3e053bd76a9ac37c8e1bb3ea18269873e0fe77be15e1b12'}]}, 'timestamp': '2025-11-29 06:48:51.491823', '_unique_id': 'e978c242bf3541ed9db794f616df4ed9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.492 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.493 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.493 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.493 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.493 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '547a461f-8bbc-478c-8d9b-728c37cf990c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-vda', 'timestamp': '2025-11-29T06:48:51.493001', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770ac9e2-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.396103776, 'message_signature': '67920c7df795efbe77f69b5f4f9a990f4f875822b19ff01530cf8b8be248da84'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 
'af865d23-0f24-47aa-aeab-1c12d04b5a1e-sda', 'timestamp': '2025-11-29T06:48:51.493001', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770ad1ee-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.396103776, 'message_signature': '3c5ee35edb35a94c62f22f675a08c8aa811290945fabbc432e8e6bad546ee226'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-vda', 'timestamp': '2025-11-29T06:48:51.493001', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770adaea-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.406484306, 'message_signature': '36616e9c9ca9173043abbd78e3731eceb1557f075078572cdebbe297d108f0a6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-sda', 'timestamp': '2025-11-29T06:48:51.493001', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770ae26a-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.406484306, 'message_signature': '11cd5cd3815522c6c77f3e9e3de22bbe4d4c60af8a5865cdc01818b7d54adc00'}]}, 'timestamp': '2025-11-29 06:48:51.493838', '_unique_id': '3f12e499b5cb431d96f9e0b1dcb2347c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 systemd[214514]: Created slice User Application Slice.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2246b22-3ed3-4267-9f40-a7dab2df5065', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.494980', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '770b174e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': 'ad8d694b966319cf21a1c64185a1468ba25e9a84c0b0f07e70ff84dba4d87a37'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.494980', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '770b1fa0-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': 'fd83d69c9aa7b519330f49176e60fea7047c41ac14cd70f21e248fd129a0ebbc'}]}, 'timestamp': '2025-11-29 06:48:51.495451', '_unique_id': 'f1f14a263f7642879ba72ed20c946597'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 systemd[214514]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 systemd[214514]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 systemd[214514]: Reached target Paths.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 systemd[214514]: Reached target Timers.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.495 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.496 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.496 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.496 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1791593514>, <NovaLikeServer: tempest-tempest.common.compute-instance-526752650-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1791593514>, <NovaLikeServer: tempest-tempest.common.compute-instance-526752650-2>]
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.497 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.497 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.497 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '519fb634-3bb5-4391-9fab-3f85b0914893', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.497103', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '770b6a1e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': '7660281b5dd9d29fc797b623ef48decf744ab86f4264aeda0a7f7df8aee1853b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.497103', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '770b72de-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': 'd5672544dc00a37b37a2a7e0947c89c7aa36565ab9ab102952741d39dc72c3a0'}]}, 'timestamp': '2025-11-29 06:48:51.497573', '_unique_id': 'bb22621b24554d61b3cf2c0d5dbcf9e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/memory.usage volume: 46.73046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.498 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/memory.usage volume: 42.34765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 systemd[214514]: Starting D-Bus User Message Bus Socket...
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '193e841f-e1f6-4fa8-b6dc-4cd424711a16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.73046875, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'timestamp': '2025-11-29T06:48:51.498735', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '770baa4c-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.375850207, 'message_signature': '0920750d112f8625e39163ee09bf3327e940b56e8b240f304abff936b2c8b526'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.34765625, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 
'4a10d3d0-27cb-4116-9924-cf8baaec591d', 'timestamp': '2025-11-29T06:48:51.498735', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '770bb24e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.392472607, 'message_signature': '38ffe6f76f215f7eb47a23c6a1ce2025a7e785aa711e002b908497482c2fd7db'}]}, 'timestamp': '2025-11-29 06:48:51.499163', '_unique_id': 'ed3e53919d68449e9c24ec7f7a7c6db7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.499 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.500 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.500 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.500 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 systemd[214514]: Starting Create User's Volatile Files and Directories...
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f77ff167-107f-4bd8-9e7c-ddf68addf8fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.500362', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '770beaca-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': '46178daf32596a9f30383014fc58116238bf516f144e91a42944bcbf130b3eb1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 
'7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.500362', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '770bf330-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': 'ed19ab5b106c52394b68689e1978022ce10e8632cf6b532993a477997a91ba47'}]}, 'timestamp': '2025-11-29 06:48:51.500830', '_unique_id': '683fe48e40804a57a79e619b5c51114f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.501 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c2602ce-05d2-4b85-900c-2e895a2e2d2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.501945', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '770c2738-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': 'becb78186c61c559dfbfe270398c8f3c9ad18bedf8268b9fc7b1931aa261b671'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.501945', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '770c2f4e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': '2f0ca5fb2dba4d3188dbb79970529015298a336101cd59cd22e851d7ee5cec75'}]}, 'timestamp': '2025-11-29 06:48:51.502401', '_unique_id': 'c0a1cb7b09db4157ab32fe19f4bb3c90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.502 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.503 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.503 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.write.latency volume: 15617486755 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.503 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.503 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.write.latency volume: 5397917221 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8672454a-5c6f-48e7-be59-9e344ab8c2f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15617486755, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-vda', 'timestamp': '2025-11-29T06:48:51.503512', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770c646e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': '7c7cacc629464b4fd5760d580aff9704a09a5685afb5f5d0adbf5a600dc639d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 
'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-sda', 'timestamp': '2025-11-29T06:48:51.503512', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770c6c8e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': 'c6d59870728fcf6b3b90d7e20c571533602b449cc256cfde5efbdd46dc328472'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5397917221, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-vda', 'timestamp': '2025-11-29T06:48:51.503512', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770c7454-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': 'c1a61dfcb66170d0493737c0b0ab524a992dacb154d11c21a39cd4bac5f2a47f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-sda', 'timestamp': '2025-11-29T06:48:51.503512', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770c7c74-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': '2c4697a12a656f2b34cdd5ee0fb6ec2838139b768d0f03e868d296ea2a97425d'}]}, 'timestamp': '2025-11-29 06:48:51.504348', '_unique_id': '58bdebe95d194d55b2c72d7abcf35b91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.504 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.505 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.505 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.write.bytes volume: 72822784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.505 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.505 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.write.bytes volume: 72835072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40f24ad2-5311-4de8-8666-4da8d95ae54d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72822784, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-vda', 'timestamp': '2025-11-29T06:48:51.505537', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770cb46e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': 'f14793a956ab017a093182b86ea8022918312f649b9b7a514cc84c817148f74d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 
'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-sda', 'timestamp': '2025-11-29T06:48:51.505537', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770cbc34-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': 'f0bece0160ab78479a81e98a7eedece441d7624aef58282a7675902b08242ab3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72835072, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-vda', 'timestamp': '2025-11-29T06:48:51.505537', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770cc3dc-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': 'a232a497ecf6dc78ba5a197aee3b318c23d1bf3896a849c4a2aa87744ed961ae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-sda', 'timestamp': '2025-11-29T06:48:51.505537', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770ccb66-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': '840de96f678c495e78f4db4911de4cd14d6a026dab87ee52e388eb8674ee0435'}]}, 'timestamp': '2025-11-29 06:48:51.506372', '_unique_id': '1777b8e137a04335b5ad49d608d76ba2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.506 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.507 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.507 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.507 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e204e85-b50e-4e99-882b-86d1ed888fef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'instance-00000007-af865d23-0f24-47aa-aeab-1c12d04b5a1e-tap60d45f94-ad', 'timestamp': '2025-11-29T06:48:51.507601', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'tap60d45f94-ad', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:09:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60d45f94-ad'}, 'message_id': '770d046e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.347058899, 'message_signature': 'e58af72703c4940356580ff97ebb9b10b11ea0964c16bea4e71c68d97f0e5264'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'instance-00000005-4a10d3d0-27cb-4116-9924-cf8baaec591d-tap52732bd8-18', 'timestamp': '2025-11-29T06:48:51.507601', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'tap52732bd8-18', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c2:e1:81', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap52732bd8-18'}, 'message_id': '770d0cd4-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.352803014, 'message_signature': '85aef777b0e0d19d4fe74f127a288c780dd0bab724d36b98743bde3359af1b23'}]}, 'timestamp': '2025-11-29 06:48:51.508040', '_unique_id': '294ba805e10a41fc8cb0089ab46af5da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.508 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.509 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.509 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.read.latency volume: 283474340 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.509 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.read.latency volume: 559226180 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.509 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.read.latency volume: 413585084 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.509 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.read.latency volume: 652673818 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5816e8a5-42ee-4c34-8c8b-8e73e703fe2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 283474340, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-vda', 'timestamp': '2025-11-29T06:48:51.509214', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770d44b0-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': 'e3287cb771a100567d86538670c47de570ed060d2d85f69ced645e8004c8ed67'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 559226180, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': 
None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-sda', 'timestamp': '2025-11-29T06:48:51.509214', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770d4e60-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.273669183, 'message_signature': '7b1c7f6b94a69bc9e592e23664407a655f7f7df9ba6f9b097e8301cb74574dab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 413585084, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-vda', 'timestamp': '2025-11-29T06:48:51.509214', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770d57fc-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': 'ebb732b18e86e70a754b592600d57622712c2b7a27df005c3946a2e326a5f6ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 652673818, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-sda', 'timestamp': '2025-11-29T06:48:51.509214', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770d5fae-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.303567393, 'message_signature': '5a56d459df51d598ac5b250dbc6885cb6820414a033b80945434da6df09f59ee'}]}, 'timestamp': '2025-11-29 06:48:51.510152', '_unique_id': '9c5e62489317426990ce10652654c935'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.510 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.511 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.511 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 systemd[214514]: Finished Create User's Volatile Files and Directories.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.511 12 DEBUG ceilometer.compute.pollsters [-] af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.511 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 systemd[214514]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 DEBUG ceilometer.compute.pollsters [-] 4a10d3d0-27cb-4116-9924-cf8baaec591d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:48:51 np0005539504 systemd[214514]: Reached target Sockets.
Nov 29 01:48:51 np0005539504 systemd[214514]: Reached target Basic System.
Nov 29 01:48:51 np0005539504 systemd[214514]: Reached target Main User Target.
Nov 29 01:48:51 np0005539504 systemd[214514]: Startup finished in 143ms.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da280b17-22f1-4276-9856-ee555702a72c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e-vda', 'timestamp': '2025-11-29T06:48:51.511476', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770d9c4e-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.396103776, 'message_signature': '0081ce6d3dedb0ef94296a97bfaa5745bd92375e0a84bfc6ef205678d2cc3533'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_name': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_name': None, 'resource_id': 
'af865d23-0f24-47aa-aeab-1c12d04b5a1e-sda', 'timestamp': '2025-11-29T06:48:51.511476', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1791593514', 'name': 'instance-00000007', 'instance_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'instance_type': 'm1.nano', 'host': '661186105a89c142197a60b85c76337e2ec5a431cf9734a050db3abc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770da63a-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.396103776, 'message_signature': 'a5a1587d48e9e32f69f1309529ecca2bfd68c27e093ed556e1cb0a641756900c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-vda', 'timestamp': '2025-11-29T06:48:51.511476', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '770dae00-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.406484306, 'message_signature': '0e00e3d2c1ece7b222687b2595406642a987ac350af8b1ea5182143918998563'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d-sda', 'timestamp': '2025-11-29T06:48:51.511476', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-526752650-2', 'name': 'instance-00000005', 'instance_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'instance_type': 'm1.nano', 'host': '5685cbfce32a291b260ba7bf1e0448a1844f352f4a0c23a45453f8d3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770db648-ccef-11f0-8a11-fa163ea726b4', 'monotonic_time': 4394.406484306, 'message_signature': 'e7c04058075b081b2bc40f572a2fe94bdb359bac1de326a14c7da1bf07768327'}]}, 'timestamp': '2025-11-29 06:48:51.512416', '_unique_id': '8a7ade37111948719bd3deb064a72163'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:48:51 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:48:51 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:48:51.512 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:48:51 np0005539504 systemd[1]: Started Session 27 of User nova.
Nov 29 01:48:51 np0005539504 systemd[1]: session-27.scope: Deactivated successfully.
Nov 29 01:48:51 np0005539504 systemd-logind[783]: Session 27 logged out. Waiting for processes to exit.
Nov 29 01:48:51 np0005539504 systemd-logind[783]: Removed session 27.
Nov 29 01:48:52 np0005539504 nova_compute[187152]: 2025-11-29 06:48:52.282 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.307 187156 DEBUG nova.compute.manager [req-e06e1891-6bfb-4c56-9a5d-e3d6ca676161 req-d2ced48c-60a3-4e80-a275-b97e0bfebf1b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.308 187156 DEBUG oslo_concurrency.lockutils [req-e06e1891-6bfb-4c56-9a5d-e3d6ca676161 req-d2ced48c-60a3-4e80-a275-b97e0bfebf1b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.308 187156 DEBUG oslo_concurrency.lockutils [req-e06e1891-6bfb-4c56-9a5d-e3d6ca676161 req-d2ced48c-60a3-4e80-a275-b97e0bfebf1b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.309 187156 DEBUG oslo_concurrency.lockutils [req-e06e1891-6bfb-4c56-9a5d-e3d6ca676161 req-d2ced48c-60a3-4e80-a275-b97e0bfebf1b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.309 187156 DEBUG nova.compute.manager [req-e06e1891-6bfb-4c56-9a5d-e3d6ca676161 req-d2ced48c-60a3-4e80-a275-b97e0bfebf1b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.309 187156 DEBUG nova.compute.manager [req-e06e1891-6bfb-4c56-9a5d-e3d6ca676161 req-d2ced48c-60a3-4e80-a275-b97e0bfebf1b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.470 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.581 187156 INFO nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Took 12.25 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.581 187156 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.633 187156 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkcenheiq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(e8267ed6-ce75-49c9-85a6-d08b827f6aea),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.671 187156 DEBUG nova.objects.instance [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lazy-loading 'migration_context' on Instance uuid af865d23-0f24-47aa-aeab-1c12d04b5a1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.673 187156 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.676 187156 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.676 187156 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.731 187156 DEBUG nova.virt.libvirt.vif [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:48:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1791593514',display_name='tempest-LiveMigrationTest-server-1791593514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1791593514',id=7,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:48:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-4vq3oq0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:48:27Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.732 187156 DEBUG nova.network.os_vif_util [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converting VIF {"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.735 187156 DEBUG nova.network.os_vif_util [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.737 187156 DEBUG nova.virt.libvirt.migration [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating guest XML with vif config: <interface type="ethernet">
Nov 29 01:48:53 np0005539504 nova_compute[187152]:  <mac address="fa:16:3e:86:09:58"/>
Nov 29 01:48:53 np0005539504 nova_compute[187152]:  <model type="virtio"/>
Nov 29 01:48:53 np0005539504 nova_compute[187152]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:48:53 np0005539504 nova_compute[187152]:  <mtu size="1442"/>
Nov 29 01:48:53 np0005539504 nova_compute[187152]:  <target dev="tap60d45f94-ad"/>
Nov 29 01:48:53 np0005539504 nova_compute[187152]: </interface>
Nov 29 01:48:53 np0005539504 nova_compute[187152]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 29 01:48:53 np0005539504 nova_compute[187152]: 2025-11-29 06:48:53.739 187156 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.179 187156 DEBUG nova.virt.libvirt.migration [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.180 187156 INFO nova.virt.libvirt.migration [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.313 187156 INFO nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.372 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Updating instance_info_cache with network_info: [{"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.423 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-4a10d3d0-27cb-4116-9924-cf8baaec591d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.423 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.424 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.424 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.425 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.827 187156 DEBUG nova.virt.libvirt.migration [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:48:54 np0005539504 nova_compute[187152]: 2025-11-29 06:48:54.827 187156 DEBUG nova.virt.libvirt.migration [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:48:54 np0005539504 podman[214560]: 2025-11-29 06:48:54.915580943 +0000 UTC m=+0.073747927 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:48:55 np0005539504 nova_compute[187152]: 2025-11-29 06:48:55.331 187156 DEBUG nova.virt.libvirt.migration [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:48:55 np0005539504 nova_compute[187152]: 2025-11-29 06:48:55.333 187156 DEBUG nova.virt.libvirt.migration [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:48:55 np0005539504 nova_compute[187152]: 2025-11-29 06:48:55.490 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398935.489776, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:55 np0005539504 nova_compute[187152]: 2025-11-29 06:48:55.491 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:48:55 np0005539504 nova_compute[187152]: 2025-11-29 06:48:55.837 187156 DEBUG nova.virt.libvirt.migration [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:48:55 np0005539504 nova_compute[187152]: 2025-11-29 06:48:55.838 187156 DEBUG nova.virt.libvirt.migration [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:48:55 np0005539504 kernel: tap60d45f94-ad (unregistering): left promiscuous mode
Nov 29 01:48:55 np0005539504 NetworkManager[55210]: <info>  [1764398935.9201] device (tap60d45f94-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:48:55 np0005539504 nova_compute[187152]: 2025-11-29 06:48:55.936 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:55 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:55Z|00037|binding|INFO|Releasing lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c from this chassis (sb_readonly=0)
Nov 29 01:48:55 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:55Z|00038|binding|INFO|Setting lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c down in Southbound
Nov 29 01:48:55 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:55Z|00039|binding|INFO|Removing iface tap60d45f94-ad ovn-installed in OVS
Nov 29 01:48:55 np0005539504 nova_compute[187152]: 2025-11-29 06:48:55.944 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:55 np0005539504 nova_compute[187152]: 2025-11-29 06:48:55.971 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:56 np0005539504 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 29 01:48:56 np0005539504 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 15.723s CPU time.
Nov 29 01:48:56 np0005539504 systemd-machined[153423]: Machine qemu-4-instance-00000007 terminated.
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.116 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.121 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.157 187156 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.158 187156 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.158 187156 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.341 187156 DEBUG nova.virt.libvirt.guest [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'af865d23-0f24-47aa-aeab-1c12d04b5a1e' (instance-00000007) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.343 187156 INFO nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration operation has completed#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.343 187156 INFO nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] _post_live_migration() is started..#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.605 187156 DEBUG nova.compute.manager [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.606 187156 DEBUG oslo_concurrency.lockutils [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.606 187156 DEBUG oslo_concurrency.lockutils [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.607 187156 DEBUG oslo_concurrency.lockutils [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.608 187156 DEBUG nova.compute.manager [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.608 187156 WARNING nova.compute.manager [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.608 187156 DEBUG nova.compute.manager [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-changed-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.608 187156 DEBUG nova.compute.manager [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Refreshing instance network info cache due to event network-changed-60d45f94-ad4f-48ba-a0a9-6b5406aa616c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.609 187156 DEBUG oslo_concurrency.lockutils [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.609 187156 DEBUG oslo_concurrency.lockutils [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.609 187156 DEBUG nova.network.neutron [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Refreshing network info cache for port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.622 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:56 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:56Z|00040|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 01:48:56 np0005539504 ovn_controller[95182]: 2025-11-29T06:48:56Z|00041|binding|INFO|Releasing lport c143daec-964e-4591-a13b-43e2014d70b5 from this chassis (sb_readonly=0)
Nov 29 01:48:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:56.657 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:09:58 10.100.0.4'], port_security=['fa:16:3e:86:09:58 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=60d45f94-ad4f-48ba-a0a9-6b5406aa616c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:48:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:56.659 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 unbound from our chassis#033[00m
Nov 29 01:48:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:56.660 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:48:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:56.663 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[97a965c2-4d3b-46d8-8c9e-2566fe7c1746]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:56.664 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace which is not needed anymore#033[00m
Nov 29 01:48:56 np0005539504 nova_compute[187152]: 2025-11-29 06:48:56.706 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:56 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214336]: [NOTICE]   (214341) : haproxy version is 2.8.14-c23fe91
Nov 29 01:48:56 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214336]: [NOTICE]   (214341) : path to executable is /usr/sbin/haproxy
Nov 29 01:48:56 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214336]: [WARNING]  (214341) : Exiting Master process...
Nov 29 01:48:56 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214336]: [ALERT]    (214341) : Current worker (214343) exited with code 143 (Terminated)
Nov 29 01:48:56 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214336]: [WARNING]  (214341) : All workers exited. Exiting... (0)
Nov 29 01:48:56 np0005539504 systemd[1]: libpod-0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08.scope: Deactivated successfully.
Nov 29 01:48:56 np0005539504 podman[214623]: 2025-11-29 06:48:56.900338177 +0000 UTC m=+0.058343460 container died 0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:48:56 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08-userdata-shm.mount: Deactivated successfully.
Nov 29 01:48:56 np0005539504 systemd[1]: var-lib-containers-storage-overlay-9fc508bd8a31fea3c3d1c47ca53c97d987b9e12ad15c5cb27f0b4c6e1e0a5345-merged.mount: Deactivated successfully.
Nov 29 01:48:56 np0005539504 podman[214623]: 2025-11-29 06:48:56.987640498 +0000 UTC m=+0.145645781 container cleanup 0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:48:56 np0005539504 systemd[1]: libpod-conmon-0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08.scope: Deactivated successfully.
Nov 29 01:48:57 np0005539504 podman[214654]: 2025-11-29 06:48:57.16520216 +0000 UTC m=+0.149445492 container remove 0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:48:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:57.175 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4d092f-d6f7-4570-9798-8a6280ce2058]: (4, ('Sat Nov 29 06:48:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08)\n0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08\nSat Nov 29 06:48:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08)\n0dd9e5fd65d9314d01cb106e7f6df790c241cef516c91d46c246244441ae1f08\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:57.177 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[781cb902-a70e-4bb9-bfb5-8eb6e4308f80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:57.178 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:57 np0005539504 nova_compute[187152]: 2025-11-29 06:48:57.181 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:57 np0005539504 kernel: tap24ee44f0-20: left promiscuous mode
Nov 29 01:48:57 np0005539504 nova_compute[187152]: 2025-11-29 06:48:57.212 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:57.220 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e789b31a-094b-4bc6-b026-2de75fc09c77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:57.241 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2782dbee-3d68-4cec-81d7-f2f6c4f31217]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:57.243 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1cffd670-8325-4095-9ca9-e7cdbd796f7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:57.263 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[82db3ede-e226-4dd9-a147-9398ef8393fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436812, 'reachable_time': 34395, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214674, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:57 np0005539504 systemd[1]: run-netns-ovnmeta\x2d24ee44f0\x2d2b10\x2d459c\x2daabf\x2dbf9ef2c8d950.mount: Deactivated successfully.
Nov 29 01:48:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:57.278 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:48:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:48:57.280 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[84c113fc-0aa3-4308-89bb-3a04b32eabc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:57 np0005539504 nova_compute[187152]: 2025-11-29 06:48:57.286 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:58 np0005539504 nova_compute[187152]: 2025-11-29 06:48:58.472 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:00 np0005539504 podman[214676]: 2025-11-29 06:49:00.73507142 +0000 UTC m=+0.074727912 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.264 187156 DEBUG nova.compute.manager [req-c591bf5b-1135-4e91-a982-ab010a1d48f8 req-0b704970-a3b1-447b-890c-8b6d50ad763d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.264 187156 DEBUG oslo_concurrency.lockutils [req-c591bf5b-1135-4e91-a982-ab010a1d48f8 req-0b704970-a3b1-447b-890c-8b6d50ad763d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.264 187156 DEBUG oslo_concurrency.lockutils [req-c591bf5b-1135-4e91-a982-ab010a1d48f8 req-0b704970-a3b1-447b-890c-8b6d50ad763d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.265 187156 DEBUG oslo_concurrency.lockutils [req-c591bf5b-1135-4e91-a982-ab010a1d48f8 req-0b704970-a3b1-447b-890c-8b6d50ad763d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.265 187156 DEBUG nova.compute.manager [req-c591bf5b-1135-4e91-a982-ab010a1d48f8 req-0b704970-a3b1-447b-890c-8b6d50ad763d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.265 187156 DEBUG nova.compute.manager [req-c591bf5b-1135-4e91-a982-ab010a1d48f8 req-0b704970-a3b1-447b-890c-8b6d50ad763d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:49:01 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:49:01 np0005539504 systemd[214514]: Activating special unit Exit the Session...
Nov 29 01:49:01 np0005539504 systemd[214514]: Stopped target Main User Target.
Nov 29 01:49:01 np0005539504 systemd[214514]: Stopped target Basic System.
Nov 29 01:49:01 np0005539504 systemd[214514]: Stopped target Paths.
Nov 29 01:49:01 np0005539504 systemd[214514]: Stopped target Sockets.
Nov 29 01:49:01 np0005539504 systemd[214514]: Stopped target Timers.
Nov 29 01:49:01 np0005539504 systemd[214514]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:49:01 np0005539504 systemd[214514]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:49:01 np0005539504 systemd[214514]: Closed D-Bus User Message Bus Socket.
Nov 29 01:49:01 np0005539504 systemd[214514]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:49:01 np0005539504 systemd[214514]: Removed slice User Application Slice.
Nov 29 01:49:01 np0005539504 systemd[214514]: Reached target Shutdown.
Nov 29 01:49:01 np0005539504 systemd[214514]: Finished Exit the Session.
Nov 29 01:49:01 np0005539504 systemd[214514]: Reached target Exit the Session.
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.887 187156 DEBUG nova.network.neutron [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Activated binding for port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.888 187156 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.889 187156 DEBUG nova.virt.libvirt.vif [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:48:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1791593514',display_name='tempest-LiveMigrationTest-server-1791593514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1791593514',id=7,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:48:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-4vq3oq0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:48:31Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.889 187156 DEBUG nova.network.os_vif_util [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converting VIF {"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.890 187156 DEBUG nova.network.os_vif_util [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.891 187156 DEBUG os_vif [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.897 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.898 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60d45f94-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.899 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:01 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:49:01 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.901 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.912 187156 INFO os_vif [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad')#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.912 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.913 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.913 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.914 187156 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.915 187156 INFO nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Deleting instance files /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e_del#033[00m
Nov 29 01:49:01 np0005539504 nova_compute[187152]: 2025-11-29 06:49:01.915 187156 INFO nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Deletion of /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e_del complete#033[00m
Nov 29 01:49:01 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:49:01 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:49:01 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:49:01 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:49:01 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:49:02 np0005539504 nova_compute[187152]: 2025-11-29 06:49:02.609 187156 DEBUG nova.network.neutron [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updated VIF entry in instance network info cache for port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:49:02 np0005539504 nova_compute[187152]: 2025-11-29 06:49:02.610 187156 DEBUG nova.network.neutron [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:49:02 np0005539504 nova_compute[187152]: 2025-11-29 06:49:02.641 187156 DEBUG oslo_concurrency.lockutils [req-7172c4d1-fa90-4216-addb-549509173284 req-638bd5d4-a049-4b32-a8f4-5b2df17b7f4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:49:03 np0005539504 nova_compute[187152]: 2025-11-29 06:49:03.620 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.553 187156 DEBUG nova.compute.manager [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.553 187156 DEBUG oslo_concurrency.lockutils [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.553 187156 DEBUG oslo_concurrency.lockutils [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.554 187156 DEBUG oslo_concurrency.lockutils [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.554 187156 DEBUG nova.compute.manager [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.554 187156 DEBUG nova.compute.manager [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.650 187156 DEBUG nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.650 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.650 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.650 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.650 187156 DEBUG nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.650 187156 WARNING nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.651 187156 DEBUG nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.651 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.651 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.651 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.651 187156 DEBUG nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.651 187156 WARNING nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.651 187156 DEBUG nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.652 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.652 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.652 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.652 187156 DEBUG nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.653 187156 WARNING nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.653 187156 DEBUG nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.653 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.653 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.653 187156 DEBUG oslo_concurrency.lockutils [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.654 187156 DEBUG nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:05 np0005539504 nova_compute[187152]: 2025-11-29 06:49:05.654 187156 WARNING nova.compute.manager [req-96a18d36-b521-4959-bad6-a005e6761c78 req-984201ac-7453-4172-9bef-a13793e5ea6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:49:06 np0005539504 nova_compute[187152]: 2025-11-29 06:49:06.943 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539504 nova_compute[187152]: 2025-11-29 06:49:08.622 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539504 podman[214701]: 2025-11-29 06:49:08.722185949 +0000 UTC m=+0.060342853 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.201 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.201 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.201 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.225 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.226 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.226 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.227 187156 DEBUG nova.compute.resource_tracker [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.315 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.401 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.402 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.458 187156 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.631 187156 WARNING nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.633 187156 DEBUG nova.compute.resource_tracker [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5622MB free_disk=73.31514358520508GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.633 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.634 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.699 187156 DEBUG nova.compute.resource_tracker [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Migration for instance af865d23-0f24-47aa-aeab-1c12d04b5a1e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.735 187156 DEBUG nova.compute.resource_tracker [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.784 187156 DEBUG nova.compute.resource_tracker [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Instance 4a10d3d0-27cb-4116-9924-cf8baaec591d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.785 187156 DEBUG nova.compute.resource_tracker [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Migration e8267ed6-ce75-49c9-85a6-d08b827f6aea is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.785 187156 DEBUG nova.compute.resource_tracker [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.785 187156 DEBUG nova.compute.resource_tracker [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.860 187156 DEBUG nova.compute.provider_tree [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.885 187156 DEBUG nova.scheduler.client.report [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.938 187156 DEBUG nova.compute.resource_tracker [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.938 187156 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:10 np0005539504 nova_compute[187152]: 2025-11-29 06:49:10.953 187156 INFO nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migrating instance to compute-2.ctlplane.example.com finished successfully.#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.093 187156 INFO nova.scheduler.client.report [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Deleted allocation for migration e8267ed6-ce75-49c9-85a6-d08b827f6aea#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.093 187156 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.626 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764398936.1543343, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.627 187156 INFO nova.compute.manager [-] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.652 187156 DEBUG nova.compute.manager [None req-0cabc0e0-9e3a-4805-a967-fb9ed0a187cd - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.662 187156 DEBUG oslo_concurrency.lockutils [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "4a10d3d0-27cb-4116-9924-cf8baaec591d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.663 187156 DEBUG oslo_concurrency.lockutils [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.663 187156 DEBUG oslo_concurrency.lockutils [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.664 187156 DEBUG oslo_concurrency.lockutils [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.664 187156 DEBUG oslo_concurrency.lockutils [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.682 187156 INFO nova.compute.manager [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Terminating instance#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.694 187156 DEBUG nova.compute.manager [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:49:11 np0005539504 kernel: tap52732bd8-18 (unregistering): left promiscuous mode
Nov 29 01:49:11 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:11Z|00042|binding|INFO|Releasing lport 52732bd8-180c-4935-84b3-9f7f3e46c276 from this chassis (sb_readonly=0)
Nov 29 01:49:11 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:11Z|00043|binding|INFO|Setting lport 52732bd8-180c-4935-84b3-9f7f3e46c276 down in Southbound
Nov 29 01:49:11 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:11Z|00044|binding|INFO|Removing iface tap52732bd8-18 ovn-installed in OVS
Nov 29 01:49:11 np0005539504 NetworkManager[55210]: <info>  [1764398951.7305] device (tap52732bd8-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.728 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.745 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:11 np0005539504 podman[214727]: 2025-11-29 06:49:11.749276678 +0000 UTC m=+0.079446660 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:49:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:11.764 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:e1:81 10.1.0.7 fdfe:381f:8400::306'], port_security=['fa:16:3e:c2:e1:81 10.1.0.7 fdfe:381f:8400::306'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.7/26 fdfe:381f:8400::306/64', 'neutron:device_id': '4a10d3d0-27cb-4116-9924-cf8baaec591d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-425e933e-ca72-466c-8d2b-499c7ba67318', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5cacaa01-dff2-46af-9e49-4a741508795b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=236265de-856a-468e-8ed3-00d3e824203d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=52732bd8-180c-4935-84b3-9f7f3e46c276) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:49:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:11.765 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 52732bd8-180c-4935-84b3-9f7f3e46c276 in datapath 425e933e-ca72-466c-8d2b-499c7ba67318 unbound from our chassis#033[00m
Nov 29 01:49:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:11.766 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 425e933e-ca72-466c-8d2b-499c7ba67318, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:49:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:11.767 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[26cfb63a-fe9e-423f-95d0-19a664bb377e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:11.768 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 namespace which is not needed anymore#033[00m
Nov 29 01:49:11 np0005539504 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 29 01:49:11 np0005539504 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Consumed 16.224s CPU time.
Nov 29 01:49:11 np0005539504 systemd-machined[153423]: Machine qemu-3-instance-00000005 terminated.
Nov 29 01:49:11 np0005539504 podman[214728]: 2025-11-29 06:49:11.786555506 +0000 UTC m=+0.109481232 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 01:49:11 np0005539504 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214205]: [NOTICE]   (214217) : haproxy version is 2.8.14-c23fe91
Nov 29 01:49:11 np0005539504 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214205]: [NOTICE]   (214217) : path to executable is /usr/sbin/haproxy
Nov 29 01:49:11 np0005539504 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214205]: [WARNING]  (214217) : Exiting Master process...
Nov 29 01:49:11 np0005539504 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214205]: [ALERT]    (214217) : Current worker (214219) exited with code 143 (Terminated)
Nov 29 01:49:11 np0005539504 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214205]: [WARNING]  (214217) : All workers exited. Exiting... (0)
Nov 29 01:49:11 np0005539504 systemd[1]: libpod-0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024.scope: Deactivated successfully.
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.918 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:11 np0005539504 podman[214793]: 2025-11-29 06:49:11.921953859 +0000 UTC m=+0.047327892 container died 0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.924 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.945 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:11 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024-userdata-shm.mount: Deactivated successfully.
Nov 29 01:49:11 np0005539504 systemd[1]: var-lib-containers-storage-overlay-ac2c284a62f6d93bf6fc3f34a1ff10a0a83b78cb77356cd528fd2b36228e8df8-merged.mount: Deactivated successfully.
Nov 29 01:49:11 np0005539504 podman[214793]: 2025-11-29 06:49:11.969282339 +0000 UTC m=+0.094656372 container cleanup 0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.968 187156 INFO nova.virt.libvirt.driver [-] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Instance destroyed successfully.#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.968 187156 DEBUG nova.objects.instance [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'resources' on Instance uuid 4a10d3d0-27cb-4116-9924-cf8baaec591d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:11 np0005539504 systemd[1]: libpod-conmon-0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024.scope: Deactivated successfully.
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.996 187156 DEBUG nova.virt.libvirt.vif [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-526752650-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-526752650-2',id=5,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-11-29T06:48:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d2e7db012114f9eb8e8e1b0123c9974',ramdisk_id='',reservation_id='r-o9m0h1rn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-224859463',owner_user_name='tempest-AutoAllocateNetworkTest-224859463-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:48:27Z,user_data=None,user_id='7a31c969c2f744a9810fc9890dd7acb2',uuid=4a10d3d0-27cb-4116-9924-cf8baaec591d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.997 187156 DEBUG nova.network.os_vif_util [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converting VIF {"id": "52732bd8-180c-4935-84b3-9f7f3e46c276", "address": "fa:16:3e:c2:e1:81", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::306", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap52732bd8-18", "ovs_interfaceid": "52732bd8-180c-4935-84b3-9f7f3e46c276", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.998 187156 DEBUG nova.network.os_vif_util [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:e1:81,bridge_name='br-int',has_traffic_filtering=True,id=52732bd8-180c-4935-84b3-9f7f3e46c276,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52732bd8-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:49:11 np0005539504 nova_compute[187152]: 2025-11-29 06:49:11.998 187156 DEBUG os_vif [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:e1:81,bridge_name='br-int',has_traffic_filtering=True,id=52732bd8-180c-4935-84b3-9f7f3e46c276,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52732bd8-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.000 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.000 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52732bd8-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.002 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.004 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.007 187156 INFO os_vif [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:e1:81,bridge_name='br-int',has_traffic_filtering=True,id=52732bd8-180c-4935-84b3-9f7f3e46c276,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap52732bd8-18')#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.008 187156 INFO nova.virt.libvirt.driver [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Deleting instance files /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d_del#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.009 187156 INFO nova.virt.libvirt.driver [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Deletion of /var/lib/nova/instances/4a10d3d0-27cb-4116-9924-cf8baaec591d_del complete#033[00m
Nov 29 01:49:12 np0005539504 podman[214838]: 2025-11-29 06:49:12.038483581 +0000 UTC m=+0.048604256 container remove 0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 01:49:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:12.044 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9c9eeb82-022a-47c9-aeae-75ff21a678fc]: (4, ('Sat Nov 29 06:49:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 (0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024)\n0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024\nSat Nov 29 06:49:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 (0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024)\n0306059555f750b8883971cd0b9d633e028a634a7afafca18069a7a3bfa48024\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:12.046 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[595dbb54-9413-4707-b110-babdcade7c1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:12.047 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap425e933e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.048 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:12 np0005539504 kernel: tap425e933e-c0: left promiscuous mode
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.051 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:12.055 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e0590a-7274-4e04-8a71-7961907b09c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.063 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:12.073 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e52a0139-e393-48c1-b91e-5e91d4df53d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:12.075 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f5be2b9e-57cd-4268-b8f0-addf0ceb1209]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:12.095 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[97383996-1824-49d7-912a-fd40e7a55111]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436418, 'reachable_time': 39321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214851, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:12.100 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:49:12 np0005539504 systemd[1]: run-netns-ovnmeta\x2d425e933e\x2dca72\x2d466c\x2d8d2b\x2d499c7ba67318.mount: Deactivated successfully.
Nov 29 01:49:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:12.100 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[3123a36d-be4f-40c5-aa14-949a3d3aa1f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.183 187156 INFO nova.compute.manager [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Took 0.49 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.184 187156 DEBUG oslo.service.loopingcall [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.184 187156 DEBUG nova.compute.manager [-] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.184 187156 DEBUG nova.network.neutron [-] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.406 187156 DEBUG nova.compute.manager [req-03072fe0-5693-40fa-bd99-67d03532d607 req-e8bd0141-4180-47cf-ab92-ada2ee783d14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Received event network-vif-unplugged-52732bd8-180c-4935-84b3-9f7f3e46c276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.407 187156 DEBUG oslo_concurrency.lockutils [req-03072fe0-5693-40fa-bd99-67d03532d607 req-e8bd0141-4180-47cf-ab92-ada2ee783d14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.407 187156 DEBUG oslo_concurrency.lockutils [req-03072fe0-5693-40fa-bd99-67d03532d607 req-e8bd0141-4180-47cf-ab92-ada2ee783d14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.408 187156 DEBUG oslo_concurrency.lockutils [req-03072fe0-5693-40fa-bd99-67d03532d607 req-e8bd0141-4180-47cf-ab92-ada2ee783d14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.408 187156 DEBUG nova.compute.manager [req-03072fe0-5693-40fa-bd99-67d03532d607 req-e8bd0141-4180-47cf-ab92-ada2ee783d14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] No waiting events found dispatching network-vif-unplugged-52732bd8-180c-4935-84b3-9f7f3e46c276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:12 np0005539504 nova_compute[187152]: 2025-11-29 06:49:12.408 187156 DEBUG nova.compute.manager [req-03072fe0-5693-40fa-bd99-67d03532d607 req-e8bd0141-4180-47cf-ab92-ada2ee783d14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Received event network-vif-unplugged-52732bd8-180c-4935-84b3-9f7f3e46c276 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:49:13 np0005539504 nova_compute[187152]: 2025-11-29 06:49:13.080 187156 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Creating tmpfile /var/lib/nova/instances/tmpi17kuexu to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 29 01:49:13 np0005539504 nova_compute[187152]: 2025-11-29 06:49:13.478 187156 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi17kuexu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 29 01:49:13 np0005539504 nova_compute[187152]: 2025-11-29 06:49:13.661 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.065 187156 DEBUG nova.network.neutron [-] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.109 187156 INFO nova.compute.manager [-] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Took 1.92 seconds to deallocate network for instance.#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.233 187156 DEBUG oslo_concurrency.lockutils [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.234 187156 DEBUG oslo_concurrency.lockutils [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.324 187156 DEBUG nova.compute.provider_tree [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.341 187156 DEBUG nova.scheduler.client.report [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.368 187156 DEBUG oslo_concurrency.lockutils [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.406 187156 INFO nova.scheduler.client.report [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Deleted allocations for instance 4a10d3d0-27cb-4116-9924-cf8baaec591d#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.496 187156 DEBUG oslo_concurrency.lockutils [None req-1e381c37-1744-4095-98ef-e3091cd1ddf0 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.635 187156 DEBUG nova.compute.manager [req-79291258-d146-4c25-92f0-8426b5887603 req-45b12b47-a08c-4bff-b8e2-637ef63b3d16 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Received event network-vif-plugged-52732bd8-180c-4935-84b3-9f7f3e46c276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.635 187156 DEBUG oslo_concurrency.lockutils [req-79291258-d146-4c25-92f0-8426b5887603 req-45b12b47-a08c-4bff-b8e2-637ef63b3d16 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.636 187156 DEBUG oslo_concurrency.lockutils [req-79291258-d146-4c25-92f0-8426b5887603 req-45b12b47-a08c-4bff-b8e2-637ef63b3d16 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.636 187156 DEBUG oslo_concurrency.lockutils [req-79291258-d146-4c25-92f0-8426b5887603 req-45b12b47-a08c-4bff-b8e2-637ef63b3d16 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a10d3d0-27cb-4116-9924-cf8baaec591d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.636 187156 DEBUG nova.compute.manager [req-79291258-d146-4c25-92f0-8426b5887603 req-45b12b47-a08c-4bff-b8e2-637ef63b3d16 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] No waiting events found dispatching network-vif-plugged-52732bd8-180c-4935-84b3-9f7f3e46c276 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.636 187156 WARNING nova.compute.manager [req-79291258-d146-4c25-92f0-8426b5887603 req-45b12b47-a08c-4bff-b8e2-637ef63b3d16 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Received unexpected event network-vif-plugged-52732bd8-180c-4935-84b3-9f7f3e46c276 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:49:14 np0005539504 nova_compute[187152]: 2025-11-29 06:49:14.637 187156 DEBUG nova.compute.manager [req-79291258-d146-4c25-92f0-8426b5887603 req-45b12b47-a08c-4bff-b8e2-637ef63b3d16 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Received event network-vif-deleted-52732bd8-180c-4935-84b3-9f7f3e46c276 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:15 np0005539504 nova_compute[187152]: 2025-11-29 06:49:15.301 187156 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi17kuexu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 29 01:49:15 np0005539504 nova_compute[187152]: 2025-11-29 06:49:15.331 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:49:15 np0005539504 nova_compute[187152]: 2025-11-29 06:49:15.332 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquired lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:49:15 np0005539504 nova_compute[187152]: 2025-11-29 06:49:15.332 187156 DEBUG nova.network.neutron [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.005 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.689 187156 DEBUG nova.network.neutron [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.708 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Releasing lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.729 187156 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi17kuexu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.730 187156 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Creating instance directory: /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.730 187156 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Creating disk.info with the contents: {'/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk': 'qcow2', '/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.730 187156 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.731 187156 DEBUG nova.objects.instance [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lazy-loading 'trusted_certs' on Instance uuid af865d23-0f24-47aa-aeab-1c12d04b5a1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.765 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.836 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.838 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.839 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.861 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.918 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:17 np0005539504 nova_compute[187152]: 2025-11-29 06:49:17.919 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.010 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk 1073741824" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.011 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.012 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.065 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.067 187156 DEBUG nova.virt.disk.api [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Checking if we can resize image /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.068 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.123 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.124 187156 DEBUG nova.virt.disk.api [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Cannot resize image /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.124 187156 DEBUG nova.objects.instance [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lazy-loading 'migration_context' on Instance uuid af865d23-0f24-47aa-aeab-1c12d04b5a1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.143 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.179 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config 485376" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.181 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config to /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.182 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.637 187156 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.640 187156 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.642 187156 DEBUG nova.virt.libvirt.vif [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:48:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1791593514',display_name='tempest-LiveMigrationTest-server-1791593514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1791593514',id=7,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:48:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-4vq3oq0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:49:08Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.642 187156 DEBUG nova.network.os_vif_util [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converting VIF {"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.643 187156 DEBUG nova.network.os_vif_util [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.644 187156 DEBUG os_vif [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.645 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.646 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.646 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.650 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.650 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60d45f94-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.651 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60d45f94-ad, col_values=(('external_ids', {'iface-id': '60d45f94-ad4f-48ba-a0a9-6b5406aa616c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:09:58', 'vm-uuid': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.653 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:18 np0005539504 NetworkManager[55210]: <info>  [1764398958.6564] manager: (tap60d45f94-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.657 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.661 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.663 187156 INFO os_vif [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad')#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.664 187156 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 29 01:49:18 np0005539504 nova_compute[187152]: 2025-11-29 06:49:18.664 187156 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi17kuexu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 29 01:49:20 np0005539504 nova_compute[187152]: 2025-11-29 06:49:20.475 187156 DEBUG nova.network.neutron [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 29 01:49:20 np0005539504 nova_compute[187152]: 2025-11-29 06:49:20.486 187156 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi17kuexu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 29 01:49:20 np0005539504 systemd[1]: Starting libvirt proxy daemon...
Nov 29 01:49:20 np0005539504 systemd[1]: Started libvirt proxy daemon.
Nov 29 01:49:20 np0005539504 kernel: tap60d45f94-ad: entered promiscuous mode
Nov 29 01:49:20 np0005539504 NetworkManager[55210]: <info>  [1764398960.7827] manager: (tap60d45f94-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Nov 29 01:49:20 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:20Z|00045|binding|INFO|Claiming lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c for this additional chassis.
Nov 29 01:49:20 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:20Z|00046|binding|INFO|60d45f94-ad4f-48ba-a0a9-6b5406aa616c: Claiming fa:16:3e:86:09:58 10.100.0.4
Nov 29 01:49:20 np0005539504 nova_compute[187152]: 2025-11-29 06:49:20.784 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:20 np0005539504 nova_compute[187152]: 2025-11-29 06:49:20.787 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:20 np0005539504 systemd-machined[153423]: New machine qemu-5-instance-00000007.
Nov 29 01:49:20 np0005539504 systemd-udevd[214909]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:49:20 np0005539504 nova_compute[187152]: 2025-11-29 06:49:20.839 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:20 np0005539504 NetworkManager[55210]: <info>  [1764398960.8442] device (tap60d45f94-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:49:20 np0005539504 nova_compute[187152]: 2025-11-29 06:49:20.844 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:20 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:20Z|00047|binding|INFO|Setting lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c ovn-installed in OVS
Nov 29 01:49:20 np0005539504 nova_compute[187152]: 2025-11-29 06:49:20.845 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:20 np0005539504 NetworkManager[55210]: <info>  [1764398960.8460] device (tap60d45f94-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:49:20 np0005539504 systemd[1]: Started Virtual Machine qemu-5-instance-00000007.
Nov 29 01:49:21 np0005539504 podman[214924]: 2025-11-29 06:49:21.783357276 +0000 UTC m=+0.098133435 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:49:21 np0005539504 podman[214925]: 2025-11-29 06:49:21.799227975 +0000 UTC m=+0.120671385 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:49:22 np0005539504 nova_compute[187152]: 2025-11-29 06:49:22.077 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398962.0764577, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:22 np0005539504 nova_compute[187152]: 2025-11-29 06:49:22.077 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Started (Lifecycle Event)#033[00m
Nov 29 01:49:22 np0005539504 nova_compute[187152]: 2025-11-29 06:49:22.112 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:22.903 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:22.904 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:22.904 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:23 np0005539504 nova_compute[187152]: 2025-11-29 06:49:23.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:23 np0005539504 nova_compute[187152]: 2025-11-29 06:49:23.663 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:24 np0005539504 nova_compute[187152]: 2025-11-29 06:49:24.427 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "8eb4e0e8-1aad-4877-9acf-c9090d9f94ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:24 np0005539504 nova_compute[187152]: 2025-11-29 06:49:24.428 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "8eb4e0e8-1aad-4877-9acf-c9090d9f94ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:24 np0005539504 nova_compute[187152]: 2025-11-29 06:49:24.457 187156 DEBUG nova.compute.manager [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:49:24 np0005539504 nova_compute[187152]: 2025-11-29 06:49:24.793 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:24 np0005539504 nova_compute[187152]: 2025-11-29 06:49:24.794 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:24 np0005539504 nova_compute[187152]: 2025-11-29 06:49:24.800 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:49:24 np0005539504 nova_compute[187152]: 2025-11-29 06:49:24.801 187156 INFO nova.compute.claims [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:49:24 np0005539504 nova_compute[187152]: 2025-11-29 06:49:24.944 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398964.9440722, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:24 np0005539504 nova_compute[187152]: 2025-11-29 06:49:24.945 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.084 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.092 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.120 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.190 187156 DEBUG nova.compute.provider_tree [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.213 187156 DEBUG nova.scheduler.client.report [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.248 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.249 187156 DEBUG nova.compute.manager [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.326 187156 DEBUG nova.compute.manager [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.351 187156 INFO nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.385 187156 DEBUG nova.compute.manager [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.524 187156 DEBUG nova.compute.manager [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.525 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.526 187156 INFO nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Creating image(s)#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.526 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.527 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.527 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.541 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.597 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.599 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.600 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.622 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.699 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.701 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:25 np0005539504 podman[214990]: 2025-11-29 06:49:25.73307786 +0000 UTC m=+0.068454983 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.744 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.745 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.745 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.813 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.814 187156 DEBUG nova.virt.disk.api [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Checking if we can resize image /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.814 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.836 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.881 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.881 187156 DEBUG nova.virt.disk.api [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Cannot resize image /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:49:25 np0005539504 nova_compute[187152]: 2025-11-29 06:49:25.882 187156 DEBUG nova.objects.instance [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'migration_context' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.162 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.162 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Ensure instance console log exists: /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.162 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.163 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.163 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.164 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.169 187156 WARNING nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.174 187156 DEBUG nova.virt.libvirt.host [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.175 187156 DEBUG nova.virt.libvirt.host [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.179 187156 DEBUG nova.virt.libvirt.host [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.180 187156 DEBUG nova.virt.libvirt.host [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.182 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.183 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.183 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.184 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.184 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.184 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.184 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.184 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.185 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.185 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.185 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.185 187156 DEBUG nova.virt.hardware [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.191 187156 DEBUG nova.objects.instance [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'pci_devices' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.271 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <uuid>8eb4e0e8-1aad-4877-9acf-c9090d9f94ed</uuid>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <name>instance-00000008</name>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersAdmin275Test-server-1186663249</nova:name>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:49:26</nova:creationTime>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:        <nova:user uuid="f5df51cabb4c4ab89ff71ecf62ae26a7">tempest-ServersAdmin275Test-1058580054-project-member</nova:user>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:        <nova:project uuid="b6074844dea7416191e6d3555118d49e">tempest-ServersAdmin275Test-1058580054</nova:project>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <entry name="serial">8eb4e0e8-1aad-4877-9acf-c9090d9f94ed</entry>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <entry name="uuid">8eb4e0e8-1aad-4877-9acf-c9090d9f94ed</entry>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/console.log" append="off"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:49:26 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:49:26 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:49:26 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:49:26 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.566 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.567 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.567 187156 INFO nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Using config drive#033[00m
Nov 29 01:49:26 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:26Z|00048|binding|INFO|Claiming lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c for this chassis.
Nov 29 01:49:26 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:26Z|00049|binding|INFO|60d45f94-ad4f-48ba-a0a9-6b5406aa616c: Claiming fa:16:3e:86:09:58 10.100.0.4
Nov 29 01:49:26 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:26Z|00050|binding|INFO|Setting lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c up in Southbound
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.695 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:09:58 10.100.0.4'], port_security=['fa:16:3e:86:09:58 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '21', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=60d45f94-ad4f-48ba-a0a9-6b5406aa616c) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.698 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 bound to our chassis#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.700 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.717 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fd633dc4-7a86-473e-ba71-18d928f0344d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.719 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24ee44f0-21 in ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.722 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24ee44f0-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.722 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bf879d6f-9af4-4b0f-b770-2be78874ddca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.724 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fe98cdfc-8146-448c-92c2-ea907ecbae99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.742 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8ed865-c6ae-4b11-be1c-c33055df35c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.758 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[665117e1-6554-4469-acc8-34c0ba1a859a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.806 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1d0f6d96-c0a6-44f7-ab9f-30a4941d0ae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 NetworkManager[55210]: <info>  [1764398966.8163] manager: (tap24ee44f0-20): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.815 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f186e887-82d1-4fe8-9c07-7cc20b4d6d6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 systemd-udevd[215031]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.858 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[29f68702-5534-4448-9388-0ceae8f80d45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.862 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c7ae01-39b1-48e5-8ab0-c5715ea8f9e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 NetworkManager[55210]: <info>  [1764398966.8900] device (tap24ee44f0-20): carrier: link connected
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.892 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[430ffdd8-e669-4466-a06c-62dac2a7c53e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.910 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[79ac486f-8150-41ac-9d72-4464b8928072]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442976, 'reachable_time': 31955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215050, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.927 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[92c6ff39-433b-4c9a-abcb-0e1c6fa3217c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:940c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442976, 'tstamp': 442976}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215051, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.929 187156 INFO nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Creating config drive at /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.935 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsmw8yp4f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.947 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5113477e-dd96-4e22-897f-5709bc8c07ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442976, 'reachable_time': 31955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215052, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.967 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764398951.9659333, 4a10d3d0-27cb-4116-9924-cf8baaec591d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:26 np0005539504 nova_compute[187152]: 2025-11-29 06:49:26.967 187156 INFO nova.compute.manager [-] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:49:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:26.991 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cb7e809b-85ef-455d-9d87-fff86a51a9f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.028 187156 DEBUG nova.compute.manager [None req-db6e0810-5f28-4eda-81fe-5fb08ad40f3b - - - - - -] [instance: 4a10d3d0-27cb-4116-9924-cf8baaec591d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.033 187156 INFO nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Post operation of migration started#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.061 187156 DEBUG oslo_concurrency.processutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsmw8yp4f" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:27.068 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a39ed45a-a6cc-4096-9913-11a60ae2d250]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:27.070 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:27.070 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:27.071 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24ee44f0-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:27 np0005539504 kernel: tap24ee44f0-20: entered promiscuous mode
Nov 29 01:49:27 np0005539504 NetworkManager[55210]: <info>  [1764398967.0738] manager: (tap24ee44f0-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.072 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:27.076 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24ee44f0-20, col_values=(('external_ids', {'iface-id': 'ffbd3b8f-7e45-45d4-84ce-cd74c712f992'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.078 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:27 np0005539504 ovn_controller[95182]: 2025-11-29T06:49:27Z|00051|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.079 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:27.080 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:27.081 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4066be57-77d4-4ae0-9130-8e531bc049a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:27.082 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:49:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:27.083 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'env', 'PROCESS_TAG=haproxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.090 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:27 np0005539504 systemd-machined[153423]: New machine qemu-6-instance-00000008.
Nov 29 01:49:27 np0005539504 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Nov 29 01:49:27 np0005539504 podman[215104]: 2025-11-29 06:49:27.515775199 +0000 UTC m=+0.064961917 container create be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:49:27 np0005539504 systemd[1]: Started libpod-conmon-be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0.scope.
Nov 29 01:49:27 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:49:27 np0005539504 podman[215104]: 2025-11-29 06:49:27.483443785 +0000 UTC m=+0.032630523 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:49:27 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ecb692f02ec5421fcdcc761d3854b10d5edb1926ee855ab58072de9f0ac2f16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:49:27 np0005539504 podman[215104]: 2025-11-29 06:49:27.597687085 +0000 UTC m=+0.146873823 container init be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.601 187156 DEBUG nova.compute.manager [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.603 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.603 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398967.6026077, 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.604 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:49:27 np0005539504 podman[215104]: 2025-11-29 06:49:27.607308665 +0000 UTC m=+0.156495383 container start be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.609 187156 INFO nova.virt.libvirt.driver [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance spawned successfully.#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.610 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.627 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.630 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.631 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.631 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.631 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:27 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215124]: [NOTICE]   (215129) : New worker (215131) forked
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.632 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:27 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215124]: [NOTICE]   (215129) : Loading success.
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.632 187156 DEBUG nova.virt.libvirt.driver [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.637 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.654 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.655 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquired lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.655 187156 DEBUG nova.network.neutron [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.676 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.677 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398967.6027002, 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.677 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] VM Started (Lifecycle Event)#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.721 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.724 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.754 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.755 187156 INFO nova.compute.manager [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Took 2.23 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.756 187156 DEBUG nova.compute.manager [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.868 187156 INFO nova.compute.manager [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Took 3.13 seconds to build instance.#033[00m
Nov 29 01:49:27 np0005539504 nova_compute[187152]: 2025-11-29 06:49:27.902 187156 DEBUG oslo_concurrency.lockutils [None req-49082d23-5604-4e2e-982a-175382886cdb f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "8eb4e0e8-1aad-4877-9acf-c9090d9f94ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:28 np0005539504 nova_compute[187152]: 2025-11-29 06:49:28.658 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:28 np0005539504 nova_compute[187152]: 2025-11-29 06:49:28.665 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:29 np0005539504 nova_compute[187152]: 2025-11-29 06:49:29.659 187156 INFO nova.compute.manager [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Rebuilding instance#033[00m
Nov 29 01:49:30 np0005539504 nova_compute[187152]: 2025-11-29 06:49:30.076 187156 DEBUG nova.compute.manager [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:30 np0005539504 nova_compute[187152]: 2025-11-29 06:49:30.200 187156 DEBUG nova.objects.instance [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'pci_requests' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:30 np0005539504 nova_compute[187152]: 2025-11-29 06:49:30.225 187156 DEBUG nova.objects.instance [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'pci_devices' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:30 np0005539504 nova_compute[187152]: 2025-11-29 06:49:30.257 187156 DEBUG nova.objects.instance [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'resources' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:30 np0005539504 nova_compute[187152]: 2025-11-29 06:49:30.281 187156 DEBUG nova.objects.instance [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'migration_context' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:30 np0005539504 nova_compute[187152]: 2025-11-29 06:49:30.312 187156 DEBUG nova.objects.instance [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:49:30 np0005539504 nova_compute[187152]: 2025-11-29 06:49:30.315 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 01:49:31 np0005539504 nova_compute[187152]: 2025-11-29 06:49:31.080 187156 DEBUG nova.network.neutron [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:49:31 np0005539504 nova_compute[187152]: 2025-11-29 06:49:31.200 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Releasing lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:49:31 np0005539504 nova_compute[187152]: 2025-11-29 06:49:31.326 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:31 np0005539504 nova_compute[187152]: 2025-11-29 06:49:31.326 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:31 np0005539504 nova_compute[187152]: 2025-11-29 06:49:31.327 187156 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:31 np0005539504 nova_compute[187152]: 2025-11-29 06:49:31.330 187156 INFO nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 29 01:49:31 np0005539504 virtqemud[186569]: Domain id=5 name='instance-00000007' uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e is tainted: custom-monitor
Nov 29 01:49:31 np0005539504 podman[215140]: 2025-11-29 06:49:31.732707912 +0000 UTC m=+0.070984011 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 01:49:32 np0005539504 nova_compute[187152]: 2025-11-29 06:49:32.338 187156 INFO nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.347 187156 INFO nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.351 187156 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.371 187156 DEBUG nova.objects.instance [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.666 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.668 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.668 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.668 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.704 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.705 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 01:49:33 np0005539504 nova_compute[187152]: 2025-11-29 06:49:33.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539504 nova_compute[187152]: 2025-11-29 06:49:34.246 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:34.247 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:49:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:34.248 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:49:35 np0005539504 nova_compute[187152]: 2025-11-29 06:49:35.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:35 np0005539504 nova_compute[187152]: 2025-11-29 06:49:35.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:35 np0005539504 nova_compute[187152]: 2025-11-29 06:49:35.958 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:36 np0005539504 nova_compute[187152]: 2025-11-29 06:49:36.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:38 np0005539504 nova_compute[187152]: 2025-11-29 06:49:38.706 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:38 np0005539504 nova_compute[187152]: 2025-11-29 06:49:38.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:38 np0005539504 nova_compute[187152]: 2025-11-29 06:49:38.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:49:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:49:39.251 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:39 np0005539504 podman[215172]: 2025-11-29 06:49:39.723779978 +0000 UTC m=+0.065848642 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 01:49:40 np0005539504 nova_compute[187152]: 2025-11-29 06:49:40.056 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:49:40 np0005539504 nova_compute[187152]: 2025-11-29 06:49:40.057 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:49:40 np0005539504 nova_compute[187152]: 2025-11-29 06:49:40.057 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:49:40 np0005539504 nova_compute[187152]: 2025-11-29 06:49:40.360 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 01:49:42 np0005539504 podman[215200]: 2025-11-29 06:49:42.71072086 +0000 UTC m=+0.050081285 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:49:42 np0005539504 podman[215201]: 2025-11-29 06:49:42.718214494 +0000 UTC m=+0.054497855 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git)
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.731 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.764 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.765 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.765 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.765 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.766 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.766 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.766 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.793 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.793 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.794 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.794 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.886 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.952 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:42 np0005539504 nova_compute[187152]: 2025-11-29 06:49:42.954 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.019 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.027 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.091 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.092 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.159 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.355 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.357 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5434MB free_disk=73.2839241027832GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.357 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.357 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.443 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance af865d23-0f24-47aa-aeab-1c12d04b5a1e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.444 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.444 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.444 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.494 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.508 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.531 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.532 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:43 np0005539504 nova_compute[187152]: 2025-11-29 06:49:43.708 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:43 np0005539504 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 29 01:49:43 np0005539504 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 13.820s CPU time.
Nov 29 01:49:43 np0005539504 systemd-machined[153423]: Machine qemu-6-instance-00000008 terminated.
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.377 187156 INFO nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance shutdown successfully after 14 seconds.#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.383 187156 INFO nova.virt.libvirt.driver [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance destroyed successfully.#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.387 187156 INFO nova.virt.libvirt.driver [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance destroyed successfully.#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.388 187156 INFO nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Deleting instance files /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed_del#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.388 187156 INFO nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Deletion of /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed_del complete#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.790 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.791 187156 INFO nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Creating image(s)#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.792 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.792 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.793 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.793 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:44 np0005539504 nova_compute[187152]: 2025-11-29 06:49:44.793 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.172 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.237 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.238 187156 DEBUG nova.virt.images [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] 3372b7b2-657b-4c4d-9d9d-7c5b771a630a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.239 187156 DEBUG nova.privsep.utils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.239 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.487 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted" returned: 0 in 0.248s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.492 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.569 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.570 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.587 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.657 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.658 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.659 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.672 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.730 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.731 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.767 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.768 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.769 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.831 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.832 187156 DEBUG nova.virt.disk.api [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Checking if we can resize image /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.833 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.899 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.900 187156 DEBUG nova.virt.disk.api [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Cannot resize image /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.901 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.901 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Ensure instance console log exists: /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.901 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.901 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.902 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.903 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.907 187156 WARNING nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.921 187156 DEBUG nova.virt.libvirt.host [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.922 187156 DEBUG nova.virt.libvirt.host [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.929 187156 DEBUG nova.virt.libvirt.host [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.930 187156 DEBUG nova.virt.libvirt.host [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.931 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.931 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.932 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.932 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.932 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.932 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.932 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.933 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.933 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.933 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.933 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.933 187156 DEBUG nova.virt.hardware [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.934 187156 DEBUG nova.objects.instance [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:46 np0005539504 nova_compute[187152]: 2025-11-29 06:49:46.957 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <uuid>8eb4e0e8-1aad-4877-9acf-c9090d9f94ed</uuid>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <name>instance-00000008</name>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersAdmin275Test-server-1186663249</nova:name>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:49:46</nova:creationTime>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:        <nova:user uuid="f5df51cabb4c4ab89ff71ecf62ae26a7">tempest-ServersAdmin275Test-1058580054-project-member</nova:user>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:        <nova:project uuid="b6074844dea7416191e6d3555118d49e">tempest-ServersAdmin275Test-1058580054</nova:project>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <entry name="serial">8eb4e0e8-1aad-4877-9acf-c9090d9f94ed</entry>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <entry name="uuid">8eb4e0e8-1aad-4877-9acf-c9090d9f94ed</entry>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/console.log" append="off"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:49:46 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:49:46 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:49:46 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:49:46 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.036 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.036 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.037 187156 INFO nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Using config drive#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.053 187156 DEBUG nova.objects.instance [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.127 187156 DEBUG nova.objects.instance [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'keypairs' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.347 187156 INFO nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Creating config drive at /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.352 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkv9qwbqo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.479 187156 DEBUG oslo_concurrency.processutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkv9qwbqo" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:47 np0005539504 systemd-machined[153423]: New machine qemu-7-instance-00000008.
Nov 29 01:49:47 np0005539504 systemd[1]: Started Virtual Machine qemu-7-instance-00000008.
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.988 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.990 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398987.9872382, 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.990 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.993 187156 DEBUG nova.compute.manager [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.994 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.998 187156 INFO nova.virt.libvirt.driver [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance spawned successfully.#033[00m
Nov 29 01:49:47 np0005539504 nova_compute[187152]: 2025-11-29 06:49:47.998 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.015 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.022 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.026 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.026 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.027 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.027 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.027 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.028 187156 DEBUG nova.virt.libvirt.driver [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.051 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.052 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764398987.9895284, 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.052 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] VM Started (Lifecycle Event)#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.082 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.085 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.117 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.134 187156 DEBUG nova.compute.manager [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.246 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.247 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.247 187156 DEBUG nova.objects.instance [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.346 187156 DEBUG oslo_concurrency.lockutils [None req-46980bc7-b32e-4e89-97c6-c0b19786208a f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:48 np0005539504 nova_compute[187152]: 2025-11-29 06:49:48.710 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:50 np0005539504 nova_compute[187152]: 2025-11-29 06:49:50.475 187156 INFO nova.compute.manager [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Rebuilding instance#033[00m
Nov 29 01:49:50 np0005539504 nova_compute[187152]: 2025-11-29 06:49:50.822 187156 DEBUG nova.compute.manager [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:50 np0005539504 nova_compute[187152]: 2025-11-29 06:49:50.925 187156 DEBUG nova.objects.instance [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:50 np0005539504 nova_compute[187152]: 2025-11-29 06:49:50.943 187156 DEBUG nova.objects.instance [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:50 np0005539504 nova_compute[187152]: 2025-11-29 06:49:50.966 187156 DEBUG nova.objects.instance [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lazy-loading 'resources' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:51 np0005539504 nova_compute[187152]: 2025-11-29 06:49:51.031 187156 DEBUG nova.objects.instance [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lazy-loading 'migration_context' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:51 np0005539504 nova_compute[187152]: 2025-11-29 06:49:51.048 187156 DEBUG nova.objects.instance [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:49:51 np0005539504 nova_compute[187152]: 2025-11-29 06:49:51.051 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 01:49:52 np0005539504 podman[215326]: 2025-11-29 06:49:52.754233963 +0000 UTC m=+0.066222493 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:49:52 np0005539504 podman[215327]: 2025-11-29 06:49:52.790879154 +0000 UTC m=+0.116534392 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:49:53 np0005539504 nova_compute[187152]: 2025-11-29 06:49:53.712 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:56 np0005539504 podman[215374]: 2025-11-29 06:49:56.77098532 +0000 UTC m=+0.086198612 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 01:49:58 np0005539504 nova_compute[187152]: 2025-11-29 06:49:58.714 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:01 np0005539504 nova_compute[187152]: 2025-11-29 06:50:01.104 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 01:50:02 np0005539504 podman[215406]: 2025-11-29 06:50:02.748325979 +0000 UTC m=+0.071268099 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 01:50:03 np0005539504 nova_compute[187152]: 2025-11-29 06:50:03.715 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:03 np0005539504 nova_compute[187152]: 2025-11-29 06:50:03.717 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:04 np0005539504 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 29 01:50:04 np0005539504 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Consumed 14.541s CPU time.
Nov 29 01:50:04 np0005539504 systemd-machined[153423]: Machine qemu-7-instance-00000008 terminated.
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.126 187156 INFO nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance shutdown successfully after 14 seconds.#033[00m
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.132 187156 INFO nova.virt.libvirt.driver [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance destroyed successfully.#033[00m
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.137 187156 INFO nova.virt.libvirt.driver [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance destroyed successfully.#033[00m
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.138 187156 INFO nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Deleting instance files /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed_del#033[00m
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.138 187156 INFO nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Deletion of /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed_del complete#033[00m
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.394 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.395 187156 INFO nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Creating image(s)#033[00m
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.396 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Acquiring lock "/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.396 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lock "/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.397 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lock "/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.414 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.482 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.484 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.484 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.495 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.558 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.559 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.594 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.595 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.596 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.651 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.652 187156 DEBUG nova.virt.disk.api [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Checking if we can resize image /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.652 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.709 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.710 187156 DEBUG nova.virt.disk.api [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Cannot resize image /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.711 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.711 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Ensure instance console log exists: /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.711 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.712 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.712 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.714 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.718 187156 WARNING nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.725 187156 DEBUG nova.virt.libvirt.host [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.726 187156 DEBUG nova.virt.libvirt.host [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.730 187156 DEBUG nova.virt.libvirt.host [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.730 187156 DEBUG nova.virt.libvirt.host [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.733 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.733 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.733 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.733 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.734 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.734 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.734 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.734 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.734 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.735 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.735 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.735 187156 DEBUG nova.virt.hardware [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.735 187156 DEBUG nova.objects.instance [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.757 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <uuid>8eb4e0e8-1aad-4877-9acf-c9090d9f94ed</uuid>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <name>instance-00000008</name>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersAdmin275Test-server-1186663249</nova:name>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:50:05</nova:creationTime>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:        <nova:user uuid="f5df51cabb4c4ab89ff71ecf62ae26a7">tempest-ServersAdmin275Test-1058580054-project-member</nova:user>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:        <nova:project uuid="b6074844dea7416191e6d3555118d49e">tempest-ServersAdmin275Test-1058580054</nova:project>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <entry name="serial">8eb4e0e8-1aad-4877-9acf-c9090d9f94ed</entry>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <entry name="uuid">8eb4e0e8-1aad-4877-9acf-c9090d9f94ed</entry>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/console.log" append="off"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:50:05 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:50:05 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:50:05 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:50:05 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.810 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.811 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.811 187156 INFO nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Using config drive
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.831 187156 DEBUG nova.objects.instance [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:50:05 np0005539504 nova_compute[187152]: 2025-11-29 06:50:05.867 187156 DEBUG nova.objects.instance [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lazy-loading 'keypairs' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.101 187156 INFO nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Creating config drive at /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.113 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplpea9hq4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.241 187156 DEBUG oslo_concurrency.processutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplpea9hq4" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:50:06 np0005539504 systemd-machined[153423]: New machine qemu-8-instance-00000008.
Nov 29 01:50:06 np0005539504 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.602 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.603 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399006.6011906, 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.603 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.606 187156 DEBUG nova.compute.manager [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.607 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.611 187156 INFO nova.virt.libvirt.driver [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance spawned successfully.#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.611 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.764 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.764 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.765 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.766 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.766 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.767 187156 DEBUG nova.virt.libvirt.driver [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.771 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:06 np0005539504 nova_compute[187152]: 2025-11-29 06:50:06.774 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.059 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.059 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399006.6019187, 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.060 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] VM Started (Lifecycle Event)#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.091 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.097 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.113 187156 DEBUG nova.compute.manager [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.120 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.203 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.204 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.204 187156 DEBUG nova.objects.instance [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.297 187156 DEBUG oslo_concurrency.lockutils [None req-34fc7a73-527f-4d9c-a292-9e1c0dd29f9c f93f83f9bbc64d30884f131134db09a5 6a687d2d47df4581bc776e20e2c35d16 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.873 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "8eb4e0e8-1aad-4877-9acf-c9090d9f94ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.873 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "8eb4e0e8-1aad-4877-9acf-c9090d9f94ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.874 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "8eb4e0e8-1aad-4877-9acf-c9090d9f94ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.874 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "8eb4e0e8-1aad-4877-9acf-c9090d9f94ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.874 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "8eb4e0e8-1aad-4877-9acf-c9090d9f94ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.886 187156 INFO nova.compute.manager [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Terminating instance#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.941 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "refresh_cache-8eb4e0e8-1aad-4877-9acf-c9090d9f94ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.941 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquired lock "refresh_cache-8eb4e0e8-1aad-4877-9acf-c9090d9f94ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:07 np0005539504 nova_compute[187152]: 2025-11-29 06:50:07.941 187156 DEBUG nova.network.neutron [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.152 187156 DEBUG nova.network.neutron [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.478 187156 DEBUG nova.network.neutron [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.504 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Releasing lock "refresh_cache-8eb4e0e8-1aad-4877-9acf-c9090d9f94ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.504 187156 DEBUG nova.compute.manager [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:50:08 np0005539504 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 29 01:50:08 np0005539504 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 2.242s CPU time.
Nov 29 01:50:08 np0005539504 systemd-machined[153423]: Machine qemu-8-instance-00000008 terminated.
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.716 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.754 187156 INFO nova.virt.libvirt.driver [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance destroyed successfully.#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.756 187156 DEBUG nova.objects.instance [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lazy-loading 'resources' on Instance uuid 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.776 187156 INFO nova.virt.libvirt.driver [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Deleting instance files /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed_del#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.777 187156 INFO nova.virt.libvirt.driver [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Deletion of /var/lib/nova/instances/8eb4e0e8-1aad-4877-9acf-c9090d9f94ed_del complete#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.859 187156 INFO nova.compute.manager [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.860 187156 DEBUG oslo.service.loopingcall [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.860 187156 DEBUG nova.compute.manager [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:50:08 np0005539504 nova_compute[187152]: 2025-11-29 06:50:08.860 187156 DEBUG nova.network.neutron [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.339 187156 DEBUG nova.network.neutron [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.358 187156 DEBUG nova.network.neutron [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.379 187156 INFO nova.compute.manager [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Took 0.52 seconds to deallocate network for instance.#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.484 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.485 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.571 187156 DEBUG nova.compute.provider_tree [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.588 187156 DEBUG nova.scheduler.client.report [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.611 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.652 187156 INFO nova.scheduler.client.report [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Deleted allocations for instance 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed#033[00m
Nov 29 01:50:09 np0005539504 nova_compute[187152]: 2025-11-29 06:50:09.772 187156 DEBUG oslo_concurrency.lockutils [None req-b4d296f9-4f53-4d16-aa8b-286ae9045665 f5df51cabb4c4ab89ff71ecf62ae26a7 b6074844dea7416191e6d3555118d49e - - default default] Lock "8eb4e0e8-1aad-4877-9acf-c9090d9f94ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:10 np0005539504 podman[215488]: 2025-11-29 06:50:10.729049535 +0000 UTC m=+0.065179874 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 01:50:13 np0005539504 podman[215507]: 2025-11-29 06:50:13.703103729 +0000 UTC m=+0.047896037 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:50:13 np0005539504 podman[215508]: 2025-11-29 06:50:13.711162956 +0000 UTC m=+0.055602164 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Nov 29 01:50:13 np0005539504 nova_compute[187152]: 2025-11-29 06:50:13.718 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:18 np0005539504 nova_compute[187152]: 2025-11-29 06:50:18.721 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:50:18 np0005539504 nova_compute[187152]: 2025-11-29 06:50:18.765 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:50:18 np0005539504 nova_compute[187152]: 2025-11-29 06:50:18.766 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5046 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 29 01:50:18 np0005539504 nova_compute[187152]: 2025-11-29 06:50:18.766 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 01:50:18 np0005539504 nova_compute[187152]: 2025-11-29 06:50:18.766 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 01:50:18 np0005539504 nova_compute[187152]: 2025-11-29 06:50:18.767 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:22.904 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:22.905 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:22.905 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:23 np0005539504 podman[215554]: 2025-11-29 06:50:23.746896019 +0000 UTC m=+0.080412026 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:50:23 np0005539504 nova_compute[187152]: 2025-11-29 06:50:23.753 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399008.7519064, 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:23 np0005539504 nova_compute[187152]: 2025-11-29 06:50:23.754 187156 INFO nova.compute.manager [-] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:50:23 np0005539504 nova_compute[187152]: 2025-11-29 06:50:23.768 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:23 np0005539504 nova_compute[187152]: 2025-11-29 06:50:23.769 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:23 np0005539504 podman[215555]: 2025-11-29 06:50:23.790746265 +0000 UTC m=+0.126681907 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:50:23 np0005539504 nova_compute[187152]: 2025-11-29 06:50:23.872 187156 DEBUG nova.compute.manager [None req-5bc3578c-8cfe-48f5-a773-ce8a2d3a1965 - - - - - -] [instance: 8eb4e0e8-1aad-4877-9acf-c9090d9f94ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:27 np0005539504 podman[215603]: 2025-11-29 06:50:27.760520802 +0000 UTC m=+0.089711618 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:50:28 np0005539504 nova_compute[187152]: 2025-11-29 06:50:28.770 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:28 np0005539504 nova_compute[187152]: 2025-11-29 06:50:28.771 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:32 np0005539504 nova_compute[187152]: 2025-11-29 06:50:32.866 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "8808e65f-ef4e-4f64-ba70-99b9260f5450" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:32 np0005539504 nova_compute[187152]: 2025-11-29 06:50:32.866 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "8808e65f-ef4e-4f64-ba70-99b9260f5450" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:33 np0005539504 nova_compute[187152]: 2025-11-29 06:50:33.133 187156 DEBUG nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:50:33 np0005539504 nova_compute[187152]: 2025-11-29 06:50:33.247 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:33 np0005539504 nova_compute[187152]: 2025-11-29 06:50:33.248 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:33 np0005539504 nova_compute[187152]: 2025-11-29 06:50:33.253 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:50:33 np0005539504 nova_compute[187152]: 2025-11-29 06:50:33.254 187156 INFO nova.compute.claims [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:50:33 np0005539504 nova_compute[187152]: 2025-11-29 06:50:33.393 187156 DEBUG nova.compute.provider_tree [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:33 np0005539504 podman[215625]: 2025-11-29 06:50:33.755146668 +0000 UTC m=+0.088623197 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:50:33 np0005539504 nova_compute[187152]: 2025-11-29 06:50:33.772 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:33 np0005539504 nova_compute[187152]: 2025-11-29 06:50:33.774 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:33 np0005539504 nova_compute[187152]: 2025-11-29 06:50:33.845 187156 DEBUG nova.scheduler.client.report [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.003 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.004 187156 DEBUG nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.107 187156 DEBUG oslo_concurrency.lockutils [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.108 187156 DEBUG oslo_concurrency.lockutils [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.109 187156 DEBUG oslo_concurrency.lockutils [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.109 187156 DEBUG oslo_concurrency.lockutils [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.109 187156 DEBUG oslo_concurrency.lockutils [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.120 187156 INFO nova.compute.manager [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Terminating instance#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.153 187156 DEBUG nova.compute.manager [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.230 187156 DEBUG nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.231 187156 DEBUG nova.network.neutron [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:50:34 np0005539504 kernel: tap60d45f94-ad (unregistering): left promiscuous mode
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.328 187156 INFO nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:50:34 np0005539504 NetworkManager[55210]: <info>  [1764399034.3363] device (tap60d45f94-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.340 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 ovn_controller[95182]: 2025-11-29T06:50:34Z|00052|binding|INFO|Releasing lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c from this chassis (sb_readonly=0)
Nov 29 01:50:34 np0005539504 ovn_controller[95182]: 2025-11-29T06:50:34Z|00053|binding|INFO|Setting lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c down in Southbound
Nov 29 01:50:34 np0005539504 ovn_controller[95182]: 2025-11-29T06:50:34Z|00054|binding|INFO|Removing iface tap60d45f94-ad ovn-installed in OVS
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.342 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.346 187156 DEBUG nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.358 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:09:58 10.100.0.4'], port_security=['fa:16:3e:86:09:58 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '24', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=60d45f94-ad4f-48ba-a0a9-6b5406aa616c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.360 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 unbound from our chassis#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.362 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.365 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.366 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a85ada-9914-4c6e-90f1-e469305a2ed5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.367 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace which is not needed anymore#033[00m
Nov 29 01:50:34 np0005539504 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 29 01:50:34 np0005539504 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 4.683s CPU time.
Nov 29 01:50:34 np0005539504 systemd-machined[153423]: Machine qemu-5-instance-00000007 terminated.
Nov 29 01:50:34 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215124]: [NOTICE]   (215129) : haproxy version is 2.8.14-c23fe91
Nov 29 01:50:34 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215124]: [NOTICE]   (215129) : path to executable is /usr/sbin/haproxy
Nov 29 01:50:34 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215124]: [WARNING]  (215129) : Exiting Master process...
Nov 29 01:50:34 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215124]: [ALERT]    (215129) : Current worker (215131) exited with code 143 (Terminated)
Nov 29 01:50:34 np0005539504 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215124]: [WARNING]  (215129) : All workers exited. Exiting... (0)
Nov 29 01:50:34 np0005539504 systemd[1]: libpod-be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0.scope: Deactivated successfully.
Nov 29 01:50:34 np0005539504 conmon[215124]: conmon be1d7ce9bf283ac8aba2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0.scope/container/memory.events
Nov 29 01:50:34 np0005539504 podman[215672]: 2025-11-29 06:50:34.540516311 +0000 UTC m=+0.065551924 container died be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.575 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.580 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.589 187156 DEBUG nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.591 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.591 187156 INFO nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Creating image(s)#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.592 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "/var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.592 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "/var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.593 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "/var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:34 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0-userdata-shm.mount: Deactivated successfully.
Nov 29 01:50:34 np0005539504 systemd[1]: var-lib-containers-storage-overlay-9ecb692f02ec5421fcdcc761d3854b10d5edb1926ee855ab58072de9f0ac2f16-merged.mount: Deactivated successfully.
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.609 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:34 np0005539504 podman[215672]: 2025-11-29 06:50:34.613900606 +0000 UTC m=+0.138936229 container cleanup be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:50:34 np0005539504 systemd[1]: libpod-conmon-be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0.scope: Deactivated successfully.
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.633 187156 INFO nova.virt.libvirt.driver [-] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Instance destroyed successfully.#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.634 187156 DEBUG nova.objects.instance [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lazy-loading 'resources' on Instance uuid af865d23-0f24-47aa-aeab-1c12d04b5a1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.648 187156 DEBUG nova.virt.libvirt.vif [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:48:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1791593514',display_name='tempest-LiveMigrationTest-server-1791593514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1791593514',id=7,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:48:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-4vq3oq0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:49:33Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.649 187156 DEBUG nova.network.os_vif_util [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converting VIF {"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.650 187156 DEBUG nova.network.os_vif_util [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.650 187156 DEBUG os_vif [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.654 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60d45f94-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.657 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.662 187156 INFO os_vif [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad')#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.662 187156 INFO nova.virt.libvirt.driver [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Deleting instance files /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e_del#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.662 187156 INFO nova.virt.libvirt.driver [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Deletion of /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e_del complete#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.672 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.673 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.674 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.683 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.703 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.717 187156 DEBUG nova.network.neutron [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.717 187156 DEBUG nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.737 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.738 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.771 187156 DEBUG nova.compute.manager [req-c5c2cd9d-9e68-4120-ae5f-a881674df63c req-1ba6428d-b009-4bb1-84c0-c0f2946aa37f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.772 187156 DEBUG oslo_concurrency.lockutils [req-c5c2cd9d-9e68-4120-ae5f-a881674df63c req-1ba6428d-b009-4bb1-84c0-c0f2946aa37f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.772 187156 DEBUG oslo_concurrency.lockutils [req-c5c2cd9d-9e68-4120-ae5f-a881674df63c req-1ba6428d-b009-4bb1-84c0-c0f2946aa37f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.772 187156 DEBUG oslo_concurrency.lockutils [req-c5c2cd9d-9e68-4120-ae5f-a881674df63c req-1ba6428d-b009-4bb1-84c0-c0f2946aa37f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.772 187156 DEBUG nova.compute.manager [req-c5c2cd9d-9e68-4120-ae5f-a881674df63c req-1ba6428d-b009-4bb1-84c0-c0f2946aa37f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.772 187156 DEBUG nova.compute.manager [req-c5c2cd9d-9e68-4120-ae5f-a881674df63c req-1ba6428d-b009-4bb1-84c0-c0f2946aa37f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.779 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.780 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.780 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:34 np0005539504 podman[215717]: 2025-11-29 06:50:34.808364386 +0000 UTC m=+0.172799225 container remove be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.813 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ea28b564-71c2-4208-b6ee-814c407acbf8]: (4, ('Sat Nov 29 06:50:34 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0)\nbe1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0\nSat Nov 29 06:50:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (be1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0)\nbe1d7ce9bf283ac8aba2382853b3e6b94d479bdab8698ad9e06b7c65b56223e0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.816 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[025714bc-5fa7-43f7-b307-f0c46e034697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.817 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:34 np0005539504 kernel: tap24ee44f0-20: left promiscuous mode
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.820 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.832 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.836 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.837 187156 DEBUG nova.virt.disk.api [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Checking if we can resize image /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.837 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.837 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[02e2bf8d-e5e2-4b1c-ac2e-77e2437c924a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.851 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f5b7c04a-6c7a-48c4-898f-f8be6bd1dffe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.852 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[82adb6ba-8d40-4a38-9980-beda238037cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.864 187156 INFO nova.compute.manager [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Took 0.71 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.865 187156 DEBUG oslo.service.loopingcall [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.865 187156 DEBUG nova.compute.manager [-] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.865 187156 DEBUG nova.network.neutron [-] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.870 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ef9e7849-f92d-4395-b62c-588e8ac9174b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442967, 'reachable_time': 28420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215744, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:34 np0005539504 systemd[1]: run-netns-ovnmeta\x2d24ee44f0\x2d2b10\x2d459c\x2daabf\x2dbf9ef2c8d950.mount: Deactivated successfully.
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.876 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.876 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[43887ae6-43c9-4383-af4e-f0bf67998dd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.898 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.899 187156 DEBUG nova.virt.disk.api [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Cannot resize image /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.899 187156 DEBUG nova.objects.instance [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lazy-loading 'migration_context' on Instance uuid 8808e65f-ef4e-4f64-ba70-99b9260f5450 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.906 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.906 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:34.907 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.919 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.920 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Ensure instance console log exists: /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.920 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.921 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.921 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.922 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.929 187156 WARNING nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.934 187156 DEBUG nova.virt.libvirt.host [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.935 187156 DEBUG nova.virt.libvirt.host [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.937 187156 DEBUG nova.virt.libvirt.host [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.938 187156 DEBUG nova.virt.libvirt.host [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.939 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.940 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.940 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.941 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.941 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.941 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.941 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.941 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.942 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.942 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.942 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.942 187156 DEBUG nova.virt.hardware [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.946 187156 DEBUG nova.objects.instance [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lazy-loading 'pci_devices' on Instance uuid 8808e65f-ef4e-4f64-ba70-99b9260f5450 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:50:34 np0005539504 nova_compute[187152]: 2025-11-29 06:50:34.971 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <uuid>8808e65f-ef4e-4f64-ba70-99b9260f5450</uuid>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <name>instance-0000000e</name>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-436498485</nova:name>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:50:34</nova:creationTime>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:        <nova:user uuid="7554120578c443aeb4b37d4ac60be1e6">tempest-ServersAdminNegativeTestJSON-633564556-project-member</nova:user>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:        <nova:project uuid="8edc6838ec0a494a86a17e1f5d0d039a">tempest-ServersAdminNegativeTestJSON-633564556</nova:project>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <entry name="serial">8808e65f-ef4e-4f64-ba70-99b9260f5450</entry>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <entry name="uuid">8808e65f-ef4e-4f64-ba70-99b9260f5450</entry>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk.config"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/console.log" append="off"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:50:34 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:50:34 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:50:34 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:50:34 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 01:50:35 np0005539504 nova_compute[187152]: 2025-11-29 06:50:35.455 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:50:35 np0005539504 nova_compute[187152]: 2025-11-29 06:50:35.456 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:50:35 np0005539504 nova_compute[187152]: 2025-11-29 06:50:35.457 187156 INFO nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Using config drive
Nov 29 01:50:35 np0005539504 nova_compute[187152]: 2025-11-29 06:50:35.782 187156 INFO nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Creating config drive at /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk.config
Nov 29 01:50:35 np0005539504 nova_compute[187152]: 2025-11-29 06:50:35.787 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo3t2fca_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:35 np0005539504 nova_compute[187152]: 2025-11-29 06:50:35.917 187156 DEBUG oslo_concurrency.processutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo3t2fca_" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:50:35 np0005539504 systemd-machined[153423]: New machine qemu-9-instance-0000000e.
Nov 29 01:50:36 np0005539504 systemd[1]: Started Virtual Machine qemu-9-instance-0000000e.
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.084 187156 DEBUG nova.network.neutron [-] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.196 187156 INFO nova.compute.manager [-] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Took 1.33 seconds to deallocate network for instance.
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.272 187156 DEBUG nova.compute.manager [req-519213f5-2e04-4336-ab93-5cdc9968ebd9 req-caedc397-b94a-4250-9516-23c29be74096 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-deleted-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.333 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399036.3327875, 8808e65f-ef4e-4f64-ba70-99b9260f5450 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.333 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] VM Resumed (Lifecycle Event)
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.336 187156 DEBUG nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.336 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.346 187156 INFO nova.virt.libvirt.driver [-] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Instance spawned successfully.
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.347 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.352 187156 DEBUG oslo_concurrency.lockutils [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.353 187156 DEBUG oslo_concurrency.lockutils [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.355 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.364 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.369 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.369 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.369 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.370 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.370 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.370 187156 DEBUG nova.virt.libvirt.driver [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.393 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.394 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399036.3356204, 8808e65f-ef4e-4f64-ba70-99b9260f5450 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.394 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] VM Started (Lifecycle Event)
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.419 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.423 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.433 187156 DEBUG nova.compute.provider_tree [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.568 187156 DEBUG nova.scheduler.client.report [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.572 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.578 187156 INFO nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Took 1.99 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.579 187156 DEBUG nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.669 187156 DEBUG oslo_concurrency.lockutils [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.704 187156 INFO nova.scheduler.client.report [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Deleted allocations for instance af865d23-0f24-47aa-aeab-1c12d04b5a1e#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.834 187156 INFO nova.compute.manager [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Took 3.63 seconds to build instance.#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.861 187156 DEBUG oslo_concurrency.lockutils [None req-4bed62b2-8c2e-486b-a2cc-2b4367b2cdde 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "8808e65f-ef4e-4f64-ba70-99b9260f5450" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.995s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.891 187156 DEBUG oslo_concurrency.lockutils [None req-1c4781bd-6d71-4792-b418-7c7d9d0a31f9 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.984 187156 DEBUG nova.compute.manager [req-c06a7cb4-b5ed-4ae5-ad30-4d38a86a8b47 req-24ddd008-9f94-48b7-917b-56f59de4507b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.984 187156 DEBUG oslo_concurrency.lockutils [req-c06a7cb4-b5ed-4ae5-ad30-4d38a86a8b47 req-24ddd008-9f94-48b7-917b-56f59de4507b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.985 187156 DEBUG oslo_concurrency.lockutils [req-c06a7cb4-b5ed-4ae5-ad30-4d38a86a8b47 req-24ddd008-9f94-48b7-917b-56f59de4507b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.985 187156 DEBUG oslo_concurrency.lockutils [req-c06a7cb4-b5ed-4ae5-ad30-4d38a86a8b47 req-24ddd008-9f94-48b7-917b-56f59de4507b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.985 187156 DEBUG nova.compute.manager [req-c06a7cb4-b5ed-4ae5-ad30-4d38a86a8b47 req-24ddd008-9f94-48b7-917b-56f59de4507b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:50:36 np0005539504 nova_compute[187152]: 2025-11-29 06:50:36.985 187156 WARNING nova.compute.manager [req-c06a7cb4-b5ed-4ae5-ad30-4d38a86a8b47 req-24ddd008-9f94-48b7-917b-56f59de4507b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:50:37 np0005539504 nova_compute[187152]: 2025-11-29 06:50:37.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:38 np0005539504 nova_compute[187152]: 2025-11-29 06:50:38.035 187156 DEBUG nova.objects.instance [None req-61026c2e-f428-411b-8c93-7c0d5ff921e9 157c698cf29047c09d6474fbe775a9a6 33a821ace3b64abdbec79a7fdd22e20a - - default default] Lazy-loading 'pci_devices' on Instance uuid 8808e65f-ef4e-4f64-ba70-99b9260f5450 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:38 np0005539504 nova_compute[187152]: 2025-11-29 06:50:38.055 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399038.055121, 8808e65f-ef4e-4f64-ba70-99b9260f5450 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:38 np0005539504 nova_compute[187152]: 2025-11-29 06:50:38.056 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:50:38 np0005539504 nova_compute[187152]: 2025-11-29 06:50:38.074 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:38 np0005539504 nova_compute[187152]: 2025-11-29 06:50:38.078 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:50:38 np0005539504 nova_compute[187152]: 2025-11-29 06:50:38.098 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Nov 29 01:50:38 np0005539504 nova_compute[187152]: 2025-11-29 06:50:38.801 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:38 np0005539504 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Nov 29 01:50:38 np0005539504 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000e.scope: Consumed 2.124s CPU time.
Nov 29 01:50:38 np0005539504 systemd-machined[153423]: Machine qemu-9-instance-0000000e terminated.
Nov 29 01:50:38 np0005539504 nova_compute[187152]: 2025-11-29 06:50:38.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:39 np0005539504 nova_compute[187152]: 2025-11-29 06:50:39.027 187156 DEBUG nova.compute.manager [None req-61026c2e-f428-411b-8c93-7c0d5ff921e9 157c698cf29047c09d6474fbe775a9a6 33a821ace3b64abdbec79a7fdd22e20a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:39 np0005539504 nova_compute[187152]: 2025-11-29 06:50:39.657 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:39 np0005539504 nova_compute[187152]: 2025-11-29 06:50:39.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:39 np0005539504 nova_compute[187152]: 2025-11-29 06:50:39.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:50:39 np0005539504 nova_compute[187152]: 2025-11-29 06:50:39.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:50:40 np0005539504 nova_compute[187152]: 2025-11-29 06:50:40.217 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:40 np0005539504 nova_compute[187152]: 2025-11-29 06:50:40.377 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-8808e65f-ef4e-4f64-ba70-99b9260f5450" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:40 np0005539504 nova_compute[187152]: 2025-11-29 06:50:40.378 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-8808e65f-ef4e-4f64-ba70-99b9260f5450" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:40 np0005539504 nova_compute[187152]: 2025-11-29 06:50:40.378 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:50:40 np0005539504 nova_compute[187152]: 2025-11-29 06:50:40.379 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8808e65f-ef4e-4f64-ba70-99b9260f5450 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:40 np0005539504 nova_compute[187152]: 2025-11-29 06:50:40.626 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.214 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.229 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-8808e65f-ef4e-4f64-ba70-99b9260f5450" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.230 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.230 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.230 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.250 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.250 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.250 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.251 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.318 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:41 np0005539504 podman[215787]: 2025-11-29 06:50:41.368294282 +0000 UTC m=+0.081556657 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.375 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.377 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.434 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.627 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.629 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5693MB free_disk=73.24427795410156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.629 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.630 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.750 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 8808e65f-ef4e-4f64-ba70-99b9260f5450 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.751 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.751 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.807 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.820 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.852 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:50:41 np0005539504 nova_compute[187152]: 2025-11-29 06:50:41.852 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:42 np0005539504 nova_compute[187152]: 2025-11-29 06:50:42.559 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:42 np0005539504 nova_compute[187152]: 2025-11-29 06:50:42.561 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:50:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:50:42.909 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.450 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "8808e65f-ef4e-4f64-ba70-99b9260f5450" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.451 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "8808e65f-ef4e-4f64-ba70-99b9260f5450" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.452 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "8808e65f-ef4e-4f64-ba70-99b9260f5450-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.452 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "8808e65f-ef4e-4f64-ba70-99b9260f5450-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.452 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "8808e65f-ef4e-4f64-ba70-99b9260f5450-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.464 187156 INFO nova.compute.manager [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Terminating instance#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.476 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "refresh_cache-8808e65f-ef4e-4f64-ba70-99b9260f5450" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.477 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquired lock "refresh_cache-8808e65f-ef4e-4f64-ba70-99b9260f5450" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.477 187156 DEBUG nova.network.neutron [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.709 187156 DEBUG nova.network.neutron [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:43 np0005539504 nova_compute[187152]: 2025-11-29 06:50:43.802 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.451 187156 DEBUG nova.network.neutron [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.513 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Releasing lock "refresh_cache-8808e65f-ef4e-4f64-ba70-99b9260f5450" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.514 187156 DEBUG nova.compute.manager [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.524 187156 INFO nova.virt.libvirt.driver [-] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Instance destroyed successfully.#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.525 187156 DEBUG nova.objects.instance [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lazy-loading 'resources' on Instance uuid 8808e65f-ef4e-4f64-ba70-99b9260f5450 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.627 187156 INFO nova.virt.libvirt.driver [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Deleting instance files /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450_del#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.628 187156 INFO nova.virt.libvirt.driver [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Deletion of /var/lib/nova/instances/8808e65f-ef4e-4f64-ba70-99b9260f5450_del complete#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.659 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:44 np0005539504 podman[215812]: 2025-11-29 06:50:44.710603218 +0000 UTC m=+0.056637764 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:50:44 np0005539504 podman[215813]: 2025-11-29 06:50:44.717130064 +0000 UTC m=+0.057196839 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9)
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.738 187156 INFO nova.compute.manager [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Took 0.22 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.738 187156 DEBUG oslo.service.loopingcall [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.738 187156 DEBUG nova.compute.manager [-] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:50:44 np0005539504 nova_compute[187152]: 2025-11-29 06:50:44.738 187156 DEBUG nova.network.neutron [-] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.100 187156 DEBUG nova.network.neutron [-] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.115 187156 DEBUG nova.network.neutron [-] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.129 187156 INFO nova.compute.manager [-] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Took 0.39 seconds to deallocate network for instance.#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.224 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.225 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.281 187156 DEBUG nova.compute.provider_tree [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.300 187156 DEBUG nova.scheduler.client.report [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.387 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.422 187156 INFO nova.scheduler.client.report [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Deleted allocations for instance 8808e65f-ef4e-4f64-ba70-99b9260f5450#033[00m
Nov 29 01:50:45 np0005539504 nova_compute[187152]: 2025-11-29 06:50:45.515 187156 DEBUG oslo_concurrency.lockutils [None req-97988f07-c781-4b4f-838f-127aab0010ab 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "8808e65f-ef4e-4f64-ba70-99b9260f5450" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:46 np0005539504 nova_compute[187152]: 2025-11-29 06:50:46.509 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "0b850e95-2727-4d2f-afa1-7a755670a387" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:46 np0005539504 nova_compute[187152]: 2025-11-29 06:50:46.510 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:46 np0005539504 nova_compute[187152]: 2025-11-29 06:50:46.558 187156 DEBUG nova.compute.manager [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:50:46 np0005539504 nova_compute[187152]: 2025-11-29 06:50:46.674 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:46 np0005539504 nova_compute[187152]: 2025-11-29 06:50:46.674 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:46 np0005539504 nova_compute[187152]: 2025-11-29 06:50:46.681 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:50:46 np0005539504 nova_compute[187152]: 2025-11-29 06:50:46.681 187156 INFO nova.compute.claims [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:50:46 np0005539504 nova_compute[187152]: 2025-11-29 06:50:46.899 187156 DEBUG nova.compute.provider_tree [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.031 187156 DEBUG nova.scheduler.client.report [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.245 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.246 187156 DEBUG nova.compute.manager [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.307 187156 DEBUG nova.compute.manager [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.324 187156 INFO nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.342 187156 DEBUG nova.compute.manager [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.450 187156 DEBUG nova.compute.manager [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.451 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.451 187156 INFO nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Creating image(s)#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.452 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.452 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.453 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.465 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.519 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.520 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.521 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.531 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.594 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.595 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.633 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.634 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.634 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.686 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.688 187156 DEBUG nova.virt.disk.api [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Checking if we can resize image /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.688 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.749 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.751 187156 DEBUG nova.virt.disk.api [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Cannot resize image /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.752 187156 DEBUG nova.objects.instance [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lazy-loading 'migration_context' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.769 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.770 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Ensure instance console log exists: /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.770 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.771 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.771 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.773 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.780 187156 WARNING nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.787 187156 DEBUG nova.virt.libvirt.host [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.788 187156 DEBUG nova.virt.libvirt.host [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.792 187156 DEBUG nova.virt.libvirt.host [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.793 187156 DEBUG nova.virt.libvirt.host [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.794 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.794 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.795 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.795 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.795 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.796 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.796 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.796 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.797 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.797 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.797 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.797 187156 DEBUG nova.virt.hardware [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.803 187156 DEBUG nova.objects.instance [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.825 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <uuid>0b850e95-2727-4d2f-afa1-7a755670a387</uuid>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <name>instance-00000010</name>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-188422497</nova:name>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:50:47</nova:creationTime>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:        <nova:user uuid="9399c90511c44462b8092380bad3cfdc">tempest-UnshelveToHostMultiNodesTest-1888846715-project-member</nova:user>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:        <nova:project uuid="2ad5553710d5496dafe785396586bef5">tempest-UnshelveToHostMultiNodesTest-1888846715</nova:project>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <entry name="serial">0b850e95-2727-4d2f-afa1-7a755670a387</entry>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <entry name="uuid">0b850e95-2727-4d2f-afa1-7a755670a387</entry>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/console.log" append="off"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:50:47 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:50:47 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:50:47 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:50:47 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.905 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.905 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:50:47 np0005539504 nova_compute[187152]: 2025-11-29 06:50:47.906 187156 INFO nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Using config drive#033[00m
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.955 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0b850e95-2727-4d2f-afa1-7a755670a387', 'name': 'tempest-UnshelveToHostMultiNodesTest-server-188422497', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000010', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '2ad5553710d5496dafe785396586bef5', 'user_id': '9399c90511c44462b8092380bad3cfdc', 'hostId': 'aa598bf899b95d3bf984ba7ab9f0926b0df688fa9d17c3ee139209da', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.957 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.958 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.959 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.959 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.960 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.960 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.961 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.962 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.963 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.963 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.964 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.964 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.964 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.964 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-188422497>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-188422497>]
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.965 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.966 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.966 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.967 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.967 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.967 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.968 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.970 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.971 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.971 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.972 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.972 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.972 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.973 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.973 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.973 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.973 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-188422497>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-188422497>]
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.974 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.974 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.974 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.975 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.976 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.976 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.977 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.977 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.977 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-188422497>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-188422497>]
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.978 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.979 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.979 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-188422497>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-188422497>]
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.979 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.981 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 01:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:50:47.981 12 DEBUG ceilometer.compute.pollsters [-] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000010, id=0b850e95-2727-4d2f-afa1-7a755670a387>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.138 187156 INFO nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Creating config drive at /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.144 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphyky1jyc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.269 187156 DEBUG oslo_concurrency.processutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphyky1jyc" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:50:48 np0005539504 systemd-machined[153423]: New machine qemu-10-instance-00000010.
Nov 29 01:50:48 np0005539504 systemd[1]: Started Virtual Machine qemu-10-instance-00000010.
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.790 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399048.7896512, 0b850e95-2727-4d2f-afa1-7a755670a387 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.791 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] VM Resumed (Lifecycle Event)
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.794 187156 DEBUG nova.compute.manager [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.794 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.797 187156 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance spawned successfully.
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.797 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.914 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.916 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.920 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.922 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.923 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.923 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.924 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.924 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.924 187156 DEBUG nova.virt.libvirt.driver [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.952 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.953 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399048.7907798, 0b850e95-2727-4d2f-afa1-7a755670a387 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.953 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] VM Started (Lifecycle Event)
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.994 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:48 np0005539504 nova_compute[187152]: 2025-11-29 06:50:48.996 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:50:49 np0005539504 nova_compute[187152]: 2025-11-29 06:50:49.025 187156 INFO nova.compute.manager [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Took 1.57 seconds to spawn the instance on the hypervisor.
Nov 29 01:50:49 np0005539504 nova_compute[187152]: 2025-11-29 06:50:49.026 187156 DEBUG nova.compute.manager [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:49 np0005539504 nova_compute[187152]: 2025-11-29 06:50:49.028 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:50:49 np0005539504 nova_compute[187152]: 2025-11-29 06:50:49.142 187156 INFO nova.compute.manager [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Took 2.51 seconds to build instance.
Nov 29 01:50:49 np0005539504 nova_compute[187152]: 2025-11-29 06:50:49.167 187156 DEBUG oslo_concurrency.lockutils [None req-21769f52-0a3c-457b-b06a-d952ecab2fd2 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:50:49 np0005539504 nova_compute[187152]: 2025-11-29 06:50:49.630 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399034.6192973, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:50:49 np0005539504 nova_compute[187152]: 2025-11-29 06:50:49.630 187156 INFO nova.compute.manager [-] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Stopped (Lifecycle Event)
Nov 29 01:50:49 np0005539504 nova_compute[187152]: 2025-11-29 06:50:49.654 187156 DEBUG nova.compute.manager [None req-73bc42f5-4c7f-4016-bf12-192f6a632415 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:49 np0005539504 nova_compute[187152]: 2025-11-29 06:50:49.660 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:53 np0005539504 nova_compute[187152]: 2025-11-29 06:50:53.914 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:54 np0005539504 nova_compute[187152]: 2025-11-29 06:50:54.029 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399039.0287175, 8808e65f-ef4e-4f64-ba70-99b9260f5450 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:50:54 np0005539504 nova_compute[187152]: 2025-11-29 06:50:54.030 187156 INFO nova.compute.manager [-] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] VM Stopped (Lifecycle Event)
Nov 29 01:50:54 np0005539504 nova_compute[187152]: 2025-11-29 06:50:54.658 187156 DEBUG oslo_concurrency.lockutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "0b850e95-2727-4d2f-afa1-7a755670a387" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:50:54 np0005539504 nova_compute[187152]: 2025-11-29 06:50:54.659 187156 DEBUG oslo_concurrency.lockutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:50:54 np0005539504 nova_compute[187152]: 2025-11-29 06:50:54.659 187156 INFO nova.compute.manager [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Shelving
Nov 29 01:50:54 np0005539504 nova_compute[187152]: 2025-11-29 06:50:54.662 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:54 np0005539504 podman[215893]: 2025-11-29 06:50:54.742544055 +0000 UTC m=+0.086619663 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:50:54 np0005539504 podman[215894]: 2025-11-29 06:50:54.768747414 +0000 UTC m=+0.108785493 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:50:54 np0005539504 nova_compute[187152]: 2025-11-29 06:50:54.987 187156 DEBUG nova.compute.manager [None req-cd0f9adf-d776-42a4-a395-6b62c5b3422e - - - - - -] [instance: 8808e65f-ef4e-4f64-ba70-99b9260f5450] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:55 np0005539504 nova_compute[187152]: 2025-11-29 06:50:55.949 187156 DEBUG nova.virt.libvirt.driver [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 01:50:58 np0005539504 podman[215939]: 2025-11-29 06:50:58.736247614 +0000 UTC m=+0.076553382 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:50:58 np0005539504 nova_compute[187152]: 2025-11-29 06:50:58.916 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:59 np0005539504 nova_compute[187152]: 2025-11-29 06:50:59.665 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:03 np0005539504 nova_compute[187152]: 2025-11-29 06:51:03.974 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:04 np0005539504 nova_compute[187152]: 2025-11-29 06:51:04.667 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:04 np0005539504 podman[215971]: 2025-11-29 06:51:04.737931071 +0000 UTC m=+0.080670682 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:51:05 np0005539504 nova_compute[187152]: 2025-11-29 06:51:05.995 187156 DEBUG nova.virt.libvirt.driver [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 01:51:08 np0005539504 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 29 01:51:08 np0005539504 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Consumed 13.809s CPU time.
Nov 29 01:51:08 np0005539504 systemd-machined[153423]: Machine qemu-10-instance-00000010 terminated.
Nov 29 01:51:08 np0005539504 nova_compute[187152]: 2025-11-29 06:51:08.972 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:09 np0005539504 nova_compute[187152]: 2025-11-29 06:51:09.011 187156 INFO nova.virt.libvirt.driver [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance shutdown successfully after 13 seconds.
Nov 29 01:51:09 np0005539504 nova_compute[187152]: 2025-11-29 06:51:09.017 187156 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance destroyed successfully.
Nov 29 01:51:09 np0005539504 nova_compute[187152]: 2025-11-29 06:51:09.018 187156 DEBUG nova.objects.instance [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lazy-loading 'numa_topology' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:51:09 np0005539504 nova_compute[187152]: 2025-11-29 06:51:09.507 187156 INFO nova.virt.libvirt.driver [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Beginning cold snapshot process
Nov 29 01:51:09 np0005539504 nova_compute[187152]: 2025-11-29 06:51:09.710 187156 DEBUG nova.privsep.utils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 01:51:09 np0005539504 nova_compute[187152]: 2025-11-29 06:51:09.711 187156 DEBUG oslo_concurrency.processutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk /var/lib/nova/instances/snapshots/tmpvri_x2gq/8d337681994d4c21b08af9df80d1a63b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:51:09 np0005539504 nova_compute[187152]: 2025-11-29 06:51:09.728 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:10 np0005539504 nova_compute[187152]: 2025-11-29 06:51:10.165 187156 DEBUG oslo_concurrency.processutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk /var/lib/nova/instances/snapshots/tmpvri_x2gq/8d337681994d4c21b08af9df80d1a63b" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:51:10 np0005539504 nova_compute[187152]: 2025-11-29 06:51:10.166 187156 INFO nova.virt.libvirt.driver [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Snapshot extracted, beginning image upload
Nov 29 01:51:11 np0005539504 podman[216008]: 2025-11-29 06:51:11.738588154 +0000 UTC m=+0.081634839 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:51:12 np0005539504 nova_compute[187152]: 2025-11-29 06:51:12.649 187156 INFO nova.virt.libvirt.driver [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Snapshot image upload complete
Nov 29 01:51:12 np0005539504 nova_compute[187152]: 2025-11-29 06:51:12.650 187156 DEBUG nova.compute.manager [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:51:12 np0005539504 nova_compute[187152]: 2025-11-29 06:51:12.787 187156 INFO nova.compute.manager [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Shelve offloading
Nov 29 01:51:12 np0005539504 nova_compute[187152]: 2025-11-29 06:51:12.808 187156 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance destroyed successfully.
Nov 29 01:51:12 np0005539504 nova_compute[187152]: 2025-11-29 06:51:12.809 187156 DEBUG nova.compute.manager [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:51:12 np0005539504 nova_compute[187152]: 2025-11-29 06:51:12.813 187156 DEBUG oslo_concurrency.lockutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:51:12 np0005539504 nova_compute[187152]: 2025-11-29 06:51:12.813 187156 DEBUG oslo_concurrency.lockutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquired lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:51:12 np0005539504 nova_compute[187152]: 2025-11-29 06:51:12.814 187156 DEBUG nova.network.neutron [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.106 187156 DEBUG nova.network.neutron [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.429 187156 DEBUG nova.network.neutron [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.450 187156 DEBUG oslo_concurrency.lockutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Releasing lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.456 187156 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance destroyed successfully.
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.457 187156 DEBUG nova.objects.instance [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lazy-loading 'resources' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.470 187156 INFO nova.virt.libvirt.driver [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Deleting instance files /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387_del
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.476 187156 INFO nova.virt.libvirt.driver [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Deletion of /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387_del complete
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.611 187156 INFO nova.scheduler.client.report [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Deleted allocations for instance 0b850e95-2727-4d2f-afa1-7a755670a387
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.678 187156 DEBUG oslo_concurrency.lockutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.679 187156 DEBUG oslo_concurrency.lockutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.707 187156 DEBUG nova.compute.provider_tree [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.721 187156 DEBUG nova.scheduler.client.report [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.742 187156 DEBUG oslo_concurrency.lockutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.816 187156 DEBUG oslo_concurrency.lockutils [None req-576352fe-0aff-48b3-b0fe-ecc796431a28 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:51:13 np0005539504 nova_compute[187152]: 2025-11-29 06:51:13.974 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:14 np0005539504 nova_compute[187152]: 2025-11-29 06:51:14.733 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:15 np0005539504 ovn_controller[95182]: 2025-11-29T06:51:15Z|00055|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Nov 29 01:51:15 np0005539504 podman[216027]: 2025-11-29 06:51:15.721961402 +0000 UTC m=+0.064264990 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:51:15 np0005539504 podman[216028]: 2025-11-29 06:51:15.750516995 +0000 UTC m=+0.087592861 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Nov 29 01:51:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:51:16.155 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:51:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:51:16.156 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 01:51:16 np0005539504 nova_compute[187152]: 2025-11-29 06:51:16.156 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:51:16.164 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:51:17 np0005539504 nova_compute[187152]: 2025-11-29 06:51:17.813 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "0b850e95-2727-4d2f-afa1-7a755670a387" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:51:17 np0005539504 nova_compute[187152]: 2025-11-29 06:51:17.814 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:51:17 np0005539504 nova_compute[187152]: 2025-11-29 06:51:17.814 187156 INFO nova.compute.manager [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Unshelving
Nov 29 01:51:17 np0005539504 nova_compute[187152]: 2025-11-29 06:51:17.945 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:51:17 np0005539504 nova_compute[187152]: 2025-11-29 06:51:17.946 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:51:17 np0005539504 nova_compute[187152]: 2025-11-29 06:51:17.954 187156 DEBUG nova.objects.instance [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:51:17 np0005539504 nova_compute[187152]: 2025-11-29 06:51:17.976 187156 DEBUG nova.objects.instance [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'numa_topology' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.006 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.007 187156 INFO nova.compute.claims [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Claim successful on node compute-1.ctlplane.example.com
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.168 187156 DEBUG nova.compute.provider_tree [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.186 187156 DEBUG nova.scheduler.client.report [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.216 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.401 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.401 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquired lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.402 187156 DEBUG nova.network.neutron [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.623 187156 DEBUG nova.network.neutron [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 01:51:18 np0005539504 nova_compute[187152]: 2025-11-29 06:51:18.978 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.322 187156 DEBUG nova.network.neutron [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.339 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Releasing lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.341 187156 DEBUG nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.341 187156 INFO nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Creating image(s)
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.342 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.342 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.343 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.343 187156 DEBUG nova.objects.instance [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.355 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.356 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:51:19 np0005539504 nova_compute[187152]: 2025-11-29 06:51:19.737 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:51:21 np0005539504 nova_compute[187152]: 2025-11-29 06:51:21.914 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:51:21 np0005539504 nova_compute[187152]: 2025-11-29 06:51:21.997 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3.part --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:21 np0005539504 nova_compute[187152]: 2025-11-29 06:51:21.998 187156 DEBUG nova.virt.images [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] 7c1a1714-9f7e-48f5-a5a0-571ca9c7d659 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 01:51:22 np0005539504 nova_compute[187152]: 2025-11-29 06:51:21.999 187156 DEBUG nova.privsep.utils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:51:22 np0005539504 nova_compute[187152]: 2025-11-29 06:51:22.000 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3.part /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:51:22.906 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:51:22.906 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:51:22.906 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.236 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3.part /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3.converted" returned: 0 in 1.237s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.248 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.305 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3.converted --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.307 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.338 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.403 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.405 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.406 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.429 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.452 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399068.428726, 0b850e95-2727-4d2f-afa1-7a755670a387 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.453 187156 INFO nova.compute.manager [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.509 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.511 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3,backing_fmt=raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.530 187156 DEBUG nova.compute.manager [None req-18f988df-9fc6-4ed7-9f2f-15a47c7e013c - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.581 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3,backing_fmt=raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk 1073741824" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.582 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.583 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.647 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.650 187156 DEBUG nova.objects.instance [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'migration_context' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.685 187156 INFO nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Rebasing disk image.#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.687 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.748 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.749 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:23 np0005539504 nova_compute[187152]: 2025-11-29 06:51:23.980 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:24 np0005539504 nova_compute[187152]: 2025-11-29 06:51:24.812 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:25 np0005539504 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 01:51:25 np0005539504 podman[216103]: 2025-11-29 06:51:25.73440932 +0000 UTC m=+0.227613736 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:51:25 np0005539504 podman[216104]: 2025-11-29 06:51:25.77323586 +0000 UTC m=+0.214390518 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 01:51:28 np0005539504 nova_compute[187152]: 2025-11-29 06:51:28.982 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.427 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk" returned: 0 in 5.678s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.428 187156 DEBUG nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.429 187156 DEBUG nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Ensure instance console log exists: /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.429 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.430 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.430 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.432 187156 DEBUG nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='9d86131dcb9f95800ee04995bcb37e54',container_format='bare',created_at=2025-11-29T06:50:52Z,direct_url=<?>,disk_format='qcow2',id=7c1a1714-9f7e-48f5-a5a0-571ca9c7d659,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-188422497-shelved',owner='2ad5553710d5496dafe785396586bef5',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-11-29T06:51:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.438 187156 WARNING nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.445 187156 DEBUG nova.virt.libvirt.host [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.447 187156 DEBUG nova.virt.libvirt.host [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.450 187156 DEBUG nova.virt.libvirt.host [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.450 187156 DEBUG nova.virt.libvirt.host [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.452 187156 DEBUG nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.453 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='9d86131dcb9f95800ee04995bcb37e54',container_format='bare',created_at=2025-11-29T06:50:52Z,direct_url=<?>,disk_format='qcow2',id=7c1a1714-9f7e-48f5-a5a0-571ca9c7d659,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-188422497-shelved',owner='2ad5553710d5496dafe785396586bef5',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-11-29T06:51:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.453 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.454 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.454 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.454 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.455 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.455 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.455 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.456 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.456 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.456 187156 DEBUG nova.virt.hardware [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.457 187156 DEBUG nova.objects.instance [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.549 187156 DEBUG nova.objects.instance [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.568 187156 DEBUG nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <uuid>0b850e95-2727-4d2f-afa1-7a755670a387</uuid>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <name>instance-00000010</name>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-188422497</nova:name>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:51:29</nova:creationTime>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:        <nova:user uuid="9399c90511c44462b8092380bad3cfdc">tempest-UnshelveToHostMultiNodesTest-1888846715-project-member</nova:user>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:        <nova:project uuid="2ad5553710d5496dafe785396586bef5">tempest-UnshelveToHostMultiNodesTest-1888846715</nova:project>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="7c1a1714-9f7e-48f5-a5a0-571ca9c7d659"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <entry name="serial">0b850e95-2727-4d2f-afa1-7a755670a387</entry>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <entry name="uuid">0b850e95-2727-4d2f-afa1-7a755670a387</entry>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/console.log" append="off"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:51:29 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:51:29 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:51:29 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:51:29 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:51:29 np0005539504 podman[216157]: 2025-11-29 06:51:29.663228183 +0000 UTC m=+0.062630205 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.807 187156 DEBUG nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.808 187156 DEBUG nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.809 187156 INFO nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Using config drive#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.815 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:29 np0005539504 nova_compute[187152]: 2025-11-29 06:51:29.850 187156 DEBUG nova.objects.instance [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:30 np0005539504 nova_compute[187152]: 2025-11-29 06:51:30.050 187156 DEBUG nova.objects.instance [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'keypairs' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:30 np0005539504 nova_compute[187152]: 2025-11-29 06:51:30.501 187156 INFO nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Creating config drive at /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config#033[00m
Nov 29 01:51:30 np0005539504 nova_compute[187152]: 2025-11-29 06:51:30.507 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkpcopupd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:30 np0005539504 nova_compute[187152]: 2025-11-29 06:51:30.635 187156 DEBUG oslo_concurrency.processutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkpcopupd" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:30 np0005539504 systemd-machined[153423]: New machine qemu-11-instance-00000010.
Nov 29 01:51:30 np0005539504 systemd[1]: Started Virtual Machine qemu-11-instance-00000010.
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.164 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399091.163178, 0b850e95-2727-4d2f-afa1-7a755670a387 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.166 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.168 187156 DEBUG nova.compute.manager [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.168 187156 DEBUG nova.virt.libvirt.driver [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.172 187156 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance spawned successfully.#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.219 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.222 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.275 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.275 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399091.1652417, 0b850e95-2727-4d2f-afa1-7a755670a387 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.276 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] VM Started (Lifecycle Event)#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.339 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.344 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:51:31 np0005539504 nova_compute[187152]: 2025-11-29 06:51:31.457 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:51:32 np0005539504 nova_compute[187152]: 2025-11-29 06:51:32.444 187156 DEBUG nova.compute.manager [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:32 np0005539504 nova_compute[187152]: 2025-11-29 06:51:32.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:32 np0005539504 nova_compute[187152]: 2025-11-29 06:51:32.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 01:51:33 np0005539504 nova_compute[187152]: 2025-11-29 06:51:33.270 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 01:51:33 np0005539504 nova_compute[187152]: 2025-11-29 06:51:33.312 187156 DEBUG oslo_concurrency.lockutils [None req-c9beaeb2-283d-44cd-8dd9-a7f441c8b45b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 15.498s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:33 np0005539504 nova_compute[187152]: 2025-11-29 06:51:33.985 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:34 np0005539504 nova_compute[187152]: 2025-11-29 06:51:34.816 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:35 np0005539504 podman[216205]: 2025-11-29 06:51:35.763666063 +0000 UTC m=+0.091796543 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:51:36 np0005539504 nova_compute[187152]: 2025-11-29 06:51:36.270 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:36 np0005539504 nova_compute[187152]: 2025-11-29 06:51:36.538 187156 DEBUG oslo_concurrency.lockutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "0b850e95-2727-4d2f-afa1-7a755670a387" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:36 np0005539504 nova_compute[187152]: 2025-11-29 06:51:36.539 187156 DEBUG oslo_concurrency.lockutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:36 np0005539504 nova_compute[187152]: 2025-11-29 06:51:36.539 187156 INFO nova.compute.manager [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Shelving#033[00m
Nov 29 01:51:36 np0005539504 nova_compute[187152]: 2025-11-29 06:51:36.661 187156 DEBUG nova.virt.libvirt.driver [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 01:51:37 np0005539504 nova_compute[187152]: 2025-11-29 06:51:37.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:37 np0005539504 nova_compute[187152]: 2025-11-29 06:51:37.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:37 np0005539504 nova_compute[187152]: 2025-11-29 06:51:37.980 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:38 np0005539504 nova_compute[187152]: 2025-11-29 06:51:38.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:38 np0005539504 nova_compute[187152]: 2025-11-29 06:51:38.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:38 np0005539504 nova_compute[187152]: 2025-11-29 06:51:38.986 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:39 np0005539504 nova_compute[187152]: 2025-11-29 06:51:39.825 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:39 np0005539504 nova_compute[187152]: 2025-11-29 06:51:39.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:39 np0005539504 nova_compute[187152]: 2025-11-29 06:51:39.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:39 np0005539504 nova_compute[187152]: 2025-11-29 06:51:39.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:39 np0005539504 nova_compute[187152]: 2025-11-29 06:51:39.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:39 np0005539504 nova_compute[187152]: 2025-11-29 06:51:39.961 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.060 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.148 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.149 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.213 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.392 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.394 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5684MB free_disk=73.21159744262695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.394 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.394 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.784 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 0b850e95-2727-4d2f-afa1-7a755670a387 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.784 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:51:40 np0005539504 nova_compute[187152]: 2025-11-29 06:51:40.785 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:51:41 np0005539504 nova_compute[187152]: 2025-11-29 06:51:41.058 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:41 np0005539504 nova_compute[187152]: 2025-11-29 06:51:41.077 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:41 np0005539504 nova_compute[187152]: 2025-11-29 06:51:41.103 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:51:41 np0005539504 nova_compute[187152]: 2025-11-29 06:51:41.103 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:41 np0005539504 nova_compute[187152]: 2025-11-29 06:51:41.104 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:41 np0005539504 nova_compute[187152]: 2025-11-29 06:51:41.947 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:41 np0005539504 nova_compute[187152]: 2025-11-29 06:51:41.948 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:51:41 np0005539504 nova_compute[187152]: 2025-11-29 06:51:41.948 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:51:42 np0005539504 podman[216235]: 2025-11-29 06:51:42.762000613 +0000 UTC m=+0.077823486 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 01:51:42 np0005539504 nova_compute[187152]: 2025-11-29 06:51:42.840 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:51:42 np0005539504 nova_compute[187152]: 2025-11-29 06:51:42.840 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:51:42 np0005539504 nova_compute[187152]: 2025-11-29 06:51:42.840 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:51:42 np0005539504 nova_compute[187152]: 2025-11-29 06:51:42.841 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.253 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.657 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.679 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.679 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.680 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.680 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.681 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.681 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.681 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 01:51:43 np0005539504 nova_compute[187152]: 2025-11-29 06:51:43.988 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:44 np0005539504 nova_compute[187152]: 2025-11-29 06:51:44.829 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:46 np0005539504 nova_compute[187152]: 2025-11-29 06:51:46.707 187156 DEBUG nova.virt.libvirt.driver [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 01:51:46 np0005539504 podman[216264]: 2025-11-29 06:51:46.734028304 +0000 UTC m=+0.080177479 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:51:46 np0005539504 podman[216265]: 2025-11-29 06:51:46.74607504 +0000 UTC m=+0.086231083 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm)
Nov 29 01:51:48 np0005539504 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 29 01:51:48 np0005539504 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000010.scope: Consumed 14.167s CPU time.
Nov 29 01:51:48 np0005539504 systemd-machined[153423]: Machine qemu-11-instance-00000010 terminated.
Nov 29 01:51:48 np0005539504 nova_compute[187152]: 2025-11-29 06:51:48.988 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:49 np0005539504 nova_compute[187152]: 2025-11-29 06:51:49.721 187156 INFO nova.virt.libvirt.driver [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance shutdown successfully after 13 seconds.#033[00m
Nov 29 01:51:49 np0005539504 nova_compute[187152]: 2025-11-29 06:51:49.729 187156 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance destroyed successfully.#033[00m
Nov 29 01:51:49 np0005539504 nova_compute[187152]: 2025-11-29 06:51:49.730 187156 DEBUG nova.objects.instance [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lazy-loading 'numa_topology' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:49 np0005539504 nova_compute[187152]: 2025-11-29 06:51:49.831 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:50 np0005539504 nova_compute[187152]: 2025-11-29 06:51:50.110 187156 INFO nova.virt.libvirt.driver [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Beginning cold snapshot process#033[00m
Nov 29 01:51:50 np0005539504 nova_compute[187152]: 2025-11-29 06:51:50.450 187156 DEBUG nova.privsep.utils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:51:50 np0005539504 nova_compute[187152]: 2025-11-29 06:51:50.451 187156 DEBUG oslo_concurrency.processutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk /var/lib/nova/instances/snapshots/tmpvvaczdh6/f07002aa9bfa4f5fabf44cf2adca2b83 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:51 np0005539504 nova_compute[187152]: 2025-11-29 06:51:51.030 187156 DEBUG oslo_concurrency.processutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk /var/lib/nova/instances/snapshots/tmpvvaczdh6/f07002aa9bfa4f5fabf44cf2adca2b83" returned: 0 in 0.579s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:51 np0005539504 nova_compute[187152]: 2025-11-29 06:51:51.031 187156 INFO nova.virt.libvirt.driver [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Snapshot extracted, beginning image upload#033[00m
Nov 29 01:51:53 np0005539504 nova_compute[187152]: 2025-11-29 06:51:53.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:54 np0005539504 nova_compute[187152]: 2025-11-29 06:51:54.877 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:55 np0005539504 nova_compute[187152]: 2025-11-29 06:51:55.651 187156 INFO nova.virt.libvirt.driver [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Snapshot image upload complete#033[00m
Nov 29 01:51:55 np0005539504 nova_compute[187152]: 2025-11-29 06:51:55.651 187156 DEBUG nova.compute.manager [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:56 np0005539504 nova_compute[187152]: 2025-11-29 06:51:56.091 187156 INFO nova.compute.manager [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Shelve offloading#033[00m
Nov 29 01:51:56 np0005539504 nova_compute[187152]: 2025-11-29 06:51:56.109 187156 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance destroyed successfully.#033[00m
Nov 29 01:51:56 np0005539504 nova_compute[187152]: 2025-11-29 06:51:56.110 187156 DEBUG nova.compute.manager [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:56 np0005539504 nova_compute[187152]: 2025-11-29 06:51:56.113 187156 DEBUG oslo_concurrency.lockutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:51:56 np0005539504 nova_compute[187152]: 2025-11-29 06:51:56.113 187156 DEBUG oslo_concurrency.lockutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquired lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:51:56 np0005539504 nova_compute[187152]: 2025-11-29 06:51:56.113 187156 DEBUG nova.network.neutron [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:51:56 np0005539504 nova_compute[187152]: 2025-11-29 06:51:56.445 187156 DEBUG nova.network.neutron [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:51:56 np0005539504 podman[216324]: 2025-11-29 06:51:56.710228213 +0000 UTC m=+0.050624251 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:51:56 np0005539504 podman[216325]: 2025-11-29 06:51:56.748119167 +0000 UTC m=+0.090887119 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.394 187156 DEBUG nova.network.neutron [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.414 187156 DEBUG oslo_concurrency.lockutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Releasing lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.421 187156 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance destroyed successfully.#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.422 187156 DEBUG nova.objects.instance [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lazy-loading 'resources' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.433 187156 INFO nova.virt.libvirt.driver [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Deleting instance files /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387_del#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.439 187156 INFO nova.virt.libvirt.driver [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Deletion of /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387_del complete#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.595 187156 INFO nova.scheduler.client.report [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Deleted allocations for instance 0b850e95-2727-4d2f-afa1-7a755670a387#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.691 187156 DEBUG oslo_concurrency.lockutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.692 187156 DEBUG oslo_concurrency.lockutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.733 187156 DEBUG nova.compute.provider_tree [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.752 187156 DEBUG nova.scheduler.client.report [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.777 187156 DEBUG oslo_concurrency.lockutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:57 np0005539504 nova_compute[187152]: 2025-11-29 06:51:57.890 187156 DEBUG oslo_concurrency.lockutils [None req-cf9bbb93-6e64-4721-96d1-29f2e5a324cb 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 21.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:58 np0005539504 nova_compute[187152]: 2025-11-29 06:51:58.992 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:59 np0005539504 nova_compute[187152]: 2025-11-29 06:51:59.880 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:00 np0005539504 podman[216372]: 2025-11-29 06:52:00.748225148 +0000 UTC m=+0.092150392 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:52:03 np0005539504 nova_compute[187152]: 2025-11-29 06:52:03.994 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:04 np0005539504 nova_compute[187152]: 2025-11-29 06:52:04.169 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399109.1678367, 0b850e95-2727-4d2f-afa1-7a755670a387 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:04 np0005539504 nova_compute[187152]: 2025-11-29 06:52:04.170 187156 INFO nova.compute.manager [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:52:04 np0005539504 nova_compute[187152]: 2025-11-29 06:52:04.202 187156 DEBUG nova.compute.manager [None req-8bc54ea7-70a1-4871-b856-3f6e759c67a0 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:04 np0005539504 nova_compute[187152]: 2025-11-29 06:52:04.882 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:06 np0005539504 podman[216393]: 2025-11-29 06:52:06.732008401 +0000 UTC m=+0.080577879 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 01:52:08 np0005539504 nova_compute[187152]: 2025-11-29 06:52:08.996 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:09 np0005539504 nova_compute[187152]: 2025-11-29 06:52:09.884 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:13 np0005539504 podman[216416]: 2025-11-29 06:52:13.710585597 +0000 UTC m=+0.057306262 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:52:14 np0005539504 nova_compute[187152]: 2025-11-29 06:52:14.046 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:14 np0005539504 nova_compute[187152]: 2025-11-29 06:52:14.886 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:17.483 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:52:17 np0005539504 nova_compute[187152]: 2025-11-29 06:52:17.483 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:17.484 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:52:17 np0005539504 podman[216435]: 2025-11-29 06:52:17.732252551 +0000 UTC m=+0.079099962 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:52:17 np0005539504 podman[216436]: 2025-11-29 06:52:17.73997076 +0000 UTC m=+0.083966333 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 01:52:18 np0005539504 nova_compute[187152]: 2025-11-29 06:52:18.085 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:18.486 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:19 np0005539504 nova_compute[187152]: 2025-11-29 06:52:19.049 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:19 np0005539504 nova_compute[187152]: 2025-11-29 06:52:19.939 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:22.907 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:22.908 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:22.908 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:24 np0005539504 nova_compute[187152]: 2025-11-29 06:52:24.053 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:24 np0005539504 nova_compute[187152]: 2025-11-29 06:52:24.941 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:27 np0005539504 podman[216480]: 2025-11-29 06:52:27.712023966 +0000 UTC m=+0.060849587 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:52:27 np0005539504 podman[216481]: 2025-11-29 06:52:27.772344317 +0000 UTC m=+0.115692469 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:52:29 np0005539504 nova_compute[187152]: 2025-11-29 06:52:29.055 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:29 np0005539504 nova_compute[187152]: 2025-11-29 06:52:29.983 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:31 np0005539504 podman[216529]: 2025-11-29 06:52:31.712292261 +0000 UTC m=+0.052528591 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:52:34 np0005539504 nova_compute[187152]: 2025-11-29 06:52:34.057 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:34 np0005539504 nova_compute[187152]: 2025-11-29 06:52:34.985 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:35 np0005539504 nova_compute[187152]: 2025-11-29 06:52:35.953 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:37 np0005539504 podman[216549]: 2025-11-29 06:52:37.721262667 +0000 UTC m=+0.061273419 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 29 01:52:38 np0005539504 nova_compute[187152]: 2025-11-29 06:52:38.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:38 np0005539504 nova_compute[187152]: 2025-11-29 06:52:38.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:38 np0005539504 nova_compute[187152]: 2025-11-29 06:52:38.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:39 np0005539504 nova_compute[187152]: 2025-11-29 06:52:39.059 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:39 np0005539504 nova_compute[187152]: 2025-11-29 06:52:39.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:39 np0005539504 nova_compute[187152]: 2025-11-29 06:52:39.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:39 np0005539504 nova_compute[187152]: 2025-11-29 06:52:39.960 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:39 np0005539504 nova_compute[187152]: 2025-11-29 06:52:39.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:39 np0005539504 nova_compute[187152]: 2025-11-29 06:52:39.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:39 np0005539504 nova_compute[187152]: 2025-11-29 06:52:39.961 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:52:39 np0005539504 nova_compute[187152]: 2025-11-29 06:52:39.987 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.141 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.142 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5782MB free_disk=73.24026870727539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.143 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.143 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.202 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.203 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.223 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.238 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.259 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:52:40 np0005539504 nova_compute[187152]: 2025-11-29 06:52:40.260 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:41 np0005539504 nova_compute[187152]: 2025-11-29 06:52:41.813 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:41 np0005539504 nova_compute[187152]: 2025-11-29 06:52:41.814 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:41 np0005539504 nova_compute[187152]: 2025-11-29 06:52:41.857 187156 DEBUG nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:52:41 np0005539504 nova_compute[187152]: 2025-11-29 06:52:41.951 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:41 np0005539504 nova_compute[187152]: 2025-11-29 06:52:41.952 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:41 np0005539504 nova_compute[187152]: 2025-11-29 06:52:41.961 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:52:41 np0005539504 nova_compute[187152]: 2025-11-29 06:52:41.962 187156 INFO nova.compute.claims [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.113 187156 DEBUG nova.compute.provider_tree [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.126 187156 DEBUG nova.scheduler.client.report [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.146 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.146 187156 DEBUG nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.217 187156 DEBUG nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.217 187156 DEBUG nova.network.neutron [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.238 187156 INFO nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.259 187156 DEBUG nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.395 187156 DEBUG nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.398 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.398 187156 INFO nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Creating image(s)#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.400 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "/var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.400 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "/var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.401 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "/var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.427 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.492 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.494 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.495 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.511 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.531 187156 DEBUG nova.policy [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.577 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.578 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.619 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.620 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.620 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.704 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.705 187156 DEBUG nova.virt.disk.api [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Checking if we can resize image /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.705 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.763 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.764 187156 DEBUG nova.virt.disk.api [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Cannot resize image /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.765 187156 DEBUG nova.objects.instance [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lazy-loading 'migration_context' on Instance uuid 95b3301b-b068-4bbe-bc2b-1e83aba9eadf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.781 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.781 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Ensure instance console log exists: /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.782 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.782 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:42 np0005539504 nova_compute[187152]: 2025-11-29 06:52:42.783 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:43 np0005539504 nova_compute[187152]: 2025-11-29 06:52:43.261 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:43 np0005539504 nova_compute[187152]: 2025-11-29 06:52:43.261 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:52:43 np0005539504 nova_compute[187152]: 2025-11-29 06:52:43.262 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:52:43 np0005539504 nova_compute[187152]: 2025-11-29 06:52:43.280 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 01:52:43 np0005539504 nova_compute[187152]: 2025-11-29 06:52:43.281 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:52:43 np0005539504 nova_compute[187152]: 2025-11-29 06:52:43.281 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:43 np0005539504 nova_compute[187152]: 2025-11-29 06:52:43.282 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:52:43 np0005539504 nova_compute[187152]: 2025-11-29 06:52:43.465 187156 DEBUG nova.network.neutron [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Successfully created port: 981dbb53-0eef-491c-af47-bdee2dc23b92 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.061 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:44 np0005539504 podman[216587]: 2025-11-29 06:52:44.720583392 +0000 UTC m=+0.059994594 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.757 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.757 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.771 187156 DEBUG nova.network.neutron [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Successfully updated port: 981dbb53-0eef-491c-af47-bdee2dc23b92 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.781 187156 DEBUG nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.784 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.784 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquired lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.785 187156 DEBUG nova.network.neutron [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.889 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.890 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.900 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.900 187156 INFO nova.compute.claims [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:44 np0005539504 nova_compute[187152]: 2025-11-29 06:52:44.989 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.064 187156 DEBUG nova.compute.provider_tree [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.079 187156 DEBUG nova.scheduler.client.report [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.117 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.117 187156 DEBUG nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.186 187156 DEBUG nova.network.neutron [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.195 187156 DEBUG nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.195 187156 DEBUG nova.network.neutron [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.226 187156 INFO nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.249 187156 DEBUG nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.397 187156 DEBUG nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.399 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.399 187156 INFO nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Creating image(s)#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.400 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.401 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.402 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.431 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.498 187156 DEBUG nova.policy [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea965b54cc694db4abef98ad9973e9f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.502 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.503 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.503 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.518 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.577 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.579 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.621 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.623 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.624 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.694 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.695 187156 DEBUG nova.virt.disk.api [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Checking if we can resize image /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.696 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.764 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.766 187156 DEBUG nova.virt.disk.api [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Cannot resize image /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.767 187156 DEBUG nova.objects.instance [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lazy-loading 'migration_context' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.782 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.783 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Ensure instance console log exists: /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.784 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.784 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.785 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.790 187156 DEBUG nova.compute.manager [req-484ea782-752b-4a87-81f1-f95acd87ddbc req-01073bfc-40f6-470b-ac2e-1d577055fe14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received event network-changed-981dbb53-0eef-491c-af47-bdee2dc23b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.790 187156 DEBUG nova.compute.manager [req-484ea782-752b-4a87-81f1-f95acd87ddbc req-01073bfc-40f6-470b-ac2e-1d577055fe14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Refreshing instance network info cache due to event network-changed-981dbb53-0eef-491c-af47-bdee2dc23b92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:52:45 np0005539504 nova_compute[187152]: 2025-11-29 06:52:45.791 187156 DEBUG oslo_concurrency.lockutils [req-484ea782-752b-4a87-81f1-f95acd87ddbc req-01073bfc-40f6-470b-ac2e-1d577055fe14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.151 187156 DEBUG nova.network.neutron [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Successfully created port: 1ff22547-5892-4360-8abe-429ea2f212ee _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.226 187156 DEBUG nova.network.neutron [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Updating instance_info_cache with network_info: [{"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.252 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Releasing lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.253 187156 DEBUG nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Instance network_info: |[{"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.253 187156 DEBUG oslo_concurrency.lockutils [req-484ea782-752b-4a87-81f1-f95acd87ddbc req-01073bfc-40f6-470b-ac2e-1d577055fe14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.254 187156 DEBUG nova.network.neutron [req-484ea782-752b-4a87-81f1-f95acd87ddbc req-01073bfc-40f6-470b-ac2e-1d577055fe14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Refreshing network info cache for port 981dbb53-0eef-491c-af47-bdee2dc23b92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.260 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Start _get_guest_xml network_info=[{"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.267 187156 WARNING nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.273 187156 DEBUG nova.virt.libvirt.host [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.274 187156 DEBUG nova.virt.libvirt.host [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.285 187156 DEBUG nova.virt.libvirt.host [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.286 187156 DEBUG nova.virt.libvirt.host [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.288 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.288 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.289 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.289 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.290 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.290 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.291 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.291 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.292 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.292 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.292 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.293 187156 DEBUG nova.virt.hardware [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.300 187156 DEBUG nova.virt.libvirt.vif [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-36479523',display_name='tempest-FloatingIPsAssociationTestJSON-server-36479523',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-36479523',id=22,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71af3e88884e42c48fb244d7d6ca31e2',ramdisk_id='',reservation_id='r-ikud3p2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-940149563',owner_user_name
='tempest-FloatingIPsAssociationTestJSON-940149563-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:42Z,user_data=None,user_id='a0fcd4f4de7e4072be30f7e3d4ac7c77',uuid=95b3301b-b068-4bbe-bc2b-1e83aba9eadf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.300 187156 DEBUG nova.network.os_vif_util [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converting VIF {"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.302 187156 DEBUG nova.network.os_vif_util [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:c3:e8,bridge_name='br-int',has_traffic_filtering=True,id=981dbb53-0eef-491c-af47-bdee2dc23b92,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap981dbb53-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.304 187156 DEBUG nova.objects.instance [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 95b3301b-b068-4bbe-bc2b-1e83aba9eadf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.317 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <uuid>95b3301b-b068-4bbe-bc2b-1e83aba9eadf</uuid>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <name>instance-00000016</name>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-36479523</nova:name>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:52:46</nova:creationTime>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:        <nova:user uuid="a0fcd4f4de7e4072be30f7e3d4ac7c77">tempest-FloatingIPsAssociationTestJSON-940149563-project-member</nova:user>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:        <nova:project uuid="71af3e88884e42c48fb244d7d6ca31e2">tempest-FloatingIPsAssociationTestJSON-940149563</nova:project>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:        <nova:port uuid="981dbb53-0eef-491c-af47-bdee2dc23b92">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <entry name="serial">95b3301b-b068-4bbe-bc2b-1e83aba9eadf</entry>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <entry name="uuid">95b3301b-b068-4bbe-bc2b-1e83aba9eadf</entry>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.config"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:a5:c3:e8"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <target dev="tap981dbb53-0e"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/console.log" append="off"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:52:46 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:52:46 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:52:46 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:52:46 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.319 187156 DEBUG nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Preparing to wait for external event network-vif-plugged-981dbb53-0eef-491c-af47-bdee2dc23b92 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.320 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.320 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.321 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.322 187156 DEBUG nova.virt.libvirt.vif [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-36479523',display_name='tempest-FloatingIPsAssociationTestJSON-server-36479523',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-36479523',id=22,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71af3e88884e42c48fb244d7d6ca31e2',ramdisk_id='',reservation_id='r-ikud3p2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-940149563',owner
_user_name='tempest-FloatingIPsAssociationTestJSON-940149563-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:42Z,user_data=None,user_id='a0fcd4f4de7e4072be30f7e3d4ac7c77',uuid=95b3301b-b068-4bbe-bc2b-1e83aba9eadf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.322 187156 DEBUG nova.network.os_vif_util [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converting VIF {"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.323 187156 DEBUG nova.network.os_vif_util [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:c3:e8,bridge_name='br-int',has_traffic_filtering=True,id=981dbb53-0eef-491c-af47-bdee2dc23b92,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap981dbb53-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.324 187156 DEBUG os_vif [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:c3:e8,bridge_name='br-int',has_traffic_filtering=True,id=981dbb53-0eef-491c-af47-bdee2dc23b92,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap981dbb53-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.325 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.325 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.326 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.333 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.333 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap981dbb53-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.334 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap981dbb53-0e, col_values=(('external_ids', {'iface-id': '981dbb53-0eef-491c-af47-bdee2dc23b92', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:c3:e8', 'vm-uuid': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.336 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.337 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 01:52:46 np0005539504 NetworkManager[55210]: <info>  [1764399166.3374] manager: (tap981dbb53-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.345 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.346 187156 INFO os_vif [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:c3:e8,bridge_name='br-int',has_traffic_filtering=True,id=981dbb53-0eef-491c-af47-bdee2dc23b92,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap981dbb53-0e')
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.409 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.410 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.410 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] No VIF found with MAC fa:16:3e:a5:c3:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.411 187156 INFO nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Using config drive
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.915 187156 DEBUG nova.network.neutron [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Successfully updated port: 1ff22547-5892-4360-8abe-429ea2f212ee _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.936 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.936 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquired lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:52:46 np0005539504 nova_compute[187152]: 2025-11-29 06:52:46.936 187156 DEBUG nova.network.neutron [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.011 187156 INFO nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Creating config drive at /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.config
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.017 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm1qh03l8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.149 187156 DEBUG oslo_concurrency.processutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm1qh03l8" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.171 187156 DEBUG nova.network.neutron [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 01:52:47 np0005539504 kernel: tap981dbb53-0e: entered promiscuous mode
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.229 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:47 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:47Z|00056|binding|INFO|Claiming lport 981dbb53-0eef-491c-af47-bdee2dc23b92 for this chassis.
Nov 29 01:52:47 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:47Z|00057|binding|INFO|981dbb53-0eef-491c-af47-bdee2dc23b92: Claiming fa:16:3e:a5:c3:e8 10.100.0.6
Nov 29 01:52:47 np0005539504 NetworkManager[55210]: <info>  [1764399167.2328] manager: (tap981dbb53-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.247 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:c3:e8 10.100.0.6'], port_security=['fa:16:3e:a5:c3:e8 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f562e81-d2bf-4e2c-b0ea-0aa5dfe52d68', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86b613b3-d246-4b07-a5b7-9ab1b7da74dc, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=981dbb53-0eef-491c-af47-bdee2dc23b92) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.250 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 981dbb53-0eef-491c-af47-bdee2dc23b92 in datapath 3c63c551-2e9f-4b47-9e49-c73140efe20a bound to our chassis
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.253 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c63c551-2e9f-4b47-9e49-c73140efe20a
Nov 29 01:52:47 np0005539504 systemd-udevd[216642]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.270 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ba7797-5922-4d6a-ab22-27c51d586024]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.272 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3c63c551-21 in ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.276 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3c63c551-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.277 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2f990ae8-1ee1-49ae-aa21-babbe30193f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.277 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b63159f4-76d4-4965-8a84-9745bfc9f8c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 NetworkManager[55210]: <info>  [1764399167.2964] device (tap981dbb53-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:52:47 np0005539504 systemd-machined[153423]: New machine qemu-12-instance-00000016.
Nov 29 01:52:47 np0005539504 NetworkManager[55210]: <info>  [1764399167.2974] device (tap981dbb53-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.295 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb12469-08e6-4ba6-b6f4-e1e2e34cfe13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.305 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:47 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:47Z|00058|binding|INFO|Setting lport 981dbb53-0eef-491c-af47-bdee2dc23b92 ovn-installed in OVS
Nov 29 01:52:47 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:47Z|00059|binding|INFO|Setting lport 981dbb53-0eef-491c-af47-bdee2dc23b92 up in Southbound
Nov 29 01:52:47 np0005539504 systemd[1]: Started Virtual Machine qemu-12-instance-00000016.
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.310 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.325 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dc15d95f-5ebb-44de-b2fe-07013f1cb816]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.365 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f9dfa9a4-4412-4231-8b69-119731c89de7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.373 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[432e5460-67c6-4fb3-9efc-df5247c591c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 NetworkManager[55210]: <info>  [1764399167.3738] manager: (tap3c63c551-20): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.412 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[91bae18e-18f6-4972-8dbb-ade96ee01cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.415 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[b6056ba7-6f86-443c-a6b4-aa1e672af991]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 NetworkManager[55210]: <info>  [1764399167.4395] device (tap3c63c551-20): carrier: link connected
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.446 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[223608e7-c373-406c-b51d-b345913bd834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.468 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eacdc6af-a546-4461-a1fe-a35e1b143780]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c63c551-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:20:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463031, 'reachable_time': 39048, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216675, 'error': None, 'target': 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.484 187156 DEBUG nova.compute.manager [req-eb9ff27e-92b3-45a3-9317-75202370482b req-dade1256-f3a2-40b4-8d2a-d8f467d26d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-changed-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.484 187156 DEBUG nova.compute.manager [req-eb9ff27e-92b3-45a3-9317-75202370482b req-dade1256-f3a2-40b4-8d2a-d8f467d26d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Refreshing instance network info cache due to event network-changed-1ff22547-5892-4360-8abe-429ea2f212ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.485 187156 DEBUG oslo_concurrency.lockutils [req-eb9ff27e-92b3-45a3-9317-75202370482b req-dade1256-f3a2-40b4-8d2a-d8f467d26d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.490 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[41160092-7f94-43fb-95ee-9b2f9fb0933a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:20eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463031, 'tstamp': 463031}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216676, 'error': None, 'target': 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.515 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8f71992f-07e5-413e-9c4a-7e345990a4b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c63c551-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:20:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463031, 'reachable_time': 39048, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216677, 'error': None, 'target': 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.553 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[49e72e68-34bc-4cf7-b09b-5acc2be7923e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.618 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8f24e716-fad4-4759-a4c5-a26ccc5a96ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.621 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c63c551-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.622 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.622 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c63c551-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:52:47 np0005539504 NetworkManager[55210]: <info>  [1764399167.6258] manager: (tap3c63c551-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 29 01:52:47 np0005539504 kernel: tap3c63c551-20: entered promiscuous mode
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.630 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c63c551-20, col_values=(('external_ids', {'iface-id': '90a33ad8-e32a-4cc0-85e0-1ed390ab00fa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.624 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:47 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:47Z|00060|binding|INFO|Releasing lport 90a33ad8-e32a-4cc0-85e0-1ed390ab00fa from this chassis (sb_readonly=0)
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.635 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.636 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3c63c551-2e9f-4b47-9e49-c73140efe20a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3c63c551-2e9f-4b47-9e49-c73140efe20a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.638 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad8d5c1-d1d6-4066-a7e1-f75736aef84d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.639 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-3c63c551-2e9f-4b47-9e49-c73140efe20a
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/3c63c551-2e9f-4b47-9e49-c73140efe20a.pid.haproxy
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 3c63c551-2e9f-4b47-9e49-c73140efe20a
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:52:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:47.640 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'env', 'PROCESS_TAG=haproxy-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3c63c551-2e9f-4b47-9e49-c73140efe20a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.646 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.713 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399167.712068, 95b3301b-b068-4bbe-bc2b-1e83aba9eadf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.713 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] VM Started (Lifecycle Event)#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.718 187156 DEBUG nova.network.neutron [req-484ea782-752b-4a87-81f1-f95acd87ddbc req-01073bfc-40f6-470b-ac2e-1d577055fe14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Updated VIF entry in instance network info cache for port 981dbb53-0eef-491c-af47-bdee2dc23b92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.719 187156 DEBUG nova.network.neutron [req-484ea782-752b-4a87-81f1-f95acd87ddbc req-01073bfc-40f6-470b-ac2e-1d577055fe14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Updating instance_info_cache with network_info: [{"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.737 187156 DEBUG oslo_concurrency.lockutils [req-484ea782-752b-4a87-81f1-f95acd87ddbc req-01073bfc-40f6-470b-ac2e-1d577055fe14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.745 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.753 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399167.7122145, 95b3301b-b068-4bbe-bc2b-1e83aba9eadf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.754 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.770 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.774 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.793 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.884 187156 DEBUG nova.network.neutron [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.911 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Releasing lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.912 187156 DEBUG nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Instance network_info: |[{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.914 187156 DEBUG oslo_concurrency.lockutils [req-eb9ff27e-92b3-45a3-9317-75202370482b req-dade1256-f3a2-40b4-8d2a-d8f467d26d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.914 187156 DEBUG nova.network.neutron [req-eb9ff27e-92b3-45a3-9317-75202370482b req-dade1256-f3a2-40b4-8d2a-d8f467d26d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Refreshing network info cache for port 1ff22547-5892-4360-8abe-429ea2f212ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.920 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Start _get_guest_xml network_info=[{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.922 187156 DEBUG nova.compute.manager [req-bf2eb06f-a660-418e-b3d9-1faa7d73b7a3 req-b0f830de-032d-45a1-a48a-c52726719552 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received event network-vif-plugged-981dbb53-0eef-491c-af47-bdee2dc23b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.922 187156 DEBUG oslo_concurrency.lockutils [req-bf2eb06f-a660-418e-b3d9-1faa7d73b7a3 req-b0f830de-032d-45a1-a48a-c52726719552 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.923 187156 DEBUG oslo_concurrency.lockutils [req-bf2eb06f-a660-418e-b3d9-1faa7d73b7a3 req-b0f830de-032d-45a1-a48a-c52726719552 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.923 187156 DEBUG oslo_concurrency.lockutils [req-bf2eb06f-a660-418e-b3d9-1faa7d73b7a3 req-b0f830de-032d-45a1-a48a-c52726719552 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.924 187156 DEBUG nova.compute.manager [req-bf2eb06f-a660-418e-b3d9-1faa7d73b7a3 req-b0f830de-032d-45a1-a48a-c52726719552 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Processing event network-vif-plugged-981dbb53-0eef-491c-af47-bdee2dc23b92 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.925 187156 DEBUG nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.930 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.932 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399167.930086, 95b3301b-b068-4bbe-bc2b-1e83aba9eadf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.932 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.935 187156 WARNING nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.940 187156 INFO nova.virt.libvirt.driver [-] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Instance spawned successfully.#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.941 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.949 187156 DEBUG nova.virt.libvirt.host [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.953 187156 DEBUG nova.virt.libvirt.host [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:47.960 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000016', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '71af3e88884e42c48fb244d7d6ca31e2', 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'hostId': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:47.962 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.965 187156 DEBUG nova.virt.libvirt.host [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.967 187156 DEBUG nova.virt.libvirt.host [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.968 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.969 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.969 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.969 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.970 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.970 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.970 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.970 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.971 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.971 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.971 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.971 187156 DEBUG nova.virt.hardware [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.975 187156 DEBUG nova.virt.libvirt.vif [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1351543550',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1351543550',id=23,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-68mzhrqj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-134320
6834-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:45Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=2e380200-8276-4470-965f-31baa0bfd760,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.975 187156 DEBUG nova.network.os_vif_util [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converting VIF {"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.976 187156 DEBUG nova.network.os_vif_util [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.977 187156 DEBUG nova.objects.instance [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.982 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.990 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:47.994 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.994 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <uuid>2e380200-8276-4470-965f-31baa0bfd760</uuid>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <name>instance-00000017</name>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1351543550</nova:name>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:52:47</nova:creationTime>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:        <nova:user uuid="ea965b54cc694db4abef98ad9973e9f2">tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member</nova:user>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:        <nova:project uuid="93dcd8ffe78147b69c244e2e3bfc2121">tempest-LiveAutoBlockMigrationV225Test-1343206834</nova:project>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:        <nova:port uuid="1ff22547-5892-4360-8abe-429ea2f212ee">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <entry name="serial">2e380200-8276-4470-965f-31baa0bfd760</entry>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <entry name="uuid">2e380200-8276-4470-965f-31baa0bfd760</entry>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:56:a7:a1"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <target dev="tap1ff22547-58"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/console.log" append="off"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:52:47 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:52:47 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:52:47 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:52:47 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:47.996 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.996 187156 DEBUG nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Preparing to wait for external event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.998 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.998 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.998 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:47 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.999 187156 DEBUG nova.virt.libvirt.vif [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1351543550',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1351543550',id=23,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-68mzhrqj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225T
est-1343206834-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:45Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=2e380200-8276-4470-965f-31baa0bfd760,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:47.999 187156 DEBUG nova.network.os_vif_util [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converting VIF {"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.000 187156 DEBUG nova.network.os_vif_util [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.000 187156 DEBUG os_vif [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.001 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.001 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.002 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77af1a54-f9d0-4fd0-82db-61a37ec15628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-vda', 'timestamp': '2025-11-29T06:52:47.962660', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04024154-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '7426992d425e2c2e4679a150991080406278da1b044419d0fc0929595db3e07d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 
'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-sda', 'timestamp': '2025-11-29T06:52:47.962660', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '040264c2-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '4007f0b2f2ec670793802ea8dce21f8a89e8d36f5f85d179e806e5ec10e76b04'}]}, 'timestamp': '2025-11-29 06:52:47.997056', '_unique_id': '3404cc6487a343a58195807ab97166a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.001 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.004 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.005 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.005 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.006 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.006 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.006 187156 DEBUG nova.virt.libvirt.driver [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.007 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 95b3301b-b068-4bbe-bc2b-1e83aba9eadf / tap981dbb53-0e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.007 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2cb1bd7-7bf9-41c6-a0d7-a18c3a4eecca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.003824', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '04040a3e-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': '2f0ef8b27e0c03d9fceb2187120c36a51b4d154ccb8679f2c29eec8aa1a53b63'}]}, 'timestamp': '2025-11-29 06:52:48.007816', '_unique_id': '42ecd987df2a4b2db7ca8bc16ceda9ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.008 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.009 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.010 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.010 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ff22547-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.011 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1ff22547-58, col_values=(('external_ids', {'iface-id': '1ff22547-5892-4360-8abe-429ea2f212ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:a7:a1', 'vm-uuid': '2e380200-8276-4470-965f-31baa0bfd760'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ddffd9a-5d90-46d2-91f0-2fcb5f7a2f77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.009916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '04046934-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': '312d8b77c5417bc2ab33865c313793579328e9ff543e3875ec23dfaea79b5775'}]}, 'timestamp': '2025-11-29 06:52:48.010188', '_unique_id': '3e247a9bffa1428ab4d2c67daa2e4099'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.010 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.011 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.011 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.011 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-36479523>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-36479523>]
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.012 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.013 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:48 np0005539504 NetworkManager[55210]: <info>  [1764399168.0144] manager: (tap1ff22547-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.014 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f29f013b-69d8-40a3-9c3b-94fa0bbc43ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.012237', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '0404cbfe-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': 'b664560a3136f62d2ad9fb2ff4c56801a8f7d760cfd04f0d93d8fee16d6e5ab7'}]}, 'timestamp': '2025-11-29 06:52:48.012729', '_unique_id': 'e789c8c27fd04ef0b564f098ca08ed23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.015 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.021 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.022 187156 INFO os_vif [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58')
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.049 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.050 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.050 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c267f10-eb77-41eb-af94-89dd8686f031', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-vda', 'timestamp': '2025-11-29T06:52:48.016299', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '040a88b4-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.951057989, 'message_signature': '6d19de6bdea151e71ca4289cc97fced927a617018ba35dbd9e1d36a4c2c829e2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-sda', 'timestamp': '2025-11-29T06:52:48.016299', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '040a9ac0-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.951057989, 'message_signature': 'e8bd13257106f75aa932e69657ad6aecfbbacd11caef398e326ca940af7a793a'}]}, 'timestamp': '2025-11-29 06:52:48.050823', '_unique_id': 'ba9c7e319d494672967cf270764ef1a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.053 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.053 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32cfc510-2ba6-4309-83aa-acabc13946f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.053655', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '040b1720-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': '995df128aedd9421fbc04ab4ef97198c1d961f4ae916fbe84e8d82935987505d'}]}, 'timestamp': '2025-11-29 06:52:48.053979', '_unique_id': 'f6b0f1ec4755496ca61168b34fdfc042'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.055 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.082 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/cpu volume: 110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bebf8697-94d6-4717-938c-7e5c81d3fa4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110000000, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'timestamp': '2025-11-29T06:52:48.055351', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '040f9b60-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4631.017225678, 'message_signature': '761db750b2deb2c0bf216acd6af546d5bb0dd03233adfebe8e69c83a7fa8785a'}]}, 'timestamp': '2025-11-29 06:52:48.083717', '_unique_id': 'd01474b13f9847a3afe266b14d4f34b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.086 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.086 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5789437b-d172-4ca7-8e7b-5d12ec3b0943', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-vda', 'timestamp': '2025-11-29T06:52:48.086175', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04100d16-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '4f69ccfe8a92e6f1a3cfee32cc91923522e9f83cb058c80e7e1814cead7e2cd2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-sda', 'timestamp': '2025-11-29T06:52:48.086175', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04101a4a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '084ccbed89b0c91bac518863cdbec5fb05789be02d5e61889a65c24f93c04fed'}]}, 'timestamp': '2025-11-29 06:52:48.086822', '_unique_id': 'fc1e6043c5dc4bd3bd59aa98f2f94d36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.088 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.088 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-36479523>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-36479523>]
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.088 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.088 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '439ff717-453e-43fc-b78a-cd5a84d26682', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.088868', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '041076b6-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': 'a2475d07ea3c0fb3a0ff1fe1ec7e703e479d283f90f93946b1d7d02fef1f2c08'}]}, 'timestamp': '2025-11-29 06:52:48.089228', '_unique_id': 'a735acae88ec4360b7dda65b7c6817b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.090 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.090 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c61699f1-b57b-465b-b165-664b3e15bc2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-vda', 'timestamp': '2025-11-29T06:52:48.090547', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0410b5c2-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '9d5a2271a8cd4ed27f85536047fdcbd4a2852937ae9dff1a20eaabe488757b4f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-sda', 'timestamp': '2025-11-29T06:52:48.090547', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0410be64-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '22c5aba2b356980c348800ba710a0a5e64649c09ea325829d4a40e42ac80fc80'}]}, 'timestamp': '2025-11-29 06:52:48.090988', '_unique_id': '23dd2746358a4db99954b5d31e0afdfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.092 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.092 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f2ce9fd-d91c-47eb-b0d1-fa4d0356be02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-vda', 'timestamp': '2025-11-29T06:52:48.092294', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0410fb54-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.951057989, 'message_signature': '66891c2dbf56392715bd8e9f6f2eb1946dec4ff8cd8711c26a8364bd5a41e168'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-sda', 'timestamp': '2025-11-29T06:52:48.092294', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04110766-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.951057989, 'message_signature': 'a0e2a2417b80afeecea3a3f9c5b94f032ac57676a46d0c05cf19dbe2424ef8b4'}]}, 'timestamp': '2025-11-29 06:52:48.092876', '_unique_id': '076bda03f3e146dbb9b0e46297416f69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.094 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3239726f-33ef-43fd-b66a-d3cf93df1bd8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.094316', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '04114bae-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': '88bf611a0928def7967bfd0197633f8dd82389221f79ce10302f0af8153e6a55'}]}, 'timestamp': '2025-11-29 06:52:48.094658', '_unique_id': '815d7f1ead0a4181903e444587abd335'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.096 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.096 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '455ce3f4-6713-49ef-8540-25eea08edb84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-vda', 'timestamp': '2025-11-29T06:52:48.096009', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04118e16-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '8e56daeb7d5ffd2f4dd47700816b809f4fb086d4afa7fe29b6f69eaac12df186'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-sda', 'timestamp': '2025-11-29T06:52:48.096009', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '041199a6-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': 'f25f9083287862a7f4c78c0986fa980001bfcc512c500ecf001b23cb4f3e3e08'}]}, 'timestamp': '2025-11-29 06:52:48.096642', '_unique_id': '8e18054c4e454c9d8df152e03e75c680'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.097 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b685e5f-252d-4e7b-9dc0-86cdcafddd5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-vda', 'timestamp': '2025-11-29T06:52:48.097854', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0411d326-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': 'b1f118a8783caff46a12324f9e93f11ce9026f4513ae2c2c863e8ebb75717f5a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-sda', 'timestamp': '2025-11-29T06:52:48.097854', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0411db1e-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '1fc1727d32ea05155dde92c7d79fdb234f27163ba782623deacb8089ca0dae74'}]}, 'timestamp': '2025-11-29 06:52:48.098275', '_unique_id': '134c4e961705414997d3cd75d7b95ca5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.100 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.099 187156 INFO nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Took 5.70 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.100 187156 DEBUG nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05493419-f0ca-4bfb-998a-c87dddaa3cbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.100003', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '04122736-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': '0c8704cf7bfcfd0df1bb66805c041b0e87e63f827c8d0cff43d6091b40f19391'}]}, 'timestamp': '2025-11-29 06:52:48.100240', '_unique_id': 'af332ea43814420e92326e883d799218'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.102 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.102 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-36479523>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-36479523>]
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.102 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.102 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0dbf3c9-9343-4c32-baea-851d8f75370f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-vda', 'timestamp': '2025-11-29T06:52:48.102625', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04128db6-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.951057989, 'message_signature': 'c24e226d964cb42a81c90330ac7d2e3c09d10d6f917d241a168cde8aecb25f7a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-sda', 'timestamp': '2025-11-29T06:52:48.102625', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04129626-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.951057989, 'message_signature': '966e52e2763cc409e9d998271b5e97ac99950886ba3609dc5737d7bb045d3902'}]}, 'timestamp': '2025-11-29 06:52:48.103081', '_unique_id': '5aa3ee9ae8d14b9ab45de66f3c9dd8cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.104 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd287feb7-5afa-4e8b-afef-077a3e249b0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.104400', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '0412d316-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': '0428289b5788bd67de8f37b2381e068e0bfe4c16d5c89dde27cdd1a41924f410'}]}, 'timestamp': '2025-11-29 06:52:48.104656', '_unique_id': '85e40153114740dfb783b0f657f57240'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30042a8e-4245-4043-a6b5-e08b007f4220', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.105993', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '04131286-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': '148fbea5e581b4244110f65d96d94e20979e703469969612219df81ab28d09f3'}]}, 'timestamp': '2025-11-29 06:52:48.106303', '_unique_id': 'cabf1f6697b8406cbff7cec59a3c458c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.107 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.107 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 95b3301b-b068-4bbe-bc2b-1e83aba9eadf: ceilometer.compute.pollsters.NoVolumeException
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.108 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4dfe331-a3f9-4e14-b7fb-748785814160', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000016-95b3301b-b068-4bbe-bc2b-1e83aba9eadf-tap981dbb53-0e', 'timestamp': '2025-11-29T06:52:48.108168', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'tap981dbb53-0e', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a5:c3:e8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap981dbb53-0e'}, 'message_id': '04136aba-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.938574131, 'message_signature': 'a6b2eebf1b2c12001306097507afb40d572fd97f0101875ed79d0ff06306dd84'}]}, 'timestamp': '2025-11-29 06:52:48.108600', '_unique_id': '3346b93b5f17468b879d36158fb29946'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.110 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.110 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-36479523>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-36479523>]
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.110 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.111 12 DEBUG ceilometer.compute.pollsters [-] 95b3301b-b068-4bbe-bc2b-1e83aba9eadf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19277e17-20c2-414d-9086-a76ae102f6e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-vda', 'timestamp': '2025-11-29T06:52:48.110947', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0413d496-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '17bdad789a0bb2add1674d6048bef5ab9442ae9397f689b819d66624dd265e3c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 
'project_name': None, 'resource_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf-sda', 'timestamp': '2025-11-29T06:52:48.110947', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-36479523', 'name': 'instance-00000016', 'instance_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'instance_type': 'm1.nano', 'host': '6701d62cd0452073d789921984bacb4c64b923dbc23f993b382f3f41', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0413e166-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4630.897387208, 'message_signature': '63530ba60f8772c51d61b7b6eb148970c8561f82181d1047c50164c7b60320d4'}]}, 'timestamp': '2025-11-29 06:52:48.111572', '_unique_id': 'abf6e8abea3a4f4799eebf87b8d29f66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:52:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539504 podman[216716]: 2025-11-29 06:52:48.065100085 +0000 UTC m=+0.050069925 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:52:48 np0005539504 podman[216723]: 2025-11-29 06:52:48.631311136 +0000 UTC m=+0.577188569 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:52:48 np0005539504 podman[216716]: 2025-11-29 06:52:48.699809409 +0000 UTC m=+0.684779239 container create 4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 01:52:48 np0005539504 podman[216727]: 2025-11-29 06:52:48.767696765 +0000 UTC m=+0.707215176 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 01:52:48 np0005539504 systemd[1]: Started libpod-conmon-4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4.scope.
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.813 187156 INFO nova.compute.manager [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Took 6.89 seconds to build instance.#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.822 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.823 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.823 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] No VIF found with MAC fa:16:3e:56:a7:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.824 187156 INFO nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Using config drive#033[00m
Nov 29 01:52:48 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:52:48 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df5a81801d44020f95ecd136ee15bafa652bef7d555ea68ff0afc2c27f70fe4b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:52:48 np0005539504 nova_compute[187152]: 2025-11-29 06:52:48.846 187156 DEBUG oslo_concurrency.lockutils [None req-123ddfa3-7607-448b-95c9-11b07c2a4a40 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:48 np0005539504 podman[216716]: 2025-11-29 06:52:48.875185621 +0000 UTC m=+0.860155431 container init 4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:52:48 np0005539504 podman[216716]: 2025-11-29 06:52:48.88218702 +0000 UTC m=+0.867156810 container start 4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:52:48 np0005539504 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216777]: [NOTICE]   (216781) : New worker (216783) forked
Nov 29 01:52:48 np0005539504 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216777]: [NOTICE]   (216781) : Loading success.
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.063 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.188 187156 INFO nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Creating config drive at /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config#033[00m
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.193 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr223lu3l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.329 187156 DEBUG oslo_concurrency.processutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr223lu3l" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:49 np0005539504 kernel: tap1ff22547-58: entered promiscuous mode
Nov 29 01:52:49 np0005539504 NetworkManager[55210]: <info>  [1764399169.4008] manager: (tap1ff22547-58): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Nov 29 01:52:49 np0005539504 systemd-udevd[216667]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.403 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:49 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:49Z|00061|binding|INFO|Claiming lport 1ff22547-5892-4360-8abe-429ea2f212ee for this chassis.
Nov 29 01:52:49 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:49Z|00062|binding|INFO|1ff22547-5892-4360-8abe-429ea2f212ee: Claiming fa:16:3e:56:a7:a1 10.100.0.5
Nov 29 01:52:49 np0005539504 NetworkManager[55210]: <info>  [1764399169.4208] device (tap1ff22547-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:52:49 np0005539504 NetworkManager[55210]: <info>  [1764399169.4224] device (tap1ff22547-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.438 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:a7:a1 10.100.0.5'], port_security=['fa:16:3e:56:a7:a1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2e380200-8276-4470-965f-31baa0bfd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=1ff22547-5892-4360-8abe-429ea2f212ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.440 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 1ff22547-5892-4360-8abe-429ea2f212ee in datapath c691e2c0-bf24-480c-9af6-236639f0492c bound to our chassis#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.443 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c691e2c0-bf24-480c-9af6-236639f0492c#033[00m
Nov 29 01:52:49 np0005539504 systemd-machined[153423]: New machine qemu-13-instance-00000017.
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.462 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4f52af1c-cc3a-4b14-8680-394085fe7df1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.463 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc691e2c0-b1 in ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.466 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc691e2c0-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.466 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1ca1a7-0e61-4eee-97ed-a584ee8049d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.467 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b98ddaf9-4edb-40e0-8e9d-02f225cb31c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.478 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c7b1ef-fd02-4b2c-aae7-65436b32e9df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 systemd[1]: Started Virtual Machine qemu-13-instance-00000017.
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.483 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:49 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:49Z|00063|binding|INFO|Setting lport 1ff22547-5892-4360-8abe-429ea2f212ee ovn-installed in OVS
Nov 29 01:52:49 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:49Z|00064|binding|INFO|Setting lport 1ff22547-5892-4360-8abe-429ea2f212ee up in Southbound
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.490 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.494 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[15d06353-234d-4a63-9eda-1a17256aaeb0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.559 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2032a6de-2012-4126-8c01-f20e41493b3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.569 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf4dc37-15ab-4387-9a70-0e263618af6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 NetworkManager[55210]: <info>  [1764399169.5707] manager: (tapc691e2c0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.610 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[73a05ae4-a7f3-4c10-b54c-c555b5047f34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.613 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e8fae071-a696-47c7-9469-b558993dd73e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 NetworkManager[55210]: <info>  [1764399169.6443] device (tapc691e2c0-b0): carrier: link connected
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.650 187156 DEBUG nova.network.neutron [req-eb9ff27e-92b3-45a3-9317-75202370482b req-dade1256-f3a2-40b4-8d2a-d8f467d26d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updated VIF entry in instance network info cache for port 1ff22547-5892-4360-8abe-429ea2f212ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.651 187156 DEBUG nova.network.neutron [req-eb9ff27e-92b3-45a3-9317-75202370482b req-dade1256-f3a2-40b4-8d2a-d8f467d26d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.651 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7f6c7c-5a39-442b-aa29-e8f1e84daf19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.672 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4980f809-48e5-4a73-8340-fdb817b56a9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463252, 'reachable_time': 31684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216827, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.691 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[65642187-3cb3-4861-889a-2a35c9c28ecc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:3d81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463252, 'tstamp': 463252}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216831, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.716 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eae5ab75-b518-4817-991e-afba2f9b4088]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463252, 'reachable_time': 31684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216836, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.747 187156 DEBUG oslo_concurrency.lockutils [req-eb9ff27e-92b3-45a3-9317-75202370482b req-dade1256-f3a2-40b4-8d2a-d8f467d26d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.765 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb7303a-4c92-48d4-88b9-ada5c62c7bfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.799 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399169.798849, 2e380200-8276-4470-965f-31baa0bfd760 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.799 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Started (Lifecycle Event)#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.841 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50731e1a-c0d5-431f-8a93-0ae813658c51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.842 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.842 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.843 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc691e2c0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:49 np0005539504 kernel: tapc691e2c0-b0: entered promiscuous mode
Nov 29 01:52:49 np0005539504 NetworkManager[55210]: <info>  [1764399169.8907] manager: (tapc691e2c0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.896 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc691e2c0-b0, col_values=(('external_ids', {'iface-id': 'a88e36d5-5037-4505-8d26-de14faa22faf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:49 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:49Z|00065|binding|INFO|Releasing lport a88e36d5-5037-4505-8d26-de14faa22faf from this chassis (sb_readonly=0)
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.911 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.917 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.919 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.920 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[05914baf-780f-49ff-bfbd-17d84e7ce868]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.921 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-c691e2c0-bf24-480c-9af6-236639f0492c
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID c691e2c0-bf24-480c-9af6-236639f0492c
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:52:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:52:49.922 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'env', 'PROCESS_TAG=haproxy-c691e2c0-bf24-480c-9af6-236639f0492c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c691e2c0-bf24-480c-9af6-236639f0492c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:52:49 np0005539504 nova_compute[187152]: 2025-11-29 06:52:49.995 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.000 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399169.799766, 2e380200-8276-4470-965f-31baa0bfd760 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.001 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.047 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.052 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.063 187156 DEBUG nova.compute.manager [req-c8269dbc-bad7-44d6-8316-10ca7c139268 req-8df3314e-eb0b-4904-bac8-a32d45738af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received event network-vif-plugged-981dbb53-0eef-491c-af47-bdee2dc23b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.065 187156 DEBUG oslo_concurrency.lockutils [req-c8269dbc-bad7-44d6-8316-10ca7c139268 req-8df3314e-eb0b-4904-bac8-a32d45738af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.065 187156 DEBUG oslo_concurrency.lockutils [req-c8269dbc-bad7-44d6-8316-10ca7c139268 req-8df3314e-eb0b-4904-bac8-a32d45738af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.066 187156 DEBUG oslo_concurrency.lockutils [req-c8269dbc-bad7-44d6-8316-10ca7c139268 req-8df3314e-eb0b-4904-bac8-a32d45738af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.066 187156 DEBUG nova.compute.manager [req-c8269dbc-bad7-44d6-8316-10ca7c139268 req-8df3314e-eb0b-4904-bac8-a32d45738af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] No waiting events found dispatching network-vif-plugged-981dbb53-0eef-491c-af47-bdee2dc23b92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.067 187156 WARNING nova.compute.manager [req-c8269dbc-bad7-44d6-8316-10ca7c139268 req-8df3314e-eb0b-4904-bac8-a32d45738af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received unexpected event network-vif-plugged-981dbb53-0eef-491c-af47-bdee2dc23b92 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.071 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:52:50 np0005539504 podman[216869]: 2025-11-29 06:52:50.303834905 +0000 UTC m=+0.047040753 container create 35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 01:52:50 np0005539504 systemd[1]: Started libpod-conmon-35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c.scope.
Nov 29 01:52:50 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:52:50 np0005539504 podman[216869]: 2025-11-29 06:52:50.278078308 +0000 UTC m=+0.021284186 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:52:50 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/629fab6ca3073d59a45e1d0ffc0e8ca57b87671c7f748a27320918fdee3ba000/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:52:50 np0005539504 podman[216869]: 2025-11-29 06:52:50.389592424 +0000 UTC m=+0.132798302 container init 35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 01:52:50 np0005539504 podman[216869]: 2025-11-29 06:52:50.394360743 +0000 UTC m=+0.137566601 container start 35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:52:50 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[216884]: [NOTICE]   (216888) : New worker (216890) forked
Nov 29 01:52:50 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[216884]: [NOTICE]   (216888) : Loading success.
Nov 29 01:52:50 np0005539504 NetworkManager[55210]: <info>  [1764399170.8708] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/41)
Nov 29 01:52:50 np0005539504 NetworkManager[55210]: <info>  [1764399170.8719] device (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:52:50 np0005539504 NetworkManager[55210]: <info>  [1764399170.8741] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/42)
Nov 29 01:52:50 np0005539504 NetworkManager[55210]: <info>  [1764399170.8749] device (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:52:50 np0005539504 NetworkManager[55210]: <info>  [1764399170.8769] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Nov 29 01:52:50 np0005539504 NetworkManager[55210]: <info>  [1764399170.8782] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 29 01:52:50 np0005539504 NetworkManager[55210]: <info>  [1764399170.8791] device (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 01:52:50 np0005539504 NetworkManager[55210]: <info>  [1764399170.8798] device (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 01:52:50 np0005539504 nova_compute[187152]: 2025-11-29 06:52:50.888 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:51 np0005539504 nova_compute[187152]: 2025-11-29 06:52:51.056 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:51 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:51Z|00066|binding|INFO|Releasing lport a88e36d5-5037-4505-8d26-de14faa22faf from this chassis (sb_readonly=0)
Nov 29 01:52:51 np0005539504 ovn_controller[95182]: 2025-11-29T06:52:51Z|00067|binding|INFO|Releasing lport 90a33ad8-e32a-4cc0-85e0-1ed390ab00fa from this chassis (sb_readonly=0)
Nov 29 01:52:51 np0005539504 nova_compute[187152]: 2025-11-29 06:52:51.079 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:53 np0005539504 nova_compute[187152]: 2025-11-29 06:52:53.014 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:54 np0005539504 nova_compute[187152]: 2025-11-29 06:52:54.065 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.484 187156 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.486 187156 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.486 187156 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.487 187156 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.487 187156 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Processing event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.488 187156 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.488 187156 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.489 187156 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.489 187156 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.489 187156 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.490 187156 WARNING nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state building and task_state spawning.#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.491 187156 DEBUG nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.502 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.503 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399177.5023293, 2e380200-8276-4470-965f-31baa0bfd760 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.504 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.512 187156 INFO nova.virt.libvirt.driver [-] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Instance spawned successfully.#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.513 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.545 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.552 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.553 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.553 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.554 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.555 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.555 187156 DEBUG nova.virt.libvirt.driver [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.560 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.598 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.716 187156 INFO nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Took 12.32 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.717 187156 DEBUG nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.807 187156 INFO nova.compute.manager [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Took 12.96 seconds to build instance.#033[00m
Nov 29 01:52:57 np0005539504 nova_compute[187152]: 2025-11-29 06:52:57.831 187156 DEBUG oslo_concurrency.lockutils [None req-5dfc072b-56ba-4bfa-bfc8-27812656bb7b ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.074s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:58 np0005539504 nova_compute[187152]: 2025-11-29 06:52:58.019 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:58 np0005539504 nova_compute[187152]: 2025-11-29 06:52:58.449 187156 DEBUG nova.compute.manager [req-a429febf-a3a5-412e-b309-519cad4b1f31 req-755dd263-730d-4312-ac54-7f7894bc0793 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received event network-changed-981dbb53-0eef-491c-af47-bdee2dc23b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:52:58 np0005539504 nova_compute[187152]: 2025-11-29 06:52:58.450 187156 DEBUG nova.compute.manager [req-a429febf-a3a5-412e-b309-519cad4b1f31 req-755dd263-730d-4312-ac54-7f7894bc0793 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Refreshing instance network info cache due to event network-changed-981dbb53-0eef-491c-af47-bdee2dc23b92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:52:58 np0005539504 nova_compute[187152]: 2025-11-29 06:52:58.450 187156 DEBUG oslo_concurrency.lockutils [req-a429febf-a3a5-412e-b309-519cad4b1f31 req-755dd263-730d-4312-ac54-7f7894bc0793 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:52:58 np0005539504 nova_compute[187152]: 2025-11-29 06:52:58.451 187156 DEBUG oslo_concurrency.lockutils [req-a429febf-a3a5-412e-b309-519cad4b1f31 req-755dd263-730d-4312-ac54-7f7894bc0793 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:52:58 np0005539504 nova_compute[187152]: 2025-11-29 06:52:58.451 187156 DEBUG nova.network.neutron [req-a429febf-a3a5-412e-b309-519cad4b1f31 req-755dd263-730d-4312-ac54-7f7894bc0793 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Refreshing network info cache for port 981dbb53-0eef-491c-af47-bdee2dc23b92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:52:58 np0005539504 podman[216901]: 2025-11-29 06:52:58.744955662 +0000 UTC m=+0.073900489 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:52:58 np0005539504 podman[216902]: 2025-11-29 06:52:58.762150168 +0000 UTC m=+0.097126509 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:52:59 np0005539504 nova_compute[187152]: 2025-11-29 06:52:59.067 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:00 np0005539504 nova_compute[187152]: 2025-11-29 06:53:00.403 187156 DEBUG nova.network.neutron [req-a429febf-a3a5-412e-b309-519cad4b1f31 req-755dd263-730d-4312-ac54-7f7894bc0793 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Updated VIF entry in instance network info cache for port 981dbb53-0eef-491c-af47-bdee2dc23b92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:53:00 np0005539504 nova_compute[187152]: 2025-11-29 06:53:00.404 187156 DEBUG nova.network.neutron [req-a429febf-a3a5-412e-b309-519cad4b1f31 req-755dd263-730d-4312-ac54-7f7894bc0793 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Updating instance_info_cache with network_info: [{"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:00 np0005539504 nova_compute[187152]: 2025-11-29 06:53:00.424 187156 DEBUG oslo_concurrency.lockutils [req-a429febf-a3a5-412e-b309-519cad4b1f31 req-755dd263-730d-4312-ac54-7f7894bc0793 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:00 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:00Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a5:c3:e8 10.100.0.6
Nov 29 01:53:00 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:00Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a5:c3:e8 10.100.0.6
Nov 29 01:53:01 np0005539504 nova_compute[187152]: 2025-11-29 06:53:01.781 187156 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Check if temp file /var/lib/nova/instances/tmpg68b0jet exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 29 01:53:01 np0005539504 nova_compute[187152]: 2025-11-29 06:53:01.782 187156 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg68b0jet',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 29 01:53:02 np0005539504 nova_compute[187152]: 2025-11-29 06:53:02.629 187156 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:02 np0005539504 nova_compute[187152]: 2025-11-29 06:53:02.714 187156 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:02 np0005539504 nova_compute[187152]: 2025-11-29 06:53:02.716 187156 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:02 np0005539504 podman[216961]: 2025-11-29 06:53:02.75625231 +0000 UTC m=+0.084072155 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:53:02 np0005539504 nova_compute[187152]: 2025-11-29 06:53:02.832 187156 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:03 np0005539504 nova_compute[187152]: 2025-11-29 06:53:03.023 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.069 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.724 187156 DEBUG nova.compute.manager [req-482193e3-84f3-4bd4-8ea7-604a78c3bd2d req-6ecaf5d6-67a2-4daa-9e86-a972a2226c39 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received event network-changed-981dbb53-0eef-491c-af47-bdee2dc23b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.724 187156 DEBUG nova.compute.manager [req-482193e3-84f3-4bd4-8ea7-604a78c3bd2d req-6ecaf5d6-67a2-4daa-9e86-a972a2226c39 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Refreshing instance network info cache due to event network-changed-981dbb53-0eef-491c-af47-bdee2dc23b92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.725 187156 DEBUG oslo_concurrency.lockutils [req-482193e3-84f3-4bd4-8ea7-604a78c3bd2d req-6ecaf5d6-67a2-4daa-9e86-a972a2226c39 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.725 187156 DEBUG oslo_concurrency.lockutils [req-482193e3-84f3-4bd4-8ea7-604a78c3bd2d req-6ecaf5d6-67a2-4daa-9e86-a972a2226c39 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.726 187156 DEBUG nova.network.neutron [req-482193e3-84f3-4bd4-8ea7-604a78c3bd2d req-6ecaf5d6-67a2-4daa-9e86-a972a2226c39 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Refreshing network info cache for port 981dbb53-0eef-491c-af47-bdee2dc23b92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.781 187156 DEBUG oslo_concurrency.lockutils [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.781 187156 DEBUG oslo_concurrency.lockutils [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.782 187156 DEBUG oslo_concurrency.lockutils [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.782 187156 DEBUG oslo_concurrency.lockutils [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.782 187156 DEBUG oslo_concurrency.lockutils [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.797 187156 INFO nova.compute.manager [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Terminating instance#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.811 187156 DEBUG nova.compute.manager [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:53:04 np0005539504 kernel: tap981dbb53-0e (unregistering): left promiscuous mode
Nov 29 01:53:04 np0005539504 NetworkManager[55210]: <info>  [1764399184.8421] device (tap981dbb53-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:53:04 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:04Z|00068|binding|INFO|Releasing lport 981dbb53-0eef-491c-af47-bdee2dc23b92 from this chassis (sb_readonly=0)
Nov 29 01:53:04 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:04Z|00069|binding|INFO|Setting lport 981dbb53-0eef-491c-af47-bdee2dc23b92 down in Southbound
Nov 29 01:53:04 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:04Z|00070|binding|INFO|Removing iface tap981dbb53-0e ovn-installed in OVS
Nov 29 01:53:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:04.913 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:c3:e8 10.100.0.6'], port_security=['fa:16:3e:a5:c3:e8 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '95b3301b-b068-4bbe-bc2b-1e83aba9eadf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f562e81-d2bf-4e2c-b0ea-0aa5dfe52d68', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86b613b3-d246-4b07-a5b7-9ab1b7da74dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=981dbb53-0eef-491c-af47-bdee2dc23b92) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:04.914 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 981dbb53-0eef-491c-af47-bdee2dc23b92 in datapath 3c63c551-2e9f-4b47-9e49-c73140efe20a unbound from our chassis#033[00m
Nov 29 01:53:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:04.916 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c63c551-2e9f-4b47-9e49-c73140efe20a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.910 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:04.919 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[468ad0cd-50b4-4250-a4e4-5e6763687937]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:04 np0005539504 nova_compute[187152]: 2025-11-29 06:53:04.922 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:04.921 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a namespace which is not needed anymore#033[00m
Nov 29 01:53:04 np0005539504 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000016.scope: Deactivated successfully.
Nov 29 01:53:04 np0005539504 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000016.scope: Consumed 13.191s CPU time.
Nov 29 01:53:04 np0005539504 systemd-machined[153423]: Machine qemu-12-instance-00000016 terminated.
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.032 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.037 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539504 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216777]: [NOTICE]   (216781) : haproxy version is 2.8.14-c23fe91
Nov 29 01:53:05 np0005539504 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216777]: [NOTICE]   (216781) : path to executable is /usr/sbin/haproxy
Nov 29 01:53:05 np0005539504 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216777]: [WARNING]  (216781) : Exiting Master process...
Nov 29 01:53:05 np0005539504 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216777]: [ALERT]    (216781) : Current worker (216783) exited with code 143 (Terminated)
Nov 29 01:53:05 np0005539504 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216777]: [WARNING]  (216781) : All workers exited. Exiting... (0)
Nov 29 01:53:05 np0005539504 systemd[1]: libpod-4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4.scope: Deactivated successfully.
Nov 29 01:53:05 np0005539504 podman[217009]: 2025-11-29 06:53:05.063365061 +0000 UTC m=+0.054892095 container died 4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.069 187156 INFO nova.virt.libvirt.driver [-] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Instance destroyed successfully.#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.069 187156 DEBUG nova.objects.instance [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lazy-loading 'resources' on Instance uuid 95b3301b-b068-4bbe-bc2b-1e83aba9eadf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.088 187156 DEBUG nova.virt.libvirt.vif [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:52:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-36479523',display_name='tempest-FloatingIPsAssociationTestJSON-server-36479523',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-36479523',id=22,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71af3e88884e42c48fb244d7d6ca31e2',ramdisk_id='',reservation_id='r-ikud3p2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-940149563',owner_user_name='tempest-FloatingIPsAssociationTestJSON-940149563-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:52:48Z,user_data=None,user_id='a0fcd4f4de7e4072be30f7e3d4ac7c77',uuid=95b3301b-b068-4bbe-bc2b-1e83aba9eadf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.088 187156 DEBUG nova.network.os_vif_util [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converting VIF {"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:05 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4-userdata-shm.mount: Deactivated successfully.
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.089 187156 DEBUG nova.network.os_vif_util [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:c3:e8,bridge_name='br-int',has_traffic_filtering=True,id=981dbb53-0eef-491c-af47-bdee2dc23b92,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap981dbb53-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.090 187156 DEBUG os_vif [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:c3:e8,bridge_name='br-int',has_traffic_filtering=True,id=981dbb53-0eef-491c-af47-bdee2dc23b92,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap981dbb53-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:53:05 np0005539504 systemd[1]: var-lib-containers-storage-overlay-df5a81801d44020f95ecd136ee15bafa652bef7d555ea68ff0afc2c27f70fe4b-merged.mount: Deactivated successfully.
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.094 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.094 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap981dbb53-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.096 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.097 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539504 podman[217009]: 2025-11-29 06:53:05.102184281 +0000 UTC m=+0.093711315 container cleanup 4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.103 187156 INFO os_vif [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:c3:e8,bridge_name='br-int',has_traffic_filtering=True,id=981dbb53-0eef-491c-af47-bdee2dc23b92,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap981dbb53-0e')#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.103 187156 INFO nova.virt.libvirt.driver [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Deleting instance files /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf_del#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.104 187156 INFO nova.virt.libvirt.driver [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Deletion of /var/lib/nova/instances/95b3301b-b068-4bbe-bc2b-1e83aba9eadf_del complete#033[00m
Nov 29 01:53:05 np0005539504 systemd[1]: libpod-conmon-4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4.scope: Deactivated successfully.
Nov 29 01:53:05 np0005539504 podman[217052]: 2025-11-29 06:53:05.17390203 +0000 UTC m=+0.044648698 container remove 4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.182 187156 INFO nova.compute.manager [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.182 187156 DEBUG oslo.service.loopingcall [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.183 187156 DEBUG nova.compute.manager [-] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.183 187156 DEBUG nova.network.neutron [-] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:53:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:05.187 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1494144b-aa69-4bca-b8f1-05338e29926e]: (4, ('Sat Nov 29 06:53:04 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a (4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4)\n4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4\nSat Nov 29 06:53:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a (4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4)\n4c946df77438ffbaba2906f91f153e856a37c3abbf0cfdce73dc027a08b283b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:05.188 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ef199d69-9c48-4495-9e4a-4a5242608d71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:05.189 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c63c551-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.191 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539504 kernel: tap3c63c551-20: left promiscuous mode
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.205 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.206 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:05.210 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca75eee-8c94-4387-814a-cb01c8fd24b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:05.225 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[695a1b0c-b3dc-468b-84d4-a7c50e37dbb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:05.227 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[83619268-d9fe-41a3-a72d-0217908eef01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:05.248 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ed782148-f197-4c94-ab90-1bdd49462d94]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463023, 'reachable_time': 23989, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217069, 'error': None, 'target': 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:05 np0005539504 systemd[1]: run-netns-ovnmeta\x2d3c63c551\x2d2e9f\x2d4b47\x2d9e49\x2dc73140efe20a.mount: Deactivated successfully.
Nov 29 01:53:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:05.254 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:53:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:05.255 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[3c98a558-6f83-470b-a92c-249ce15b9ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:05 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:53:05 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:53:05 np0005539504 systemd-logind[783]: New session 29 of user nova.
Nov 29 01:53:05 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:53:05 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:53:05 np0005539504 systemd[217072]: Queued start job for default target Main User Target.
Nov 29 01:53:05 np0005539504 systemd[217072]: Created slice User Application Slice.
Nov 29 01:53:05 np0005539504 systemd[217072]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:53:05 np0005539504 systemd[217072]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:53:05 np0005539504 systemd[217072]: Reached target Paths.
Nov 29 01:53:05 np0005539504 systemd[217072]: Reached target Timers.
Nov 29 01:53:05 np0005539504 systemd[217072]: Starting D-Bus User Message Bus Socket...
Nov 29 01:53:05 np0005539504 systemd[217072]: Starting Create User's Volatile Files and Directories...
Nov 29 01:53:05 np0005539504 systemd[217072]: Finished Create User's Volatile Files and Directories.
Nov 29 01:53:05 np0005539504 systemd[217072]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:53:05 np0005539504 systemd[217072]: Reached target Sockets.
Nov 29 01:53:05 np0005539504 systemd[217072]: Reached target Basic System.
Nov 29 01:53:05 np0005539504 systemd[217072]: Reached target Main User Target.
Nov 29 01:53:05 np0005539504 systemd[217072]: Startup finished in 143ms.
Nov 29 01:53:05 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:53:05 np0005539504 systemd[1]: Started Session 29 of User nova.
Nov 29 01:53:05 np0005539504 systemd[1]: session-29.scope: Deactivated successfully.
Nov 29 01:53:05 np0005539504 systemd-logind[783]: Session 29 logged out. Waiting for processes to exit.
Nov 29 01:53:05 np0005539504 systemd-logind[783]: Removed session 29.
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.910 187156 DEBUG nova.compute.manager [req-ce4b27a8-ff8d-4aa7-aeb8-cdce6f816e95 req-07df422c-9ecd-41b0-ab54-b940110d7eab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received event network-vif-unplugged-981dbb53-0eef-491c-af47-bdee2dc23b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.911 187156 DEBUG oslo_concurrency.lockutils [req-ce4b27a8-ff8d-4aa7-aeb8-cdce6f816e95 req-07df422c-9ecd-41b0-ab54-b940110d7eab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.911 187156 DEBUG oslo_concurrency.lockutils [req-ce4b27a8-ff8d-4aa7-aeb8-cdce6f816e95 req-07df422c-9ecd-41b0-ab54-b940110d7eab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.911 187156 DEBUG oslo_concurrency.lockutils [req-ce4b27a8-ff8d-4aa7-aeb8-cdce6f816e95 req-07df422c-9ecd-41b0-ab54-b940110d7eab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.911 187156 DEBUG nova.compute.manager [req-ce4b27a8-ff8d-4aa7-aeb8-cdce6f816e95 req-07df422c-9ecd-41b0-ab54-b940110d7eab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] No waiting events found dispatching network-vif-unplugged-981dbb53-0eef-491c-af47-bdee2dc23b92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.912 187156 DEBUG nova.compute.manager [req-ce4b27a8-ff8d-4aa7-aeb8-cdce6f816e95 req-07df422c-9ecd-41b0-ab54-b940110d7eab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received event network-vif-unplugged-981dbb53-0eef-491c-af47-bdee2dc23b92 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.955 187156 DEBUG nova.network.neutron [-] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:05 np0005539504 nova_compute[187152]: 2025-11-29 06:53:05.982 187156 INFO nova.compute.manager [-] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Took 0.80 seconds to deallocate network for instance.#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.174 187156 DEBUG nova.compute.manager [req-063ef000-fed9-46a3-a523-fec5cce44961 req-3669e852-68eb-402d-9d30-c90c7fd12f20 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received event network-vif-deleted-981dbb53-0eef-491c-af47-bdee2dc23b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.250 187156 DEBUG oslo_concurrency.lockutils [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.251 187156 DEBUG oslo_concurrency.lockutils [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.289 187156 DEBUG nova.scheduler.client.report [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.327 187156 DEBUG nova.scheduler.client.report [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.328 187156 DEBUG nova.compute.provider_tree [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.352 187156 DEBUG nova.scheduler.client.report [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.374 187156 DEBUG nova.scheduler.client.report [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.441 187156 DEBUG nova.compute.provider_tree [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.457 187156 DEBUG nova.scheduler.client.report [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.494 187156 DEBUG oslo_concurrency.lockutils [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.552 187156 INFO nova.scheduler.client.report [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Deleted allocations for instance 95b3301b-b068-4bbe-bc2b-1e83aba9eadf#033[00m
Nov 29 01:53:06 np0005539504 nova_compute[187152]: 2025-11-29 06:53:06.737 187156 DEBUG oslo_concurrency.lockutils [None req-2e24a8dd-3ee8-4ec3-95af-2484890d826b a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:07 np0005539504 nova_compute[187152]: 2025-11-29 06:53:07.530 187156 DEBUG nova.network.neutron [req-482193e3-84f3-4bd4-8ea7-604a78c3bd2d req-6ecaf5d6-67a2-4daa-9e86-a972a2226c39 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Updated VIF entry in instance network info cache for port 981dbb53-0eef-491c-af47-bdee2dc23b92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:53:07 np0005539504 nova_compute[187152]: 2025-11-29 06:53:07.530 187156 DEBUG nova.network.neutron [req-482193e3-84f3-4bd4-8ea7-604a78c3bd2d req-6ecaf5d6-67a2-4daa-9e86-a972a2226c39 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Updating instance_info_cache with network_info: [{"id": "981dbb53-0eef-491c-af47-bdee2dc23b92", "address": "fa:16:3e:a5:c3:e8", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap981dbb53-0e", "ovs_interfaceid": "981dbb53-0eef-491c-af47-bdee2dc23b92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:07 np0005539504 nova_compute[187152]: 2025-11-29 06:53:07.661 187156 DEBUG oslo_concurrency.lockutils [req-482193e3-84f3-4bd4-8ea7-604a78c3bd2d req-6ecaf5d6-67a2-4daa-9e86-a972a2226c39 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-95b3301b-b068-4bbe-bc2b-1e83aba9eadf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:07 np0005539504 nova_compute[187152]: 2025-11-29 06:53:07.982 187156 INFO nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Took 5.15 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.#033[00m
Nov 29 01:53:07 np0005539504 nova_compute[187152]: 2025-11-29 06:53:07.983 187156 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.020 187156 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg68b0jet',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(7f47b478-93d7-48e7-a5c6-8e895d728c8b),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.055 187156 DEBUG nova.objects.instance [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lazy-loading 'migration_context' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.056 187156 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.058 187156 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.058 187156 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.077 187156 DEBUG nova.virt.libvirt.vif [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:52:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1351543550',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1351543550',id=23,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-68mzhrqj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:52:57Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=2e380200-8276-4470-965f-31baa0bfd760,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.078 187156 DEBUG nova.network.os_vif_util [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converting VIF {"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.078 187156 DEBUG nova.network.os_vif_util [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.079 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating guest XML with vif config: <interface type="ethernet">
Nov 29 01:53:08 np0005539504 nova_compute[187152]:  <mac address="fa:16:3e:56:a7:a1"/>
Nov 29 01:53:08 np0005539504 nova_compute[187152]:  <model type="virtio"/>
Nov 29 01:53:08 np0005539504 nova_compute[187152]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:53:08 np0005539504 nova_compute[187152]:  <mtu size="1442"/>
Nov 29 01:53:08 np0005539504 nova_compute[187152]:  <target dev="tap1ff22547-58"/>
Nov 29 01:53:08 np0005539504 nova_compute[187152]: </interface>
Nov 29 01:53:08 np0005539504 nova_compute[187152]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.079 187156 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.561 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.562 187156 INFO nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.616 187156 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received event network-vif-plugged-981dbb53-0eef-491c-af47-bdee2dc23b92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.617 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.617 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.617 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b3301b-b068-4bbe-bc2b-1e83aba9eadf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.618 187156 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] No waiting events found dispatching network-vif-plugged-981dbb53-0eef-491c-af47-bdee2dc23b92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.618 187156 WARNING nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Received unexpected event network-vif-plugged-981dbb53-0eef-491c-af47-bdee2dc23b92 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.618 187156 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.618 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.619 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.620 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.620 187156 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.621 187156 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.621 187156 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.621 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.622 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.622 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.622 187156 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.622 187156 WARNING nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.623 187156 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-changed-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.623 187156 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Refreshing instance network info cache due to event network-changed-1ff22547-5892-4360-8abe-429ea2f212ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.623 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.624 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.624 187156 DEBUG nova.network.neutron [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Refreshing network info cache for port 1ff22547-5892-4360-8abe-429ea2f212ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:53:08 np0005539504 nova_compute[187152]: 2025-11-29 06:53:08.719 187156 INFO nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 29 01:53:08 np0005539504 podman[217089]: 2025-11-29 06:53:08.724199761 +0000 UTC m=+0.067559728 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:53:09 np0005539504 nova_compute[187152]: 2025-11-29 06:53:09.070 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:09 np0005539504 nova_compute[187152]: 2025-11-29 06:53:09.222 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:09 np0005539504 nova_compute[187152]: 2025-11-29 06:53:09.222 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:53:09 np0005539504 nova_compute[187152]: 2025-11-29 06:53:09.724 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:09 np0005539504 nova_compute[187152]: 2025-11-29 06:53:09.725 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:53:09 np0005539504 nova_compute[187152]: 2025-11-29 06:53:09.780 187156 DEBUG nova.network.neutron [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updated VIF entry in instance network info cache for port 1ff22547-5892-4360-8abe-429ea2f212ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:53:09 np0005539504 nova_compute[187152]: 2025-11-29 06:53:09.780 187156 DEBUG nova.network.neutron [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:09 np0005539504 nova_compute[187152]: 2025-11-29 06:53:09.811 187156 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.097 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.229 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.230 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.594 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399190.5941586, 2e380200-8276-4470-965f-31baa0bfd760 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.595 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.633 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.639 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.674 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.735 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.736 187156 DEBUG nova.virt.libvirt.migration [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:53:10 np0005539504 kernel: tap1ff22547-58 (unregistering): left promiscuous mode
Nov 29 01:53:10 np0005539504 NetworkManager[55210]: <info>  [1764399190.7844] device (tap1ff22547-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:53:10 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:10Z|00071|binding|INFO|Releasing lport 1ff22547-5892-4360-8abe-429ea2f212ee from this chassis (sb_readonly=0)
Nov 29 01:53:10 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:10Z|00072|binding|INFO|Setting lport 1ff22547-5892-4360-8abe-429ea2f212ee down in Southbound
Nov 29 01:53:10 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:10Z|00073|binding|INFO|Removing iface tap1ff22547-58 ovn-installed in OVS
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.793 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.795 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:10.801 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:a7:a1 10.100.0.5'], port_security=['fa:16:3e:56:a7:a1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2e380200-8276-4470-965f-31baa0bfd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=1ff22547-5892-4360-8abe-429ea2f212ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:10.803 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 1ff22547-5892-4360-8abe-429ea2f212ee in datapath c691e2c0-bf24-480c-9af6-236639f0492c unbound from our chassis#033[00m
Nov 29 01:53:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:10.804 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c691e2c0-bf24-480c-9af6-236639f0492c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:53:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:10.806 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc3cc1e-bedf-4c06-a55d-97b43fb99c17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:10.807 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c namespace which is not needed anymore#033[00m
Nov 29 01:53:10 np0005539504 nova_compute[187152]: 2025-11-29 06:53:10.811 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:10 np0005539504 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000017.scope: Deactivated successfully.
Nov 29 01:53:10 np0005539504 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000017.scope: Consumed 13.309s CPU time.
Nov 29 01:53:10 np0005539504 systemd-machined[153423]: Machine qemu-13-instance-00000017 terminated.
Nov 29 01:53:10 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[216884]: [NOTICE]   (216888) : haproxy version is 2.8.14-c23fe91
Nov 29 01:53:10 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[216884]: [NOTICE]   (216888) : path to executable is /usr/sbin/haproxy
Nov 29 01:53:10 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[216884]: [WARNING]  (216888) : Exiting Master process...
Nov 29 01:53:10 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[216884]: [ALERT]    (216888) : Current worker (216890) exited with code 143 (Terminated)
Nov 29 01:53:10 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[216884]: [WARNING]  (216888) : All workers exited. Exiting... (0)
Nov 29 01:53:10 np0005539504 systemd[1]: libpod-35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c.scope: Deactivated successfully.
Nov 29 01:53:10 np0005539504 podman[217152]: 2025-11-29 06:53:10.962942324 +0000 UTC m=+0.048841142 container died 35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 01:53:11 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c-userdata-shm.mount: Deactivated successfully.
Nov 29 01:53:11 np0005539504 systemd[1]: var-lib-containers-storage-overlay-629fab6ca3073d59a45e1d0ffc0e8ca57b87671c7f748a27320918fdee3ba000-merged.mount: Deactivated successfully.
Nov 29 01:53:11 np0005539504 podman[217152]: 2025-11-29 06:53:11.017245612 +0000 UTC m=+0.103144440 container cleanup 35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:53:11 np0005539504 systemd[1]: libpod-conmon-35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c.scope: Deactivated successfully.
Nov 29 01:53:11 np0005539504 nova_compute[187152]: 2025-11-29 06:53:11.040 187156 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 29 01:53:11 np0005539504 nova_compute[187152]: 2025-11-29 06:53:11.042 187156 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 29 01:53:11 np0005539504 nova_compute[187152]: 2025-11-29 06:53:11.042 187156 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 29 01:53:11 np0005539504 podman[217202]: 2025-11-29 06:53:11.094453121 +0000 UTC m=+0.046767497 container remove 35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:53:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:11.100 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7d878e7f-1427-43df-b0c2-637ccf08a1d7]: (4, ('Sat Nov 29 06:53:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c (35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c)\n35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c\nSat Nov 29 06:53:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c (35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c)\n35314cdfdca5aa745c985fa41c6f18977a6bb168ea03fadad122cac7b9d3325c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:11.101 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[14b2ed0d-e673-4a9f-9b2f-2297ea4626bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:11.102 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:11 np0005539504 kernel: tapc691e2c0-b0: left promiscuous mode
Nov 29 01:53:11 np0005539504 nova_compute[187152]: 2025-11-29 06:53:11.105 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:11 np0005539504 nova_compute[187152]: 2025-11-29 06:53:11.123 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:11.127 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[549656c8-98d8-4cab-965b-79aa43758602]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:11.139 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[60f8ecfb-c340-46c6-a15c-0aa192d2a54b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:11.140 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bd701808-f883-414b-b821-c5d1cc1a452a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:11.153 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e2fdf8d8-5d20-4e8d-8139-5c27aabcc4b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463243, 'reachable_time': 33926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217222, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:11.155 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:53:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:11.155 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[abe4e7c9-5f11-4c0a-8eff-ad809d13d2a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:11 np0005539504 systemd[1]: run-netns-ovnmeta\x2dc691e2c0\x2dbf24\x2d480c\x2d9af6\x2d236639f0492c.mount: Deactivated successfully.
Nov 29 01:53:11 np0005539504 nova_compute[187152]: 2025-11-29 06:53:11.239 187156 DEBUG nova.virt.libvirt.guest [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '2e380200-8276-4470-965f-31baa0bfd760' (instance-00000017) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 29 01:53:11 np0005539504 nova_compute[187152]: 2025-11-29 06:53:11.240 187156 INFO nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration operation has completed#033[00m
Nov 29 01:53:11 np0005539504 nova_compute[187152]: 2025-11-29 06:53:11.240 187156 INFO nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] _post_live_migration() is started..#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.376 187156 DEBUG nova.compute.manager [req-0260a701-895e-419f-a371-012ee64cf2b5 req-c4bed255-7925-4b4c-ba01-cf8e2545c03b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.377 187156 DEBUG oslo_concurrency.lockutils [req-0260a701-895e-419f-a371-012ee64cf2b5 req-c4bed255-7925-4b4c-ba01-cf8e2545c03b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.377 187156 DEBUG oslo_concurrency.lockutils [req-0260a701-895e-419f-a371-012ee64cf2b5 req-c4bed255-7925-4b4c-ba01-cf8e2545c03b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.377 187156 DEBUG oslo_concurrency.lockutils [req-0260a701-895e-419f-a371-012ee64cf2b5 req-c4bed255-7925-4b4c-ba01-cf8e2545c03b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.377 187156 DEBUG nova.compute.manager [req-0260a701-895e-419f-a371-012ee64cf2b5 req-c4bed255-7925-4b4c-ba01-cf8e2545c03b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.378 187156 DEBUG nova.compute.manager [req-0260a701-895e-419f-a371-012ee64cf2b5 req-c4bed255-7925-4b4c-ba01-cf8e2545c03b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.665 187156 DEBUG nova.network.neutron [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Activated binding for port 1ff22547-5892-4360-8abe-429ea2f212ee and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.666 187156 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.668 187156 DEBUG nova.virt.libvirt.vif [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:52:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1351543550',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1351543550',id=23,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-68mzhrqj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:53:01Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=2e380200-8276-4470-965f-31baa0bfd760,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.669 187156 DEBUG nova.network.os_vif_util [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converting VIF {"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.670 187156 DEBUG nova.network.os_vif_util [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.671 187156 DEBUG os_vif [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.675 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.676 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ff22547-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.679 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.681 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.685 187156 INFO os_vif [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58')#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.686 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.687 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.688 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.688 187156 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.689 187156 INFO nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Deleting instance files /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760_del#033[00m
Nov 29 01:53:13 np0005539504 nova_compute[187152]: 2025-11-29 06:53:13.691 187156 INFO nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Deletion of /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760_del complete#033[00m
Nov 29 01:53:14 np0005539504 nova_compute[187152]: 2025-11-29 06:53:14.073 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.526 187156 DEBUG nova.compute.manager [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.527 187156 DEBUG oslo_concurrency.lockutils [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.527 187156 DEBUG oslo_concurrency.lockutils [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.527 187156 DEBUG oslo_concurrency.lockutils [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.527 187156 DEBUG nova.compute.manager [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.527 187156 WARNING nova.compute.manager [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.528 187156 DEBUG nova.compute.manager [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.528 187156 DEBUG oslo_concurrency.lockutils [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.528 187156 DEBUG oslo_concurrency.lockutils [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.528 187156 DEBUG oslo_concurrency.lockutils [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.528 187156 DEBUG nova.compute.manager [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.529 187156 WARNING nova.compute.manager [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.529 187156 DEBUG nova.compute.manager [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.529 187156 DEBUG oslo_concurrency.lockutils [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.529 187156 DEBUG oslo_concurrency.lockutils [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.529 187156 DEBUG oslo_concurrency.lockutils [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.530 187156 DEBUG nova.compute.manager [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:15 np0005539504 nova_compute[187152]: 2025-11-29 06:53:15.530 187156 WARNING nova.compute.manager [req-88a05885-f5dc-4b6a-b6a3-df4bf93d7df4 req-2664c81b-9043-49b0-a721-8502f0ecae27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:53:15 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:53:15 np0005539504 systemd[217072]: Activating special unit Exit the Session...
Nov 29 01:53:15 np0005539504 systemd[217072]: Stopped target Main User Target.
Nov 29 01:53:15 np0005539504 systemd[217072]: Stopped target Basic System.
Nov 29 01:53:15 np0005539504 systemd[217072]: Stopped target Paths.
Nov 29 01:53:15 np0005539504 systemd[217072]: Stopped target Sockets.
Nov 29 01:53:15 np0005539504 systemd[217072]: Stopped target Timers.
Nov 29 01:53:15 np0005539504 systemd[217072]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:53:15 np0005539504 systemd[217072]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:53:15 np0005539504 systemd[217072]: Closed D-Bus User Message Bus Socket.
Nov 29 01:53:15 np0005539504 systemd[217072]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:53:15 np0005539504 systemd[217072]: Removed slice User Application Slice.
Nov 29 01:53:15 np0005539504 systemd[217072]: Reached target Shutdown.
Nov 29 01:53:15 np0005539504 systemd[217072]: Finished Exit the Session.
Nov 29 01:53:15 np0005539504 systemd[217072]: Reached target Exit the Session.
Nov 29 01:53:15 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:53:15 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:53:15 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:53:15 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:53:15 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:53:15 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:53:15 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:53:15 np0005539504 podman[217224]: 2025-11-29 06:53:15.754496042 +0000 UTC m=+0.082902053 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:53:18 np0005539504 nova_compute[187152]: 2025-11-29 06:53:18.679 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:19 np0005539504 nova_compute[187152]: 2025-11-29 06:53:19.074 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:19 np0005539504 podman[217245]: 2025-11-29 06:53:19.716029603 +0000 UTC m=+0.057401534 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 01:53:19 np0005539504 podman[217244]: 2025-11-29 06:53:19.730510576 +0000 UTC m=+0.074431486 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:53:20 np0005539504 nova_compute[187152]: 2025-11-29 06:53:20.068 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399185.0672593, 95b3301b-b068-4bbe-bc2b-1e83aba9eadf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:20 np0005539504 nova_compute[187152]: 2025-11-29 06:53:20.069 187156 INFO nova.compute.manager [-] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:53:20 np0005539504 nova_compute[187152]: 2025-11-29 06:53:20.198 187156 DEBUG nova.compute.manager [None req-ab267089-46e6-4fd5-865a-6fbbc98f032f - - - - - -] [instance: 95b3301b-b068-4bbe-bc2b-1e83aba9eadf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:20.932 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:20.933 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:53:20 np0005539504 nova_compute[187152]: 2025-11-29 06:53:20.939 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.118 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.119 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.119 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.145 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.146 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.146 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.146 187156 DEBUG nova.compute.resource_tracker [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.350 187156 WARNING nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.351 187156 DEBUG nova.compute.resource_tracker [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5750MB free_disk=73.24018859863281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": 
"0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.351 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.352 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.387 187156 DEBUG nova.compute.resource_tracker [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Migration for instance 2e380200-8276-4470-965f-31baa0bfd760 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.408 187156 DEBUG nova.compute.resource_tracker [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.496 187156 DEBUG nova.compute.resource_tracker [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Migration 7f47b478-93d7-48e7-a5c6-8e895d728c8b is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.497 187156 DEBUG nova.compute.resource_tracker [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.497 187156 DEBUG nova.compute.resource_tracker [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.550 187156 DEBUG nova.compute.provider_tree [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.567 187156 DEBUG nova.scheduler.client.report [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.592 187156 DEBUG nova.compute.resource_tracker [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.593 187156 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.619 187156 INFO nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migrating instance to compute-2.ctlplane.example.com finished successfully.#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.737 187156 INFO nova.scheduler.client.report [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Deleted allocation for migration 7f47b478-93d7-48e7-a5c6-8e895d728c8b#033[00m
Nov 29 01:53:21 np0005539504 nova_compute[187152]: 2025-11-29 06:53:21.738 187156 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 29 01:53:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:22.908 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:22.909 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:22.909 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:23 np0005539504 nova_compute[187152]: 2025-11-29 06:53:23.700 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:24 np0005539504 nova_compute[187152]: 2025-11-29 06:53:24.076 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:24 np0005539504 nova_compute[187152]: 2025-11-29 06:53:24.227 187156 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Creating tmpfile /var/lib/nova/instances/tmpy70gw60a to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 29 01:53:24 np0005539504 nova_compute[187152]: 2025-11-29 06:53:24.228 187156 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy70gw60a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 29 01:53:25 np0005539504 nova_compute[187152]: 2025-11-29 06:53:25.834 187156 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy70gw60a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 29 01:53:25 np0005539504 nova_compute[187152]: 2025-11-29 06:53:25.927 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:25 np0005539504 nova_compute[187152]: 2025-11-29 06:53:25.927 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquired lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:25 np0005539504 nova_compute[187152]: 2025-11-29 06:53:25.927 187156 DEBUG nova.network.neutron [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:53:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:25.937 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:26 np0005539504 nova_compute[187152]: 2025-11-29 06:53:26.036 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399191.0360556, 2e380200-8276-4470-965f-31baa0bfd760 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:26 np0005539504 nova_compute[187152]: 2025-11-29 06:53:26.037 187156 INFO nova.compute.manager [-] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:53:26 np0005539504 nova_compute[187152]: 2025-11-29 06:53:26.060 187156 DEBUG nova.compute.manager [None req-21ebabcf-5dd9-4807-b778-6eae7b1c08a9 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.697 187156 DEBUG nova.network.neutron [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.725 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Releasing lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.750 187156 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy70gw60a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.750 187156 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Creating instance directory: /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.751 187156 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Creating disk.info with the contents: {'/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk': 'qcow2', '/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.751 187156 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.752 187156 DEBUG nova.objects.instance [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.828 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.886 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.888 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.889 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.911 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.994 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:27 np0005539504 nova_compute[187152]: 2025-11-29 06:53:27.995 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.036 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.038 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.038 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.099 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.101 187156 DEBUG nova.virt.disk.api [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Checking if we can resize image /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.102 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.156 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.157 187156 DEBUG nova.virt.disk.api [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Cannot resize image /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.158 187156 DEBUG nova.objects.instance [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lazy-loading 'migration_context' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.171 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.193 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config 485376" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.195 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config to /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.195 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.304 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.570 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.694 187156 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.695 187156 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.696 187156 DEBUG nova.virt.libvirt.vif [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:52:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1351543550',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1351543550',id=23,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-68mzhrqj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:53:20Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=2e380200-8276-4470-965f-31baa0bfd760,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.696 187156 DEBUG nova.network.os_vif_util [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converting VIF {"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.697 187156 DEBUG nova.network.os_vif_util [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.698 187156 DEBUG os_vif [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.699 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.699 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.700 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.701 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.703 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.703 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ff22547-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.704 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1ff22547-58, col_values=(('external_ids', {'iface-id': '1ff22547-5892-4360-8abe-429ea2f212ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:a7:a1', 'vm-uuid': '2e380200-8276-4470-965f-31baa0bfd760'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:53:28 np0005539504 NetworkManager[55210]: <info>  [1764399208.7066] manager: (tap1ff22547-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.708 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.715 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.716 187156 INFO os_vif [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58')
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.717 187156 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 29 01:53:28 np0005539504 nova_compute[187152]: 2025-11-29 06:53:28.717 187156 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy70gw60a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 29 01:53:29 np0005539504 nova_compute[187152]: 2025-11-29 06:53:29.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:29 np0005539504 podman[217312]: 2025-11-29 06:53:29.732176481 +0000 UTC m=+0.072969174 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:53:29 np0005539504 podman[217313]: 2025-11-29 06:53:29.769389327 +0000 UTC m=+0.107732664 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 01:53:30 np0005539504 nova_compute[187152]: 2025-11-29 06:53:30.316 187156 DEBUG nova.network.neutron [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Port 1ff22547-5892-4360-8abe-429ea2f212ee updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 29 01:53:30 np0005539504 nova_compute[187152]: 2025-11-29 06:53:30.328 187156 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy70gw60a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 29 01:53:30 np0005539504 systemd[1]: Starting libvirt proxy daemon...
Nov 29 01:53:30 np0005539504 systemd[1]: Started libvirt proxy daemon.
Nov 29 01:53:30 np0005539504 kernel: tap1ff22547-58: entered promiscuous mode
Nov 29 01:53:30 np0005539504 NetworkManager[55210]: <info>  [1764399210.6020] manager: (tap1ff22547-58): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Nov 29 01:53:30 np0005539504 nova_compute[187152]: 2025-11-29 06:53:30.604 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:30 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:30Z|00074|binding|INFO|Claiming lport 1ff22547-5892-4360-8abe-429ea2f212ee for this additional chassis.
Nov 29 01:53:30 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:30Z|00075|binding|INFO|1ff22547-5892-4360-8abe-429ea2f212ee: Claiming fa:16:3e:56:a7:a1 10.100.0.5
Nov 29 01:53:30 np0005539504 nova_compute[187152]: 2025-11-29 06:53:30.607 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:30 np0005539504 systemd-udevd[217389]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:53:30 np0005539504 NetworkManager[55210]: <info>  [1764399210.6508] device (tap1ff22547-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:53:30 np0005539504 NetworkManager[55210]: <info>  [1764399210.6520] device (tap1ff22547-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:53:30 np0005539504 systemd-machined[153423]: New machine qemu-14-instance-00000017.
Nov 29 01:53:30 np0005539504 nova_compute[187152]: 2025-11-29 06:53:30.678 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:30 np0005539504 systemd[1]: Started Virtual Machine qemu-14-instance-00000017.
Nov 29 01:53:30 np0005539504 nova_compute[187152]: 2025-11-29 06:53:30.683 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:30 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:30Z|00076|binding|INFO|Setting lport 1ff22547-5892-4360-8abe-429ea2f212ee ovn-installed in OVS
Nov 29 01:53:30 np0005539504 nova_compute[187152]: 2025-11-29 06:53:30.687 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:30 np0005539504 nova_compute[187152]: 2025-11-29 06:53:30.688 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:32 np0005539504 nova_compute[187152]: 2025-11-29 06:53:32.046 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399212.045446, 2e380200-8276-4470-965f-31baa0bfd760 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:53:32 np0005539504 nova_compute[187152]: 2025-11-29 06:53:32.046 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Started (Lifecycle Event)
Nov 29 01:53:32 np0005539504 nova_compute[187152]: 2025-11-29 06:53:32.068 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:53:33 np0005539504 nova_compute[187152]: 2025-11-29 06:53:33.062 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399213.0624382, 2e380200-8276-4470-965f-31baa0bfd760 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:53:33 np0005539504 nova_compute[187152]: 2025-11-29 06:53:33.063 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Resumed (Lifecycle Event)
Nov 29 01:53:33 np0005539504 nova_compute[187152]: 2025-11-29 06:53:33.093 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:53:33 np0005539504 nova_compute[187152]: 2025-11-29 06:53:33.098 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:53:33 np0005539504 nova_compute[187152]: 2025-11-29 06:53:33.156 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com
Nov 29 01:53:33 np0005539504 nova_compute[187152]: 2025-11-29 06:53:33.706 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:33 np0005539504 podman[217418]: 2025-11-29 06:53:33.777696193 +0000 UTC m=+0.104768354 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm)
Nov 29 01:53:34 np0005539504 nova_compute[187152]: 2025-11-29 06:53:34.079 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:34 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:34Z|00077|binding|INFO|Claiming lport 1ff22547-5892-4360-8abe-429ea2f212ee for this chassis.
Nov 29 01:53:34 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:34Z|00078|binding|INFO|1ff22547-5892-4360-8abe-429ea2f212ee: Claiming fa:16:3e:56:a7:a1 10.100.0.5
Nov 29 01:53:34 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:34Z|00079|binding|INFO|Setting lport 1ff22547-5892-4360-8abe-429ea2f212ee up in Southbound
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.558 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:a7:a1 10.100.0.5'], port_security=['fa:16:3e:56:a7:a1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2e380200-8276-4470-965f-31baa0bfd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '20', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=1ff22547-5892-4360-8abe-429ea2f212ee) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.559 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 1ff22547-5892-4360-8abe-429ea2f212ee in datapath c691e2c0-bf24-480c-9af6-236639f0492c bound to our chassis#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.560 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c691e2c0-bf24-480c-9af6-236639f0492c#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.576 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4af01a14-4758-450c-b250-63915165fd36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.578 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc691e2c0-b1 in ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.580 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc691e2c0-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.581 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6300c6-4a33-4db2-8229-ffcf8f248af0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.581 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8a17f7-0376-491a-8244-343c618f52ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.599 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[31953180-9bdb-4419-bcde-20c46dcbda10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.613 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[aaef1b01-99de-42a7-a8fe-318d39a3e67c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.648 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4011633d-effb-495b-898c-7f0226926b09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 NetworkManager[55210]: <info>  [1764399214.6591] manager: (tapc691e2c0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.657 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0861a1-7b1a-4c4a-82a3-5368ec5e1750]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 systemd-udevd[217446]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.688 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4a3e7af2-f1b7-4d78-955f-b38951349206]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.694 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[38532ca4-71b2-4507-81c2-b610aa875172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 nova_compute[187152]: 2025-11-29 06:53:34.710 187156 INFO nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Post operation of migration started#033[00m
Nov 29 01:53:34 np0005539504 NetworkManager[55210]: <info>  [1764399214.7205] device (tapc691e2c0-b0): carrier: link connected
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.726 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[59c9dec2-bcd6-4ff1-bd82-177162d64592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.743 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fca8c0ab-7551-41d8-8b94-ef171066ec81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467760, 'reachable_time': 25936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217465, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.757 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c59788-c620-44bd-a0a0-e92a857a9cc7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:3d81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467760, 'tstamp': 467760}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217466, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.772 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b0335bb5-1cd2-460c-9326-0f250ff8277e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467760, 'reachable_time': 25936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217467, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.805 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9c411ce0-7d5f-4581-bda3-c47e22defaf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.858 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9f38b4c4-e0af-4653-895e-35eaaf3e3326]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.860 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.860 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.861 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc691e2c0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:34 np0005539504 nova_compute[187152]: 2025-11-29 06:53:34.863 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:34 np0005539504 NetworkManager[55210]: <info>  [1764399214.8636] manager: (tapc691e2c0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 29 01:53:34 np0005539504 kernel: tapc691e2c0-b0: entered promiscuous mode
Nov 29 01:53:34 np0005539504 nova_compute[187152]: 2025-11-29 06:53:34.866 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.867 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc691e2c0-b0, col_values=(('external_ids', {'iface-id': 'a88e36d5-5037-4505-8d26-de14faa22faf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:34 np0005539504 nova_compute[187152]: 2025-11-29 06:53:34.869 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:34 np0005539504 ovn_controller[95182]: 2025-11-29T06:53:34Z|00080|binding|INFO|Releasing lport a88e36d5-5037-4505-8d26-de14faa22faf from this chassis (sb_readonly=0)
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.872 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.873 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[14cc94ca-df32-40bf-951e-cfd1889aca5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.874 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-c691e2c0-bf24-480c-9af6-236639f0492c
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID c691e2c0-bf24-480c-9af6-236639f0492c
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:53:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:53:34.874 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'env', 'PROCESS_TAG=haproxy-c691e2c0-bf24-480c-9af6-236639f0492c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c691e2c0-bf24-480c-9af6-236639f0492c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:53:34 np0005539504 nova_compute[187152]: 2025-11-29 06:53:34.881 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:35 np0005539504 nova_compute[187152]: 2025-11-29 06:53:35.256 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:35 np0005539504 nova_compute[187152]: 2025-11-29 06:53:35.257 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquired lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:35 np0005539504 nova_compute[187152]: 2025-11-29 06:53:35.257 187156 DEBUG nova.network.neutron [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:53:35 np0005539504 podman[217500]: 2025-11-29 06:53:35.277964006 +0000 UTC m=+0.059175242 container create 32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 01:53:35 np0005539504 systemd[1]: Started libpod-conmon-32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76.scope.
Nov 29 01:53:35 np0005539504 podman[217500]: 2025-11-29 06:53:35.244922832 +0000 UTC m=+0.026134078 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:53:35 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:53:35 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad5ab4155148d837a58dda34f70edbc45147108a88c310dbfcd769d4a8cd4ff7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:53:35 np0005539504 podman[217500]: 2025-11-29 06:53:35.368495063 +0000 UTC m=+0.149706329 container init 32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 01:53:35 np0005539504 podman[217500]: 2025-11-29 06:53:35.374514137 +0000 UTC m=+0.155725363 container start 32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 01:53:35 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217515]: [NOTICE]   (217519) : New worker (217521) forked
Nov 29 01:53:35 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217515]: [NOTICE]   (217519) : Loading success.
Nov 29 01:53:36 np0005539504 nova_compute[187152]: 2025-11-29 06:53:36.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:37 np0005539504 nova_compute[187152]: 2025-11-29 06:53:37.236 187156 DEBUG nova.network.neutron [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:37 np0005539504 nova_compute[187152]: 2025-11-29 06:53:37.267 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Releasing lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:37 np0005539504 nova_compute[187152]: 2025-11-29 06:53:37.293 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:37 np0005539504 nova_compute[187152]: 2025-11-29 06:53:37.294 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:37 np0005539504 nova_compute[187152]: 2025-11-29 06:53:37.294 187156 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:37 np0005539504 nova_compute[187152]: 2025-11-29 06:53:37.300 187156 INFO nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 29 01:53:37 np0005539504 virtqemud[186569]: Domain id=14 name='instance-00000017' uuid=2e380200-8276-4470-965f-31baa0bfd760 is tainted: custom-monitor
Nov 29 01:53:38 np0005539504 nova_compute[187152]: 2025-11-29 06:53:38.310 187156 INFO nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 29 01:53:38 np0005539504 nova_compute[187152]: 2025-11-29 06:53:38.757 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:38 np0005539504 nova_compute[187152]: 2025-11-29 06:53:38.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:38 np0005539504 nova_compute[187152]: 2025-11-29 06:53:38.950 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:38 np0005539504 nova_compute[187152]: 2025-11-29 06:53:38.951 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.081 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.316 187156 INFO nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.323 187156 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.346 187156 DEBUG nova.objects.instance [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:53:39 np0005539504 podman[217530]: 2025-11-29 06:53:39.735721438 +0000 UTC m=+0.071288729 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.980 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.980 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.981 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:39 np0005539504 nova_compute[187152]: 2025-11-29 06:53:39.981 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.114 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.187 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.189 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.252 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.438 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.439 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5577MB free_disk=73.21106338500977GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.439 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.439 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.488 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Applying migration context for instance 2e380200-8276-4470-965f-31baa0bfd760 as it has an incoming, in-progress migration f9e8d008-f146-49a2-a2c0-835a0311a251. Migration status is running _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.489 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.490 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.517 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 2e380200-8276-4470-965f-31baa0bfd760 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.517 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.518 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.585 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.603 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.630 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:53:40 np0005539504 nova_compute[187152]: 2025-11-29 06:53:40.630 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:41 np0005539504 nova_compute[187152]: 2025-11-29 06:53:41.631 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:43 np0005539504 nova_compute[187152]: 2025-11-29 06:53:43.790 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:43 np0005539504 nova_compute[187152]: 2025-11-29 06:53:43.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:43 np0005539504 nova_compute[187152]: 2025-11-29 06:53:43.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:53:44 np0005539504 nova_compute[187152]: 2025-11-29 06:53:44.085 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:44 np0005539504 nova_compute[187152]: 2025-11-29 06:53:44.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:44 np0005539504 nova_compute[187152]: 2025-11-29 06:53:44.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:53:44 np0005539504 nova_compute[187152]: 2025-11-29 06:53:44.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:53:45 np0005539504 nova_compute[187152]: 2025-11-29 06:53:45.209 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:45 np0005539504 nova_compute[187152]: 2025-11-29 06:53:45.210 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:45 np0005539504 nova_compute[187152]: 2025-11-29 06:53:45.210 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:53:45 np0005539504 nova_compute[187152]: 2025-11-29 06:53:45.210 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:46 np0005539504 podman[217557]: 2025-11-29 06:53:46.754162176 +0000 UTC m=+0.086543931 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:53:47 np0005539504 nova_compute[187152]: 2025-11-29 06:53:47.812 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:47 np0005539504 nova_compute[187152]: 2025-11-29 06:53:47.835 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:47 np0005539504 nova_compute[187152]: 2025-11-29 06:53:47.836 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:53:47 np0005539504 nova_compute[187152]: 2025-11-29 06:53:47.836 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:48 np0005539504 nova_compute[187152]: 2025-11-29 06:53:48.793 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:49 np0005539504 nova_compute[187152]: 2025-11-29 06:53:49.086 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:50 np0005539504 podman[217577]: 2025-11-29 06:53:50.706348524 +0000 UTC m=+0.049163530 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:53:50 np0005539504 podman[217578]: 2025-11-29 06:53:50.727793274 +0000 UTC m=+0.063549419 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64)
Nov 29 01:53:53 np0005539504 nova_compute[187152]: 2025-11-29 06:53:53.794 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:54 np0005539504 nova_compute[187152]: 2025-11-29 06:53:54.145 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.633 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "2213bf07-673d-465e-aa69-032b0bdc9ff2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.634 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2213bf07-673d-465e-aa69-032b0bdc9ff2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.658 187156 DEBUG nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.796 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.800 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.800 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.808 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.808 187156 INFO nova.compute.claims [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.952 187156 DEBUG nova.compute.provider_tree [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:53:58 np0005539504 nova_compute[187152]: 2025-11-29 06:53:58.988 187156 DEBUG nova.scheduler.client.report [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.021 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.021 187156 DEBUG nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.086 187156 DEBUG nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.087 187156 DEBUG nova.network.neutron [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.107 187156 INFO nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.137 187156 DEBUG nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.148 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.323 187156 DEBUG nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.325 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.325 187156 INFO nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Creating image(s)#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.326 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "/var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.327 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "/var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.328 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "/var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.347 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.387 187156 DEBUG nova.network.neutron [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.388 187156 DEBUG nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.404 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.404 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.405 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.415 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.482 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.483 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.523 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.524 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.524 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.589 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.590 187156 DEBUG nova.virt.disk.api [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Checking if we can resize image /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.590 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.646 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.647 187156 DEBUG nova.virt.disk.api [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Cannot resize image /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.647 187156 DEBUG nova.objects.instance [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lazy-loading 'migration_context' on Instance uuid 2213bf07-673d-465e-aa69-032b0bdc9ff2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.666 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.666 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Ensure instance console log exists: /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.667 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.667 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.667 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.669 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.673 187156 WARNING nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.679 187156 DEBUG nova.virt.libvirt.host [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.679 187156 DEBUG nova.virt.libvirt.host [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.683 187156 DEBUG nova.virt.libvirt.host [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.683 187156 DEBUG nova.virt.libvirt.host [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.684 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.685 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.685 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.685 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.685 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.685 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.685 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.686 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.686 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.686 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.686 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.686 187156 DEBUG nova.virt.hardware [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.690 187156 DEBUG nova.objects.instance [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lazy-loading 'pci_devices' on Instance uuid 2213bf07-673d-465e-aa69-032b0bdc9ff2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.719 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <uuid>2213bf07-673d-465e-aa69-032b0bdc9ff2</uuid>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <name>instance-0000001c</name>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <nova:name>tempest-ListImageFiltersTestJSON-server-2097127586</nova:name>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:53:59</nova:creationTime>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:        <nova:user uuid="c480a3bf2f8f485889154b20872eada2">tempest-ListImageFiltersTestJSON-573297769-project-member</nova:user>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:        <nova:project uuid="34d1587b0fbb4c3cbf3c8c4a71d1a6be">tempest-ListImageFiltersTestJSON-573297769</nova:project>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <entry name="serial">2213bf07-673d-465e-aa69-032b0bdc9ff2</entry>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <entry name="uuid">2213bf07-673d-465e-aa69-032b0bdc9ff2</entry>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk.config"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/console.log" append="off"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:53:59 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:53:59 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:53:59 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:53:59 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.777 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.777 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:53:59 np0005539504 nova_compute[187152]: 2025-11-29 06:53:59.778 187156 INFO nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Using config drive#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.022 187156 INFO nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Creating config drive at /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk.config#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.027 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4n33aim2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.151 187156 DEBUG oslo_concurrency.processutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4n33aim2" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:00 np0005539504 systemd-machined[153423]: New machine qemu-15-instance-0000001c.
Nov 29 01:54:00 np0005539504 systemd[1]: Started Virtual Machine qemu-15-instance-0000001c.
Nov 29 01:54:00 np0005539504 podman[217649]: 2025-11-29 06:54:00.260494847 +0000 UTC m=+0.050892128 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:54:00 np0005539504 podman[217650]: 2025-11-29 06:54:00.296762867 +0000 UTC m=+0.084572648 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.764 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399240.7636073, 2213bf07-673d-465e-aa69-032b0bdc9ff2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.764 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.767 187156 DEBUG nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.768 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.771 187156 INFO nova.virt.libvirt.driver [-] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Instance spawned successfully.#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.771 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.786 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.791 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.800 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.800 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.801 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.801 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.801 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.802 187156 DEBUG nova.virt.libvirt.driver [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.807 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.808 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399240.7668195, 2213bf07-673d-465e-aa69-032b0bdc9ff2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.808 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] VM Started (Lifecycle Event)#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.830 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.833 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.874 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.905 187156 INFO nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Took 1.58 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:54:00 np0005539504 nova_compute[187152]: 2025-11-29 06:54:00.906 187156 DEBUG nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:01 np0005539504 nova_compute[187152]: 2025-11-29 06:54:01.016 187156 INFO nova.compute.manager [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Took 2.30 seconds to build instance.#033[00m
Nov 29 01:54:01 np0005539504 nova_compute[187152]: 2025-11-29 06:54:01.046 187156 DEBUG oslo_concurrency.lockutils [None req-f92c8638-c1a5-4ca9-9d62-32a02ea821c1 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2213bf07-673d-465e-aa69-032b0bdc9ff2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:03 np0005539504 nova_compute[187152]: 2025-11-29 06:54:03.798 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:04 np0005539504 nova_compute[187152]: 2025-11-29 06:54:04.149 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:04 np0005539504 nova_compute[187152]: 2025-11-29 06:54:04.320 187156 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Creating tmpfile /var/lib/nova/instances/tmpkxgr55ve to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 29 01:54:04 np0005539504 nova_compute[187152]: 2025-11-29 06:54:04.321 187156 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkxgr55ve',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 29 01:54:04 np0005539504 podman[217715]: 2025-11-29 06:54:04.716806898 +0000 UTC m=+0.057838224 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 01:54:05 np0005539504 nova_compute[187152]: 2025-11-29 06:54:05.875 187156 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkxgr55ve',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 29 01:54:05 np0005539504 nova_compute[187152]: 2025-11-29 06:54:05.906 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:54:05 np0005539504 nova_compute[187152]: 2025-11-29 06:54:05.907 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquired lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:54:05 np0005539504 nova_compute[187152]: 2025-11-29 06:54:05.907 187156 DEBUG nova.network.neutron [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.277 187156 DEBUG nova.compute.manager [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.346 187156 INFO nova.compute.manager [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] instance snapshotting#033[00m
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.538 187156 DEBUG nova.network.neutron [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Updating instance_info_cache with network_info: [{"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.565 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Releasing lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.583 187156 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkxgr55ve',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.584 187156 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Creating instance directory: /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.585 187156 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Creating disk.info with the contents: {'/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk': 'qcow2', '/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.585 187156 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.586 187156 DEBUG nova.objects.instance [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.618 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.672 187156 INFO nova.virt.libvirt.driver [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Beginning live snapshot process
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.701 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.702 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.702 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.714 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.775 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.776 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.811 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.813 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.814 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.871 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.872 187156 DEBUG nova.virt.disk.api [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Checking if we can resize image /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.873 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:07 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.912 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.928 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.929 187156 DEBUG nova.virt.disk.api [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Cannot resize image /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.930 187156 DEBUG nova.objects.instance [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lazy-loading 'migration_context' on Instance uuid 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.945 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.969 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json -f qcow2" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.971 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.997 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config 485376" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:07 np0005539504 nova_compute[187152]: 2025-11-29 06:54:07.999 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config to /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.000 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.037 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.053 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.117 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.118 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpiisxzxod/14205b885df7461bb2e5d8d17533cc37.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.153 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpiisxzxod/14205b885df7461bb2e5d8d17533cc37.delta 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.155 187156 INFO nova.virt.libvirt.driver [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.224 187156 DEBUG nova.virt.libvirt.guest [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.228 187156 INFO nova.virt.libvirt.driver [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.282 187156 DEBUG nova.privsep.utils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.283 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpiisxzxod/14205b885df7461bb2e5d8d17533cc37.delta /var/lib/nova/instances/snapshots/tmpiisxzxod/14205b885df7461bb2e5d8d17533cc37 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.467 187156 DEBUG oslo_concurrency.processutils [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpiisxzxod/14205b885df7461bb2e5d8d17533cc37.delta /var/lib/nova/instances/snapshots/tmpiisxzxod/14205b885df7461bb2e5d8d17533cc37" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.469 187156 INFO nova.virt.libvirt.driver [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Snapshot extracted, beginning image upload
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.550 187156 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" returned: 0 in 0.550s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.551 187156 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.552 187156 DEBUG nova.virt.libvirt.vif [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:53:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1120356955',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1120356955',id=27,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:54:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-x246813w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:54:00Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.552 187156 DEBUG nova.network.os_vif_util [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converting VIF {"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.554 187156 DEBUG nova.network.os_vif_util [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.554 187156 DEBUG os_vif [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.555 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.556 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.556 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.561 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.561 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap38a0a755-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.562 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap38a0a755-af, col_values=(('external_ids', {'iface-id': '38a0a755-afa1-4c97-9582-82fb739c7e6c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:1a:af', 'vm-uuid': '7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.565 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:54:08 np0005539504 NetworkManager[55210]: <info>  [1764399248.5671] manager: (tap38a0a755-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.568 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.573 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.574 187156 INFO os_vif [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af')
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.575 187156 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 29 01:54:08 np0005539504 nova_compute[187152]: 2025-11-29 06:54:08.575 187156 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkxgr55ve',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 29 01:54:09 np0005539504 nova_compute[187152]: 2025-11-29 06:54:09.151 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:10 np0005539504 podman[217788]: 2025-11-29 06:54:10.743296864 +0000 UTC m=+0.078938697 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 01:54:10 np0005539504 nova_compute[187152]: 2025-11-29 06:54:10.792 187156 DEBUG nova.network.neutron [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Port 38a0a755-afa1-4c97-9582-82fb739c7e6c updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 29 01:54:10 np0005539504 nova_compute[187152]: 2025-11-29 06:54:10.810 187156 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkxgr55ve',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 29 01:54:11 np0005539504 NetworkManager[55210]: <info>  [1764399251.1169] manager: (tap38a0a755-af): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Nov 29 01:54:11 np0005539504 kernel: tap38a0a755-af: entered promiscuous mode
Nov 29 01:54:11 np0005539504 nova_compute[187152]: 2025-11-29 06:54:11.125 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:11 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:11Z|00081|binding|INFO|Claiming lport 38a0a755-afa1-4c97-9582-82fb739c7e6c for this additional chassis.
Nov 29 01:54:11 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:11Z|00082|binding|INFO|38a0a755-afa1-4c97-9582-82fb739c7e6c: Claiming fa:16:3e:dd:1a:af 10.100.0.7
Nov 29 01:54:11 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:11Z|00083|binding|INFO|Claiming lport c3ae15da-9b95-4494-b01e-202115261f9d for this additional chassis.
Nov 29 01:54:11 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:11Z|00084|binding|INFO|c3ae15da-9b95-4494-b01e-202115261f9d: Claiming fa:16:3e:02:a9:0c 19.80.0.52
Nov 29 01:54:11 np0005539504 systemd-machined[153423]: New machine qemu-16-instance-0000001b.
Nov 29 01:54:11 np0005539504 systemd-udevd[217825]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:54:11 np0005539504 nova_compute[187152]: 2025-11-29 06:54:11.184 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:11 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:11Z|00085|binding|INFO|Setting lport 38a0a755-afa1-4c97-9582-82fb739c7e6c ovn-installed in OVS
Nov 29 01:54:11 np0005539504 NetworkManager[55210]: <info>  [1764399251.1885] device (tap38a0a755-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:54:11 np0005539504 NetworkManager[55210]: <info>  [1764399251.1897] device (tap38a0a755-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:54:11 np0005539504 nova_compute[187152]: 2025-11-29 06:54:11.187 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:11 np0005539504 nova_compute[187152]: 2025-11-29 06:54:11.191 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:11 np0005539504 systemd[1]: Started Virtual Machine qemu-16-instance-0000001b.
Nov 29 01:54:11 np0005539504 nova_compute[187152]: 2025-11-29 06:54:11.222 187156 INFO nova.virt.libvirt.driver [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Snapshot image upload complete#033[00m
Nov 29 01:54:11 np0005539504 nova_compute[187152]: 2025-11-29 06:54:11.222 187156 INFO nova.compute.manager [None req-34a2962b-bf89-4eaa-a2aa-a5f3fa047804 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Took 3.86 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 01:54:12 np0005539504 nova_compute[187152]: 2025-11-29 06:54:12.047 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399252.0467854, 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:12 np0005539504 nova_compute[187152]: 2025-11-29 06:54:12.048 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] VM Started (Lifecycle Event)#033[00m
Nov 29 01:54:12 np0005539504 nova_compute[187152]: 2025-11-29 06:54:12.071 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:13 np0005539504 nova_compute[187152]: 2025-11-29 06:54:13.563 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:13 np0005539504 nova_compute[187152]: 2025-11-29 06:54:13.774 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399253.7739134, 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:13 np0005539504 nova_compute[187152]: 2025-11-29 06:54:13.777 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:54:13 np0005539504 nova_compute[187152]: 2025-11-29 06:54:13.800 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:13 np0005539504 nova_compute[187152]: 2025-11-29 06:54:13.806 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:13 np0005539504 nova_compute[187152]: 2025-11-29 06:54:13.824 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 29 01:54:14 np0005539504 nova_compute[187152]: 2025-11-29 06:54:14.153 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:14 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:14Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:1a:af 10.100.0.7
Nov 29 01:54:14 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:14Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:1a:af 10.100.0.7
Nov 29 01:54:15 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:15Z|00086|binding|INFO|Claiming lport 38a0a755-afa1-4c97-9582-82fb739c7e6c for this chassis.
Nov 29 01:54:15 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:15Z|00087|binding|INFO|38a0a755-afa1-4c97-9582-82fb739c7e6c: Claiming fa:16:3e:dd:1a:af 10.100.0.7
Nov 29 01:54:15 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:15Z|00088|binding|INFO|Claiming lport c3ae15da-9b95-4494-b01e-202115261f9d for this chassis.
Nov 29 01:54:15 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:15Z|00089|binding|INFO|c3ae15da-9b95-4494-b01e-202115261f9d: Claiming fa:16:3e:02:a9:0c 19.80.0.52
Nov 29 01:54:15 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:15Z|00090|binding|INFO|Setting lport 38a0a755-afa1-4c97-9582-82fb739c7e6c up in Southbound
Nov 29 01:54:15 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:15Z|00091|binding|INFO|Setting lport c3ae15da-9b95-4494-b01e-202115261f9d up in Southbound
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.277 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:1a:af 10.100.0.7'], port_security=['fa:16:3e:dd:1a:af 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2018647846', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2018647846', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '11', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=38a0a755-afa1-4c97-9582-82fb739c7e6c) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.280 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:a9:0c 19.80.0.52'], port_security=['fa:16:3e:02:a9:0c 19.80.0.52'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['38a0a755-afa1-4c97-9582-82fb739c7e6c'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-70072018', 'neutron:cidrs': '19.80.0.52/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-70072018', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=69db7c70-3ac5-4b08-99b1-a77caa10cb9e, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c3ae15da-9b95-4494-b01e-202115261f9d) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.282 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 38a0a755-afa1-4c97-9582-82fb739c7e6c in datapath c691e2c0-bf24-480c-9af6-236639f0492c bound to our chassis#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.284 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c691e2c0-bf24-480c-9af6-236639f0492c#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.303 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c19362dc-c7dc-4595-bd3d-6647470fd41d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.341 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[73ec18f0-6c80-4667-a66c-68fbe5694022]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.345 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd02bb1-2bbd-476a-b220-f95116c18904]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.370 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[54fee3a0-99d2-4881-9d8a-3ba65cf6e023]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.389 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bb4b333d-d387-445a-b323-af59b6f49fa7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 38, 'tx_packets': 5, 'rx_bytes': 2466, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 5, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 38, 'tx_packets': 5, 'rx_bytes': 2466, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 5, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467760, 'reachable_time': 25936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 17, 'inoctets': 1256, 'indelivers': 3, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 17, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1256, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 17, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 3, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217885, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.405 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[416c1587-e276-4d62-b512-453ad145b3b5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc691e2c0-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467770, 'tstamp': 467770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217886, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc691e2c0-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467773, 'tstamp': 467773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217886, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.407 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:15 np0005539504 nova_compute[187152]: 2025-11-29 06:54:15.409 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:15 np0005539504 nova_compute[187152]: 2025-11-29 06:54:15.410 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.411 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc691e2c0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.411 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.412 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc691e2c0-b0, col_values=(('external_ids', {'iface-id': 'a88e36d5-5037-4505-8d26-de14faa22faf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.412 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.414 104164 INFO neutron.agent.ovn.metadata.agent [-] Port c3ae15da-9b95-4494-b01e-202115261f9d in datapath 3e6779cb-7f6f-419d-b2e7-0b18b601b6be bound to our chassis#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.415 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3e6779cb-7f6f-419d-b2e7-0b18b601b6be#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.425 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6e40dd6d-bdce-4790-9abe-2132aca1bb5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.426 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3e6779cb-71 in ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.428 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3e6779cb-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.428 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7a75a9ae-031e-4afa-949c-f6964bba3007]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.429 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cb213b4a-bfbe-4a7f-86ea-c3cbe966aa8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.443 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[6d62bbae-1319-4da3-ac85-eec19aaccc20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.456 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[49d226cd-7a76-4e09-aabd-58d5989c7346]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.479 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6d44f0-2037-4d4c-a918-14925c695e59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.486 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a93d2907-ad6a-48c4-96ac-8c7a926edbc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 NetworkManager[55210]: <info>  [1764399255.4878] manager: (tap3e6779cb-70): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Nov 29 01:54:15 np0005539504 systemd-udevd[217894]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.522 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[abd75ef7-69f1-4d22-b968-994a263ff9ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.529 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[effc67bc-ea72-4e3c-b5aa-f93316c523ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 NetworkManager[55210]: <info>  [1764399255.5515] device (tap3e6779cb-70): carrier: link connected
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.556 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[83f55bb8-9c88-4df3-9f6c-4fdc9045bd98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.570 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6b112980-d206-4892-b3a6-b2d1f3a29070]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e6779cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:bd:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471843, 'reachable_time': 23191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217913, 'error': None, 'target': 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.582 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f7111d64-39fa-46b0-84ea-10e70208f30f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:bd1b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 471843, 'tstamp': 471843}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217914, 'error': None, 'target': 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.598 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[27cf40cd-4864-4ba6-bf3c-eef6d08618b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e6779cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:bd:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471843, 'reachable_time': 23191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217915, 'error': None, 'target': 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.629 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2f3ff695-628a-4aea-914c-620d24e1a16a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.681 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f08acce8-1781-4c2f-b25d-89b64ce6a729]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.682 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e6779cb-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.683 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.683 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e6779cb-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:15 np0005539504 nova_compute[187152]: 2025-11-29 06:54:15.684 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:15 np0005539504 NetworkManager[55210]: <info>  [1764399255.6858] manager: (tap3e6779cb-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Nov 29 01:54:15 np0005539504 kernel: tap3e6779cb-70: entered promiscuous mode
Nov 29 01:54:15 np0005539504 nova_compute[187152]: 2025-11-29 06:54:15.687 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.688 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3e6779cb-70, col_values=(('external_ids', {'iface-id': '933cdbf7-1588-4a26-b171-b3d2ec3cd1a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:15 np0005539504 nova_compute[187152]: 2025-11-29 06:54:15.689 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:15 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:15Z|00092|binding|INFO|Releasing lport 933cdbf7-1588-4a26-b171-b3d2ec3cd1a3 from this chassis (sb_readonly=0)
Nov 29 01:54:15 np0005539504 nova_compute[187152]: 2025-11-29 06:54:15.700 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.701 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e6779cb-7f6f-419d-b2e7-0b18b601b6be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e6779cb-7f6f-419d-b2e7-0b18b601b6be.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.702 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ef17b999-2935-418e-ac61-36dc370c7205]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.703 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-3e6779cb-7f6f-419d-b2e7-0b18b601b6be
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/3e6779cb-7f6f-419d-b2e7-0b18b601b6be.pid.haproxy
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 3e6779cb-7f6f-419d-b2e7-0b18b601b6be
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:54:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:15.703 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'env', 'PROCESS_TAG=haproxy-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3e6779cb-7f6f-419d-b2e7-0b18b601b6be.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:54:16 np0005539504 podman[217948]: 2025-11-29 06:54:16.086866749 +0000 UTC m=+0.050419525 container create ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:54:16 np0005539504 systemd[1]: Started libpod-conmon-ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec.scope.
Nov 29 01:54:16 np0005539504 podman[217948]: 2025-11-29 06:54:16.060914007 +0000 UTC m=+0.024466783 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:54:16 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:54:16 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/068539accbb094a3e805a2a74e96df0c5b039875922357e955718c85c7f65a12/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:54:16 np0005539504 podman[217948]: 2025-11-29 06:54:16.204953102 +0000 UTC m=+0.168505908 container init ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:54:16 np0005539504 podman[217948]: 2025-11-29 06:54:16.212460816 +0000 UTC m=+0.176013592 container start ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:54:16 np0005539504 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[217963]: [NOTICE]   (217967) : New worker (217969) forked
Nov 29 01:54:16 np0005539504 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[217963]: [NOTICE]   (217967) : Loading success.
Nov 29 01:54:16 np0005539504 nova_compute[187152]: 2025-11-29 06:54:16.382 187156 INFO nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Post operation of migration started#033[00m
Nov 29 01:54:16 np0005539504 nova_compute[187152]: 2025-11-29 06:54:16.820 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:54:16 np0005539504 nova_compute[187152]: 2025-11-29 06:54:16.820 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquired lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:54:16 np0005539504 nova_compute[187152]: 2025-11-29 06:54:16.821 187156 DEBUG nova.network.neutron [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:54:17 np0005539504 podman[217978]: 2025-11-29 06:54:17.713644982 +0000 UTC m=+0.059581073 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:54:18 np0005539504 nova_compute[187152]: 2025-11-29 06:54:18.061 187156 DEBUG nova.network.neutron [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Updating instance_info_cache with network_info: [{"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:18 np0005539504 nova_compute[187152]: 2025-11-29 06:54:18.093 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Releasing lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:54:18 np0005539504 nova_compute[187152]: 2025-11-29 06:54:18.132 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:18 np0005539504 nova_compute[187152]: 2025-11-29 06:54:18.133 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:18 np0005539504 nova_compute[187152]: 2025-11-29 06:54:18.133 187156 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:18 np0005539504 nova_compute[187152]: 2025-11-29 06:54:18.138 187156 INFO nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 29 01:54:18 np0005539504 virtqemud[186569]: Domain id=16 name='instance-0000001b' uuid=7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 is tainted: custom-monitor
Nov 29 01:54:18 np0005539504 nova_compute[187152]: 2025-11-29 06:54:18.565 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:19 np0005539504 nova_compute[187152]: 2025-11-29 06:54:19.147 187156 INFO nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 29 01:54:19 np0005539504 nova_compute[187152]: 2025-11-29 06:54:19.156 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:19 np0005539504 nova_compute[187152]: 2025-11-29 06:54:19.824 187156 DEBUG nova.compute.manager [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:19 np0005539504 nova_compute[187152]: 2025-11-29 06:54:19.904 187156 INFO nova.compute.manager [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] instance snapshotting#033[00m
Nov 29 01:54:20 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.154 187156 INFO nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 29 01:54:20 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.162 187156 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:20 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.165 187156 INFO nova.virt.libvirt.driver [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Beginning live snapshot process#033[00m
Nov 29 01:54:20 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.199 187156 DEBUG nova.objects.instance [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:54:20 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 01:54:20 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.428 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:20 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.488 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json -f qcow2" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:20 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.490 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:20 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.551 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:20 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.564 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:21 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.621 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:21 np0005539504 nova_compute[187152]: 2025-11-29 06:54:20.622 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpdoje7mpm/ff9e085307414e20855694d4b525c3fa.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:21 np0005539504 nova_compute[187152]: 2025-11-29 06:54:21.260 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpdoje7mpm/ff9e085307414e20855694d4b525c3fa.delta 1073741824" returned: 0 in 0.638s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:21 np0005539504 nova_compute[187152]: 2025-11-29 06:54:21.261 187156 INFO nova.virt.libvirt.driver [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 29 01:54:21 np0005539504 nova_compute[187152]: 2025-11-29 06:54:21.344 187156 DEBUG nova.virt.libvirt.guest [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] COPY block job progress, current cursor: 0 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 01:54:21 np0005539504 podman[218010]: 2025-11-29 06:54:21.401187294 +0000 UTC m=+0.093937362 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:54:21 np0005539504 podman[218011]: 2025-11-29 06:54:21.409702795 +0000 UTC m=+0.101135317 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git)
Nov 29 01:54:21 np0005539504 nova_compute[187152]: 2025-11-29 06:54:21.848 187156 DEBUG nova.virt.libvirt.guest [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] COPY block job progress, current cursor: 53215232 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 01:54:22 np0005539504 nova_compute[187152]: 2025-11-29 06:54:22.352 187156 DEBUG nova.virt.libvirt.guest [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] COPY block job progress, current cursor: 75366400 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 01:54:22 np0005539504 nova_compute[187152]: 2025-11-29 06:54:22.356 187156 INFO nova.virt.libvirt.driver [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 29 01:54:22 np0005539504 nova_compute[187152]: 2025-11-29 06:54:22.406 187156 DEBUG nova.privsep.utils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:54:22 np0005539504 nova_compute[187152]: 2025-11-29 06:54:22.407 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpdoje7mpm/ff9e085307414e20855694d4b525c3fa.delta /var/lib/nova/instances/snapshots/tmpdoje7mpm/ff9e085307414e20855694d4b525c3fa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:22 np0005539504 nova_compute[187152]: 2025-11-29 06:54:22.851 187156 DEBUG oslo_concurrency.processutils [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpdoje7mpm/ff9e085307414e20855694d4b525c3fa.delta /var/lib/nova/instances/snapshots/tmpdoje7mpm/ff9e085307414e20855694d4b525c3fa" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:22 np0005539504 nova_compute[187152]: 2025-11-29 06:54:22.859 187156 INFO nova.virt.libvirt.driver [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Snapshot extracted, beginning image upload#033[00m
Nov 29 01:54:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:22.909 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:22.910 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:22.911 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:23 np0005539504 nova_compute[187152]: 2025-11-29 06:54:23.567 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.177 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:24.313 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.314 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:24.315 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.811 187156 DEBUG oslo_concurrency.lockutils [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.811 187156 DEBUG oslo_concurrency.lockutils [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.812 187156 DEBUG oslo_concurrency.lockutils [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.812 187156 DEBUG oslo_concurrency.lockutils [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.812 187156 DEBUG oslo_concurrency.lockutils [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.829 187156 INFO nova.compute.manager [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Terminating instance#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.872 187156 DEBUG nova.compute.manager [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:54:24 np0005539504 kernel: tap38a0a755-af (unregistering): left promiscuous mode
Nov 29 01:54:24 np0005539504 NetworkManager[55210]: <info>  [1764399264.8967] device (tap38a0a755-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.910 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:24 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:24Z|00093|binding|INFO|Releasing lport 38a0a755-afa1-4c97-9582-82fb739c7e6c from this chassis (sb_readonly=0)
Nov 29 01:54:24 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:24Z|00094|binding|INFO|Setting lport 38a0a755-afa1-4c97-9582-82fb739c7e6c down in Southbound
Nov 29 01:54:24 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:24Z|00095|binding|INFO|Releasing lport c3ae15da-9b95-4494-b01e-202115261f9d from this chassis (sb_readonly=0)
Nov 29 01:54:24 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:24Z|00096|binding|INFO|Setting lport c3ae15da-9b95-4494-b01e-202115261f9d down in Southbound
Nov 29 01:54:24 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:24Z|00097|binding|INFO|Removing iface tap38a0a755-af ovn-installed in OVS
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.913 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:24 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:24Z|00098|binding|INFO|Releasing lport 933cdbf7-1588-4a26-b171-b3d2ec3cd1a3 from this chassis (sb_readonly=0)
Nov 29 01:54:24 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:24Z|00099|binding|INFO|Releasing lport a88e36d5-5037-4505-8d26-de14faa22faf from this chassis (sb_readonly=0)
Nov 29 01:54:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:24.921 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:1a:af 10.100.0.7'], port_security=['fa:16:3e:dd:1a:af 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2018647846', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2018647846', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '11', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=38a0a755-afa1-4c97-9582-82fb739c7e6c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:24.929 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:a9:0c 19.80.0.52'], port_security=['fa:16:3e:02:a9:0c 19.80.0.52'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['38a0a755-afa1-4c97-9582-82fb739c7e6c'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-70072018', 'neutron:cidrs': '19.80.0.52/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-70072018', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=69db7c70-3ac5-4b08-99b1-a77caa10cb9e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c3ae15da-9b95-4494-b01e-202115261f9d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:24.931 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 38a0a755-afa1-4c97-9582-82fb739c7e6c in datapath c691e2c0-bf24-480c-9af6-236639f0492c unbound from our chassis#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.933 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:24.934 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c691e2c0-bf24-480c-9af6-236639f0492c#033[00m
Nov 29 01:54:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:24.951 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ece6098b-5af7-4944-9f42-77f91c824376]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:24 np0005539504 nova_compute[187152]: 2025-11-29 06:54:24.953 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:24 np0005539504 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Nov 29 01:54:24 np0005539504 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001b.scope: Consumed 2.903s CPU time.
Nov 29 01:54:24 np0005539504 systemd-machined[153423]: Machine qemu-16-instance-0000001b terminated.
Nov 29 01:54:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:24.980 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f7ff92-3958-4638-afad-556c03df8d74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:24.984 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0aabe8f7-0bff-487b-930e-5cf3c09c9e2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.009 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[39ab74a2-681b-4cf7-8408-0484d0d71725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.026 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[54a61073-bc08-492a-8e37-db4335039d67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 51, 'tx_packets': 7, 'rx_bytes': 3012, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 5, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 51, 'tx_packets': 7, 'rx_bytes': 3012, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 5, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467760, 'reachable_time': 25936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 17, 'inoctets': 1256, 'indelivers': 3, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 17, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1256, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 17, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 3, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218083, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.039 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b95a0981-6e75-4988-be2e-b028a6774921]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc691e2c0-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467770, 'tstamp': 467770}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218084, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc691e2c0-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467773, 'tstamp': 467773}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218084, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.040 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.042 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.046 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.046 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc691e2c0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.047 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.047 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc691e2c0-b0, col_values=(('external_ids', {'iface-id': 'a88e36d5-5037-4505-8d26-de14faa22faf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.048 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.049 104164 INFO neutron.agent.ovn.metadata.agent [-] Port c3ae15da-9b95-4494-b01e-202115261f9d in datapath 3e6779cb-7f6f-419d-b2e7-0b18b601b6be unbound from our chassis#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.050 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e6779cb-7f6f-419d-b2e7-0b18b601b6be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.051 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6c376a51-3a1e-4159-9ea4-307c965f9064]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.051 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be namespace which is not needed anymore#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.090 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.094 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.127 187156 INFO nova.virt.libvirt.driver [-] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Instance destroyed successfully.#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.128 187156 DEBUG nova.objects.instance [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lazy-loading 'resources' on Instance uuid 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.144 187156 DEBUG nova.virt.libvirt.vif [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:53:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1120356955',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1120356955',id=27,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:54:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-x246813w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:54:20Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.144 187156 DEBUG nova.network.os_vif_util [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converting VIF {"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.145 187156 DEBUG nova.network.os_vif_util [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.145 187156 DEBUG os_vif [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.148 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.148 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38a0a755-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.149 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.151 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.153 187156 INFO os_vif [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af')#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.153 187156 INFO nova.virt.libvirt.driver [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Deleting instance files /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6_del#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.154 187156 INFO nova.virt.libvirt.driver [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Deletion of /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6_del complete#033[00m
Nov 29 01:54:25 np0005539504 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[217963]: [NOTICE]   (217967) : haproxy version is 2.8.14-c23fe91
Nov 29 01:54:25 np0005539504 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[217963]: [NOTICE]   (217967) : path to executable is /usr/sbin/haproxy
Nov 29 01:54:25 np0005539504 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[217963]: [WARNING]  (217967) : Exiting Master process...
Nov 29 01:54:25 np0005539504 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[217963]: [WARNING]  (217967) : Exiting Master process...
Nov 29 01:54:25 np0005539504 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[217963]: [ALERT]    (217967) : Current worker (217969) exited with code 143 (Terminated)
Nov 29 01:54:25 np0005539504 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[217963]: [WARNING]  (217967) : All workers exited. Exiting... (0)
Nov 29 01:54:25 np0005539504 systemd[1]: libpod-ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec.scope: Deactivated successfully.
Nov 29 01:54:25 np0005539504 podman[218119]: 2025-11-29 06:54:25.179018488 +0000 UTC m=+0.043194359 container died ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:54:25 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec-userdata-shm.mount: Deactivated successfully.
Nov 29 01:54:25 np0005539504 systemd[1]: var-lib-containers-storage-overlay-068539accbb094a3e805a2a74e96df0c5b039875922357e955718c85c7f65a12-merged.mount: Deactivated successfully.
Nov 29 01:54:25 np0005539504 podman[218119]: 2025-11-29 06:54:25.220634323 +0000 UTC m=+0.084810204 container cleanup ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:54:25 np0005539504 systemd[1]: libpod-conmon-ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec.scope: Deactivated successfully.
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.230 187156 INFO nova.compute.manager [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.230 187156 DEBUG oslo.service.loopingcall [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.231 187156 DEBUG nova.compute.manager [-] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.231 187156 DEBUG nova.network.neutron [-] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:54:25 np0005539504 podman[218148]: 2025-11-29 06:54:25.28558785 +0000 UTC m=+0.042168741 container remove ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.289 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c697d07a-4df4-4565-b623-a202e48c6484]: (4, ('Sat Nov 29 06:54:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be (ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec)\nceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec\nSat Nov 29 06:54:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be (ceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec)\nceb393565dc7c85ae1f48195a5368f935a5b7f8cfacb6aa3d210eccd41c055ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.291 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1df48a42-9d9a-4585-9d23-174f451b9568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.292 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e6779cb-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.338 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 kernel: tap3e6779cb-70: left promiscuous mode
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.350 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.351 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.353 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[92d28b2b-7658-4d70-a9aa-26b10f5592cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.373 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f09121cc-7910-4f4a-a56d-15ae563aba38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.374 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c3beaa71-f155-4fd0-b266-73864ec4f36f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.388 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0cfca56c-1433-4be2-b67b-71eb8c1c77d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471835, 'reachable_time': 37592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218163, 'error': None, 'target': 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.391 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:54:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:25.391 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[37aeed63-fd13-4542-8c52-57908504007f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:25 np0005539504 systemd[1]: run-netns-ovnmeta\x2d3e6779cb\x2d7f6f\x2d419d\x2db2e7\x2d0b18b601b6be.mount: Deactivated successfully.
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.454 187156 DEBUG nova.compute.manager [req-20ffeb34-7886-44bd-88a7-e92255f68789 req-17581ce9-dbd9-4ce5-b5c2-fa2349722cce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-unplugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.455 187156 DEBUG oslo_concurrency.lockutils [req-20ffeb34-7886-44bd-88a7-e92255f68789 req-17581ce9-dbd9-4ce5-b5c2-fa2349722cce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.455 187156 DEBUG oslo_concurrency.lockutils [req-20ffeb34-7886-44bd-88a7-e92255f68789 req-17581ce9-dbd9-4ce5-b5c2-fa2349722cce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.455 187156 DEBUG oslo_concurrency.lockutils [req-20ffeb34-7886-44bd-88a7-e92255f68789 req-17581ce9-dbd9-4ce5-b5c2-fa2349722cce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.455 187156 DEBUG nova.compute.manager [req-20ffeb34-7886-44bd-88a7-e92255f68789 req-17581ce9-dbd9-4ce5-b5c2-fa2349722cce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] No waiting events found dispatching network-vif-unplugged-38a0a755-afa1-4c97-9582-82fb739c7e6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:25 np0005539504 nova_compute[187152]: 2025-11-29 06:54:25.456 187156 DEBUG nova.compute.manager [req-20ffeb34-7886-44bd-88a7-e92255f68789 req-17581ce9-dbd9-4ce5-b5c2-fa2349722cce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-unplugged-38a0a755-afa1-4c97-9582-82fb739c7e6c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:54:26 np0005539504 nova_compute[187152]: 2025-11-29 06:54:26.363 187156 INFO nova.virt.libvirt.driver [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Snapshot image upload complete#033[00m
Nov 29 01:54:26 np0005539504 nova_compute[187152]: 2025-11-29 06:54:26.364 187156 INFO nova.compute.manager [None req-cdff4a29-a6f4-4bf7-a0a4-879f087620c8 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Took 6.45 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 01:54:26 np0005539504 nova_compute[187152]: 2025-11-29 06:54:26.911 187156 DEBUG nova.network.neutron [-] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:26 np0005539504 nova_compute[187152]: 2025-11-29 06:54:26.945 187156 INFO nova.compute.manager [-] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Took 1.71 seconds to deallocate network for instance.#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.069 187156 DEBUG oslo_concurrency.lockutils [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.069 187156 DEBUG oslo_concurrency.lockutils [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.074 187156 DEBUG oslo_concurrency.lockutils [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.102 187156 INFO nova.scheduler.client.report [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Deleted allocations for instance 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.173 187156 DEBUG oslo_concurrency.lockutils [None req-389bcec5-2b4a-4389-a8d8-8fb9e687e62c ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.526 187156 DEBUG nova.compute.manager [req-8ea1ebe7-b594-403e-a8c1-2c77fb8272e7 req-46a2529d-3020-40fc-940c-6e2e7f6a9ca2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.526 187156 DEBUG oslo_concurrency.lockutils [req-8ea1ebe7-b594-403e-a8c1-2c77fb8272e7 req-46a2529d-3020-40fc-940c-6e2e7f6a9ca2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.527 187156 DEBUG oslo_concurrency.lockutils [req-8ea1ebe7-b594-403e-a8c1-2c77fb8272e7 req-46a2529d-3020-40fc-940c-6e2e7f6a9ca2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.528 187156 DEBUG oslo_concurrency.lockutils [req-8ea1ebe7-b594-403e-a8c1-2c77fb8272e7 req-46a2529d-3020-40fc-940c-6e2e7f6a9ca2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.528 187156 DEBUG nova.compute.manager [req-8ea1ebe7-b594-403e-a8c1-2c77fb8272e7 req-46a2529d-3020-40fc-940c-6e2e7f6a9ca2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] No waiting events found dispatching network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:27 np0005539504 nova_compute[187152]: 2025-11-29 06:54:27.528 187156 WARNING nova.compute.manager [req-8ea1ebe7-b594-403e-a8c1-2c77fb8272e7 req-46a2529d-3020-40fc-940c-6e2e7f6a9ca2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received unexpected event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:54:29 np0005539504 nova_compute[187152]: 2025-11-29 06:54:29.180 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:30 np0005539504 nova_compute[187152]: 2025-11-29 06:54:30.167 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:30 np0005539504 podman[218165]: 2025-11-29 06:54:30.70879708 +0000 UTC m=+0.053214810 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:54:30 np0005539504 podman[218166]: 2025-11-29 06:54:30.734291079 +0000 UTC m=+0.076282843 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.687 187156 DEBUG oslo_concurrency.lockutils [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.687 187156 DEBUG oslo_concurrency.lockutils [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.688 187156 DEBUG oslo_concurrency.lockutils [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.688 187156 DEBUG oslo_concurrency.lockutils [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.688 187156 DEBUG oslo_concurrency.lockutils [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.699 187156 INFO nova.compute.manager [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Terminating instance#033[00m
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.710 187156 DEBUG nova.compute.manager [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:54:31 np0005539504 kernel: tap1ff22547-58 (unregistering): left promiscuous mode
Nov 29 01:54:31 np0005539504 NetworkManager[55210]: <info>  [1764399271.7364] device (tap1ff22547-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.743 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:31 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:31Z|00100|binding|INFO|Releasing lport 1ff22547-5892-4360-8abe-429ea2f212ee from this chassis (sb_readonly=0)
Nov 29 01:54:31 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:31Z|00101|binding|INFO|Setting lport 1ff22547-5892-4360-8abe-429ea2f212ee down in Southbound
Nov 29 01:54:31 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:31Z|00102|binding|INFO|Removing iface tap1ff22547-58 ovn-installed in OVS
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.747 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:31.754 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:a7:a1 10.100.0.5'], port_security=['fa:16:3e:56:a7:a1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2e380200-8276-4470-965f-31baa0bfd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '22', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=1ff22547-5892-4360-8abe-429ea2f212ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:31.755 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 1ff22547-5892-4360-8abe-429ea2f212ee in datapath c691e2c0-bf24-480c-9af6-236639f0492c unbound from our chassis#033[00m
Nov 29 01:54:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:31.756 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c691e2c0-bf24-480c-9af6-236639f0492c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:54:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:31.758 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[56ffc021-b967-493b-b64f-782e6cf1d558]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:31.758 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c namespace which is not needed anymore#033[00m
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.759 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:31 np0005539504 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000017.scope: Deactivated successfully.
Nov 29 01:54:31 np0005539504 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000017.scope: Consumed 4.475s CPU time.
Nov 29 01:54:31 np0005539504 systemd-machined[153423]: Machine qemu-14-instance-00000017 terminated.
Nov 29 01:54:31 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217515]: [NOTICE]   (217519) : haproxy version is 2.8.14-c23fe91
Nov 29 01:54:31 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217515]: [NOTICE]   (217519) : path to executable is /usr/sbin/haproxy
Nov 29 01:54:31 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217515]: [WARNING]  (217519) : Exiting Master process...
Nov 29 01:54:31 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217515]: [ALERT]    (217519) : Current worker (217521) exited with code 143 (Terminated)
Nov 29 01:54:31 np0005539504 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217515]: [WARNING]  (217519) : All workers exited. Exiting... (0)
Nov 29 01:54:31 np0005539504 systemd[1]: libpod-32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76.scope: Deactivated successfully.
Nov 29 01:54:31 np0005539504 podman[218237]: 2025-11-29 06:54:31.93278372 +0000 UTC m=+0.064758972 container died 32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:54:31 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76-userdata-shm.mount: Deactivated successfully.
Nov 29 01:54:31 np0005539504 systemd[1]: var-lib-containers-storage-overlay-ad5ab4155148d837a58dda34f70edbc45147108a88c310dbfcd769d4a8cd4ff7-merged.mount: Deactivated successfully.
Nov 29 01:54:31 np0005539504 podman[218237]: 2025-11-29 06:54:31.98379313 +0000 UTC m=+0.115768412 container cleanup 32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.988 187156 INFO nova.virt.libvirt.driver [-] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Instance destroyed successfully.#033[00m
Nov 29 01:54:31 np0005539504 nova_compute[187152]: 2025-11-29 06:54:31.990 187156 DEBUG nova.objects.instance [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lazy-loading 'resources' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:31 np0005539504 systemd[1]: libpod-conmon-32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76.scope: Deactivated successfully.
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.008 187156 DEBUG nova.virt.libvirt.vif [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:52:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1351543550',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1351543550',id=23,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-68mzhrqj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:53:39Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=2e380200-8276-4470-965f-31baa0bfd760,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.009 187156 DEBUG nova.network.os_vif_util [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converting VIF {"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.010 187156 DEBUG nova.network.os_vif_util [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.011 187156 DEBUG os_vif [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.013 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.013 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ff22547-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.015 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.018 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.021 187156 INFO os_vif [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58')#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.022 187156 INFO nova.virt.libvirt.driver [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Deleting instance files /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760_del#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.022 187156 INFO nova.virt.libvirt.driver [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Deletion of /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760_del complete#033[00m
Nov 29 01:54:32 np0005539504 podman[218281]: 2025-11-29 06:54:32.053285849 +0000 UTC m=+0.045764708 container remove 32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 01:54:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:32.060 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b768d4fd-2bf1-4322-bb9d-ee2a52460c9f]: (4, ('Sat Nov 29 06:54:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c (32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76)\n32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76\nSat Nov 29 06:54:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c (32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76)\n32e6162af8c0971884cca0fdd2bca2e35bc4be21b496358c026824378c5c8b76\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:32.062 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[90e03099-4b71-4182-b275-3b7a38ac6433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:32.063 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.065 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:32 np0005539504 kernel: tapc691e2c0-b0: left promiscuous mode
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:32.083 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c305d446-defa-4c82-9846-78033940211a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:32.100 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[df292418-ba13-435e-b64a-397d37177179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.100 187156 INFO nova.compute.manager [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.101 187156 DEBUG oslo.service.loopingcall [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:54:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:32.101 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[573ee60f-2515-46fb-bbdb-62324f5625aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.102 187156 DEBUG nova.compute.manager [-] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.103 187156 DEBUG nova.network.neutron [-] [instance: 2e380200-8276-4470-965f-31baa0bfd760] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:54:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:32.120 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b7eb637f-3c16-4e91-8ffe-3f3f679005d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467752, 'reachable_time': 40998, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218296, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:32.124 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:54:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:32.124 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[77bf3add-d8b8-4329-8444-9d695a735921]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:32 np0005539504 systemd[1]: run-netns-ovnmeta\x2dc691e2c0\x2dbf24\x2d480c\x2d9af6\x2d236639f0492c.mount: Deactivated successfully.
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.422 187156 DEBUG nova.compute.manager [req-5d8367c6-99f1-488d-a072-47e42bc8fd2b req-16ca4399-9c61-4957-a6cc-24ed663a15e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.422 187156 DEBUG oslo_concurrency.lockutils [req-5d8367c6-99f1-488d-a072-47e42bc8fd2b req-16ca4399-9c61-4957-a6cc-24ed663a15e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.423 187156 DEBUG oslo_concurrency.lockutils [req-5d8367c6-99f1-488d-a072-47e42bc8fd2b req-16ca4399-9c61-4957-a6cc-24ed663a15e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.423 187156 DEBUG oslo_concurrency.lockutils [req-5d8367c6-99f1-488d-a072-47e42bc8fd2b req-16ca4399-9c61-4957-a6cc-24ed663a15e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.423 187156 DEBUG nova.compute.manager [req-5d8367c6-99f1-488d-a072-47e42bc8fd2b req-16ca4399-9c61-4957-a6cc-24ed663a15e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:32 np0005539504 nova_compute[187152]: 2025-11-29 06:54:32.423 187156 DEBUG nova.compute.manager [req-5d8367c6-99f1-488d-a072-47e42bc8fd2b req-16ca4399-9c61-4957-a6cc-24ed663a15e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.209 187156 DEBUG nova.network.neutron [-] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.232 187156 INFO nova.compute.manager [-] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Took 1.13 seconds to deallocate network for instance.#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.279 187156 DEBUG nova.compute.manager [req-7cc8754e-34a3-40b7-b94e-ace4017eefc7 req-e6f974c4-918a-48a0-a74a-710eb99ace15 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-deleted-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.304 187156 DEBUG oslo_concurrency.lockutils [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.304 187156 DEBUG oslo_concurrency.lockutils [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:33.316 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.431 187156 DEBUG nova.compute.provider_tree [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.449 187156 DEBUG nova.scheduler.client.report [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.472 187156 DEBUG oslo_concurrency.lockutils [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.498 187156 INFO nova.scheduler.client.report [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Deleted allocations for instance 2e380200-8276-4470-965f-31baa0bfd760#033[00m
Nov 29 01:54:33 np0005539504 nova_compute[187152]: 2025-11-29 06:54:33.585 187156 DEBUG oslo_concurrency.lockutils [None req-978e8e1f-026f-4dff-abac-3c736b4746e5 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:34 np0005539504 nova_compute[187152]: 2025-11-29 06:54:34.182 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:34 np0005539504 nova_compute[187152]: 2025-11-29 06:54:34.628 187156 DEBUG nova.compute.manager [req-d2210e6f-419e-4651-bbcb-6362b81ee49a req-fdb9820c-6f1f-4e82-b5b9-0c8f42a12d97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:34 np0005539504 nova_compute[187152]: 2025-11-29 06:54:34.629 187156 DEBUG oslo_concurrency.lockutils [req-d2210e6f-419e-4651-bbcb-6362b81ee49a req-fdb9820c-6f1f-4e82-b5b9-0c8f42a12d97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:34 np0005539504 nova_compute[187152]: 2025-11-29 06:54:34.629 187156 DEBUG oslo_concurrency.lockutils [req-d2210e6f-419e-4651-bbcb-6362b81ee49a req-fdb9820c-6f1f-4e82-b5b9-0c8f42a12d97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:34 np0005539504 nova_compute[187152]: 2025-11-29 06:54:34.629 187156 DEBUG oslo_concurrency.lockutils [req-d2210e6f-419e-4651-bbcb-6362b81ee49a req-fdb9820c-6f1f-4e82-b5b9-0c8f42a12d97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:34 np0005539504 nova_compute[187152]: 2025-11-29 06:54:34.630 187156 DEBUG nova.compute.manager [req-d2210e6f-419e-4651-bbcb-6362b81ee49a req-fdb9820c-6f1f-4e82-b5b9-0c8f42a12d97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:34 np0005539504 nova_compute[187152]: 2025-11-29 06:54:34.630 187156 WARNING nova.compute.manager [req-d2210e6f-419e-4651-bbcb-6362b81ee49a req-fdb9820c-6f1f-4e82-b5b9-0c8f42a12d97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:54:35 np0005539504 podman[218297]: 2025-11-29 06:54:35.756209946 +0000 UTC m=+0.093328356 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.249 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "b108f341-92ea-4b22-91ee-0d1001229908" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.249 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.268 187156 DEBUG nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.534 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.535 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.547 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.548 187156 INFO nova.compute.claims [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.725 187156 DEBUG nova.compute.provider_tree [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.793 187156 DEBUG nova.scheduler.client.report [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.827 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.829 187156 DEBUG nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.945 187156 DEBUG nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.945 187156 DEBUG nova.network.neutron [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:54:36 np0005539504 nova_compute[187152]: 2025-11-29 06:54:36.962 187156 INFO nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.011 187156 DEBUG nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.016 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.130 187156 DEBUG nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.132 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.133 187156 INFO nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Creating image(s)#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.134 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "/var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.135 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.136 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.136 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "3fec8ad6044bc82e00912202f90e71c49ea3f925" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.138 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "3fec8ad6044bc82e00912202f90e71c49ea3f925" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.191 187156 DEBUG nova.policy [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:54:37 np0005539504 nova_compute[187152]: 2025-11-29 06:54:37.703 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.062 187156 DEBUG nova.network.neutron [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Successfully created port: 60cb0054-846e-4b01-967b-f0041958e737 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.325 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.408 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925.part --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.411 187156 DEBUG nova.virt.images [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] 087ac09b-91ea-4cd3-b62e-842ca8fac86a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.413 187156 DEBUG nova.privsep.utils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.413 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925.part /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.574 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925.part /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925.converted" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.580 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.648 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925.converted --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.649 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "3fec8ad6044bc82e00912202f90e71c49ea3f925" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.663 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.731 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.732 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "3fec8ad6044bc82e00912202f90e71c49ea3f925" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.733 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "3fec8ad6044bc82e00912202f90e71c49ea3f925" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.743 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.800 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.801 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925,backing_fmt=raw /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:38 np0005539504 nova_compute[187152]: 2025-11-29 06:54:38.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.184 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.268 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925,backing_fmt=raw /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk 1073741824" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.270 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "3fec8ad6044bc82e00912202f90e71c49ea3f925" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.271 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.303 187156 DEBUG nova.network.neutron [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Successfully updated port: 60cb0054-846e-4b01-967b-f0041958e737 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.320 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "refresh_cache-b108f341-92ea-4b22-91ee-0d1001229908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.321 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquired lock "refresh_cache-b108f341-92ea-4b22-91ee-0d1001229908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.321 187156 DEBUG nova.network.neutron [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.337 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.338 187156 DEBUG nova.objects.instance [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'migration_context' on Instance uuid b108f341-92ea-4b22-91ee-0d1001229908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.353 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.353 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Ensure instance console log exists: /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.354 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.354 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.355 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.406 187156 DEBUG nova.compute.manager [req-13dd5048-a54f-49da-8220-f58729477133 req-cfdb1b2f-5cc5-4e76-953e-f0b65969dc80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Received event network-changed-60cb0054-846e-4b01-967b-f0041958e737 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.407 187156 DEBUG nova.compute.manager [req-13dd5048-a54f-49da-8220-f58729477133 req-cfdb1b2f-5cc5-4e76-953e-f0b65969dc80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Refreshing instance network info cache due to event network-changed-60cb0054-846e-4b01-967b-f0041958e737. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.407 187156 DEBUG oslo_concurrency.lockutils [req-13dd5048-a54f-49da-8220-f58729477133 req-cfdb1b2f-5cc5-4e76-953e-f0b65969dc80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b108f341-92ea-4b22-91ee-0d1001229908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.947 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "2213bf07-673d-465e-aa69-032b0bdc9ff2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.948 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2213bf07-673d-465e-aa69-032b0bdc9ff2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.948 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "2213bf07-673d-465e-aa69-032b0bdc9ff2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.948 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2213bf07-673d-465e-aa69-032b0bdc9ff2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.948 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2213bf07-673d-465e-aa69-032b0bdc9ff2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.962 187156 DEBUG nova.network.neutron [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.966 187156 INFO nova.compute.manager [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Terminating instance#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.978 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "refresh_cache-2213bf07-673d-465e-aa69-032b0bdc9ff2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.979 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquired lock "refresh_cache-2213bf07-673d-465e-aa69-032b0bdc9ff2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:54:39 np0005539504 nova_compute[187152]: 2025-11-29 06:54:39.979 187156 DEBUG nova.network.neutron [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:54:40 np0005539504 nova_compute[187152]: 2025-11-29 06:54:40.126 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399265.1258712, 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:40 np0005539504 nova_compute[187152]: 2025-11-29 06:54:40.127 187156 INFO nova.compute.manager [-] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:54:40 np0005539504 nova_compute[187152]: 2025-11-29 06:54:40.156 187156 DEBUG nova.compute.manager [None req-fc7735b8-5d7e-4373-b0fb-b067afb5bf6f - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:40 np0005539504 nova_compute[187152]: 2025-11-29 06:54:40.211 187156 DEBUG nova.network.neutron [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:54:40 np0005539504 nova_compute[187152]: 2025-11-29 06:54:40.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.191 187156 DEBUG nova.network.neutron [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.208 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Releasing lock "refresh_cache-2213bf07-673d-465e-aa69-032b0bdc9ff2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.209 187156 DEBUG nova.compute.manager [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:54:41 np0005539504 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Nov 29 01:54:41 np0005539504 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001c.scope: Consumed 15.191s CPU time.
Nov 29 01:54:41 np0005539504 systemd-machined[153423]: Machine qemu-15-instance-0000001c terminated.
Nov 29 01:54:41 np0005539504 podman[218344]: 2025-11-29 06:54:41.370714888 +0000 UTC m=+0.080647232 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.477 187156 INFO nova.virt.libvirt.driver [-] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Instance destroyed successfully.#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.478 187156 DEBUG nova.objects.instance [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lazy-loading 'resources' on Instance uuid 2213bf07-673d-465e-aa69-032b0bdc9ff2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.494 187156 INFO nova.virt.libvirt.driver [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Deleting instance files /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2_del#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.495 187156 INFO nova.virt.libvirt.driver [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Deletion of /var/lib/nova/instances/2213bf07-673d-465e-aa69-032b0bdc9ff2_del complete#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.598 187156 DEBUG nova.network.neutron [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Updating instance_info_cache with network_info: [{"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.604 187156 INFO nova.compute.manager [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.605 187156 DEBUG oslo.service.loopingcall [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.605 187156 DEBUG nova.compute.manager [-] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.605 187156 DEBUG nova.network.neutron [-] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.618 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Releasing lock "refresh_cache-b108f341-92ea-4b22-91ee-0d1001229908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.619 187156 DEBUG nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Instance network_info: |[{"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.620 187156 DEBUG oslo_concurrency.lockutils [req-13dd5048-a54f-49da-8220-f58729477133 req-cfdb1b2f-5cc5-4e76-953e-f0b65969dc80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b108f341-92ea-4b22-91ee-0d1001229908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.621 187156 DEBUG nova.network.neutron [req-13dd5048-a54f-49da-8220-f58729477133 req-cfdb1b2f-5cc5-4e76-953e-f0b65969dc80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Refreshing network info cache for port 60cb0054-846e-4b01-967b-f0041958e737 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.627 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Start _get_guest_xml network_info=[{"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-11-29T06:54:26Z,direct_url=<?>,disk_format='qcow2',id=087ac09b-91ea-4cd3-b62e-842ca8fac86a,min_disk=1,min_ram=0,name='tempest-test-snap-83112434',owner='78f8ba841bbe4fdcb9d9e2237d97bf73',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-11-29T06:54:31Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '087ac09b-91ea-4cd3-b62e-842ca8fac86a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.637 187156 WARNING nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.650 187156 DEBUG nova.virt.libvirt.host [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.652 187156 DEBUG nova.virt.libvirt.host [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.657 187156 DEBUG nova.virt.libvirt.host [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.657 187156 DEBUG nova.virt.libvirt.host [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.660 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.660 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-11-29T06:54:26Z,direct_url=<?>,disk_format='qcow2',id=087ac09b-91ea-4cd3-b62e-842ca8fac86a,min_disk=1,min_ram=0,name='tempest-test-snap-83112434',owner='78f8ba841bbe4fdcb9d9e2237d97bf73',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-11-29T06:54:31Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.660 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.661 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.661 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.661 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.662 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.662 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.662 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.662 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.663 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.663 187156 DEBUG nova.virt.hardware [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.668 187156 DEBUG nova.virt.libvirt.vif [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-776698232',display_name='tempest-ImagesTestJSON-server-776698232',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-776698232',id=32,image_ref='087ac09b-91ea-4cd3-b62e-842ca8fac86a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-10oa4hya',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='ca75cc77-948e-4e28-a5db-b95961a337a5',image_min_disk='1',image_min_ram='0',image_owner_id='78f8ba841bbe4fdcb9d9e2237d97bf73',image_owner_project_name='tempest-ImagesTestJSON-1674785298',image_owner_user_name='tempest-ImagesTestJSON-1674785298-project-member',image_user_id='315be492c2ce4b9f8af2898e6794a256',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:54:37Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=b108f341-92ea-4b22-91ee-0d1001229908,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.668 187156 DEBUG nova.network.os_vif_util [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.669 187156 DEBUG nova.network.os_vif_util [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:bd:1a,bridge_name='br-int',has_traffic_filtering=True,id=60cb0054-846e-4b01-967b-f0041958e737,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cb0054-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.670 187156 DEBUG nova.objects.instance [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'pci_devices' on Instance uuid b108f341-92ea-4b22-91ee-0d1001229908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.685 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <uuid>b108f341-92ea-4b22-91ee-0d1001229908</uuid>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <name>instance-00000020</name>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <nova:name>tempest-ImagesTestJSON-server-776698232</nova:name>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:54:41</nova:creationTime>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:        <nova:user uuid="315be492c2ce4b9f8af2898e6794a256">tempest-ImagesTestJSON-1674785298-project-member</nova:user>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:        <nova:project uuid="78f8ba841bbe4fdcb9d9e2237d97bf73">tempest-ImagesTestJSON-1674785298</nova:project>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="087ac09b-91ea-4cd3-b62e-842ca8fac86a"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:        <nova:port uuid="60cb0054-846e-4b01-967b-f0041958e737">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <entry name="serial">b108f341-92ea-4b22-91ee-0d1001229908</entry>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <entry name="uuid">b108f341-92ea-4b22-91ee-0d1001229908</entry>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk.config"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:76:bd:1a"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <target dev="tap60cb0054-84"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/console.log" append="off"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:54:41 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:54:41 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:54:41 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:54:41 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.687 187156 DEBUG nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Preparing to wait for external event network-vif-plugged-60cb0054-846e-4b01-967b-f0041958e737 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.687 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "b108f341-92ea-4b22-91ee-0d1001229908-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.688 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.688 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.690 187156 DEBUG nova.virt.libvirt.vif [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-776698232',display_name='tempest-ImagesTestJSON-server-776698232',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-776698232',id=32,image_ref='087ac09b-91ea-4cd3-b62e-842ca8fac86a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-10oa4hya',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_
hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='ca75cc77-948e-4e28-a5db-b95961a337a5',image_min_disk='1',image_min_ram='0',image_owner_id='78f8ba841bbe4fdcb9d9e2237d97bf73',image_owner_project_name='tempest-ImagesTestJSON-1674785298',image_owner_user_name='tempest-ImagesTestJSON-1674785298-project-member',image_user_id='315be492c2ce4b9f8af2898e6794a256',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:54:37Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=b108f341-92ea-4b22-91ee-0d1001229908,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.690 187156 DEBUG nova.network.os_vif_util [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.691 187156 DEBUG nova.network.os_vif_util [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:bd:1a,bridge_name='br-int',has_traffic_filtering=True,id=60cb0054-846e-4b01-967b-f0041958e737,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cb0054-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.692 187156 DEBUG os_vif [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:bd:1a,bridge_name='br-int',has_traffic_filtering=True,id=60cb0054-846e-4b01-967b-f0041958e737,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cb0054-84') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.693 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.694 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.694 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.699 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.699 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60cb0054-84, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.700 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60cb0054-84, col_values=(('external_ids', {'iface-id': '60cb0054-846e-4b01-967b-f0041958e737', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:bd:1a', 'vm-uuid': 'b108f341-92ea-4b22-91ee-0d1001229908'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.702 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:41 np0005539504 NetworkManager[55210]: <info>  [1764399281.7045] manager: (tap60cb0054-84): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.707 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.714 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.717 187156 INFO os_vif [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:bd:1a,bridge_name='br-int',has_traffic_filtering=True,id=60cb0054-846e-4b01-967b-f0041958e737,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cb0054-84')#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.732 187156 DEBUG nova.network.neutron [-] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.753 187156 DEBUG nova.network.neutron [-] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.773 187156 INFO nova.compute.manager [-] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Took 0.17 seconds to deallocate network for instance.#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.779 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.780 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.780 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No VIF found with MAC fa:16:3e:76:bd:1a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.781 187156 INFO nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Using config drive#033[00m
Nov 29 01:54:41 np0005539504 nova_compute[187152]: 2025-11-29 06:54:41.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.022 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.022 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.023 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.023 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.056 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.057 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.122 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.162 187156 DEBUG nova.compute.provider_tree [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.170 187156 INFO nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Creating config drive at /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk.config#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.180 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppe436t1l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.211 187156 DEBUG nova.scheduler.client.report [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.218 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.219 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.251 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.433 187156 DEBUG oslo_concurrency.processutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppe436t1l" returned: 0 in 0.253s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.437 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908/disk --force-share --output=json" returned: 0 in 0.218s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.470 187156 INFO nova.scheduler.client.report [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Deleted allocations for instance 2213bf07-673d-465e-aa69-032b0bdc9ff2#033[00m
Nov 29 01:54:42 np0005539504 kernel: tap60cb0054-84: entered promiscuous mode
Nov 29 01:54:42 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:42Z|00103|binding|INFO|Claiming lport 60cb0054-846e-4b01-967b-f0041958e737 for this chassis.
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.528 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:42 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:42Z|00104|binding|INFO|60cb0054-846e-4b01-967b-f0041958e737: Claiming fa:16:3e:76:bd:1a 10.100.0.14
Nov 29 01:54:42 np0005539504 systemd-udevd[218354]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:54:42 np0005539504 NetworkManager[55210]: <info>  [1764399282.5318] manager: (tap60cb0054-84): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.532 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:42 np0005539504 NetworkManager[55210]: <info>  [1764399282.5432] device (tap60cb0054-84): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:54:42 np0005539504 NetworkManager[55210]: <info>  [1764399282.5440] device (tap60cb0054-84): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.543 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:bd:1a 10.100.0.14'], port_security=['fa:16:3e:76:bd:1a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b108f341-92ea-4b22-91ee-0d1001229908', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=60cb0054-846e-4b01-967b-f0041958e737) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.545 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 60cb0054-846e-4b01-967b-f0041958e737 in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba bound to our chassis#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.546 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.555 187156 DEBUG oslo_concurrency.lockutils [None req-6bd55afb-f24b-4013-a11d-54ac998897eb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2213bf07-673d-465e-aa69-032b0bdc9ff2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.559 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2835d6-966c-4c3c-9813-b70801ab3f40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.560 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17ec2ca4-31 in ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.562 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17ec2ca4-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.562 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a09b850f-1157-4fbd-9e1b-623c67bfe684]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.563 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b61394-471f-49e6-85c6-d363b41d5e73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.577 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[d74631ad-647f-4251-8be9-7352746141cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 systemd-machined[153423]: New machine qemu-17-instance-00000020.
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.591 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.594 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd3bbac-b6b0-4650-a57f-1c64bac88e98]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:42Z|00105|binding|INFO|Setting lport 60cb0054-846e-4b01-967b-f0041958e737 ovn-installed in OVS
Nov 29 01:54:42 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:42Z|00106|binding|INFO|Setting lport 60cb0054-846e-4b01-967b-f0041958e737 up in Southbound
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.597 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:42 np0005539504 systemd[1]: Started Virtual Machine qemu-17-instance-00000020.
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.629 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ea33cee3-b72c-4031-b2c3-e7bef4eb2528]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 NetworkManager[55210]: <info>  [1764399282.6358] manager: (tap17ec2ca4-30): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.634 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc100df-30e6-484f-a743-04026bc011aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 systemd-udevd[218400]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.665 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[6219e678-589e-4416-827f-d34eaeaec2c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.668 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[a7123b48-69fa-434e-8d25-ed46a66f7b8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 NetworkManager[55210]: <info>  [1764399282.6910] device (tap17ec2ca4-30): carrier: link connected
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.696 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[24503097-4709-481f-a6e5-5029fd8002e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.713 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6adb4068-f540-4c15-a470-721895282298]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474557, 'reachable_time': 21299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218434, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.715 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.717 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5741MB free_disk=73.20581436157227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.717 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.717 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.727 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3b9f3a20-aafe-49df-9b1c-e016809bb8a1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:556b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474557, 'tstamp': 474557}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218435, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.746 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f3b0570f-f818-487e-841c-380e0c2ac687]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474557, 'reachable_time': 21299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218436, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.775 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2f4ba6-e847-428c-b447-8c4583d7bf69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.779 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance b108f341-92ea-4b22-91ee-0d1001229908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.780 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.780 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.818 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.826 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eafda4c9-57fc-462f-b244-53583dae105b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.827 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.827 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.828 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17ec2ca4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.830 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:42 np0005539504 kernel: tap17ec2ca4-30: entered promiscuous mode
Nov 29 01:54:42 np0005539504 NetworkManager[55210]: <info>  [1764399282.8332] manager: (tap17ec2ca4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.833 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17ec2ca4-30, col_values=(('external_ids', {'iface-id': '97d66506-c891-4bf7-8595-2d091560f247'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:42 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:42Z|00107|binding|INFO|Releasing lport 97d66506-c891-4bf7-8595-2d091560f247 from this chassis (sb_readonly=0)
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.834 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.845 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.851 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.852 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.852 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[db580ee9-1804-4643-9879-e24f472e85fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.853 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:54:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:42.854 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'env', 'PROCESS_TAG=haproxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.876 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:54:42 np0005539504 nova_compute[187152]: 2025-11-29 06:54:42.877 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.062 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399283.0617476, b108f341-92ea-4b22-91ee-0d1001229908 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.062 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] VM Started (Lifecycle Event)#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.079 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.083 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399283.061955, b108f341-92ea-4b22-91ee-0d1001229908 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.084 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.100 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.104 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.122 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:54:43 np0005539504 podman[218475]: 2025-11-29 06:54:43.254427809 +0000 UTC m=+0.056172070 container create 419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 01:54:43 np0005539504 systemd[1]: Started libpod-conmon-419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0.scope.
Nov 29 01:54:43 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:54:43 np0005539504 podman[218475]: 2025-11-29 06:54:43.222903927 +0000 UTC m=+0.024648208 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:54:43 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc36e99bc4c87b30c9525d66c6bc4cc4245a6f702c836af95795c8fe2e6bb02e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:54:43 np0005539504 podman[218475]: 2025-11-29 06:54:43.340730603 +0000 UTC m=+0.142474884 container init 419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:54:43 np0005539504 podman[218475]: 2025-11-29 06:54:43.346035197 +0000 UTC m=+0.147779458 container start 419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:54:43 np0005539504 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218490]: [NOTICE]   (218494) : New worker (218496) forked
Nov 29 01:54:43 np0005539504 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218490]: [NOTICE]   (218494) : Loading success.
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.523 187156 DEBUG nova.compute.manager [req-521b9805-e758-4de5-821b-8fbbcfcdf245 req-a5db05c1-0f7d-4016-b8a1-66753e04c0c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Received event network-vif-plugged-60cb0054-846e-4b01-967b-f0041958e737 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.523 187156 DEBUG oslo_concurrency.lockutils [req-521b9805-e758-4de5-821b-8fbbcfcdf245 req-a5db05c1-0f7d-4016-b8a1-66753e04c0c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b108f341-92ea-4b22-91ee-0d1001229908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.524 187156 DEBUG oslo_concurrency.lockutils [req-521b9805-e758-4de5-821b-8fbbcfcdf245 req-a5db05c1-0f7d-4016-b8a1-66753e04c0c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.524 187156 DEBUG oslo_concurrency.lockutils [req-521b9805-e758-4de5-821b-8fbbcfcdf245 req-a5db05c1-0f7d-4016-b8a1-66753e04c0c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.525 187156 DEBUG nova.compute.manager [req-521b9805-e758-4de5-821b-8fbbcfcdf245 req-a5db05c1-0f7d-4016-b8a1-66753e04c0c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Processing event network-vif-plugged-60cb0054-846e-4b01-967b-f0041958e737 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.525 187156 DEBUG nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.536 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399283.5324564, b108f341-92ea-4b22-91ee-0d1001229908 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.536 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.539 187156 DEBUG nova.virt.libvirt.driver [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.544 187156 INFO nova.virt.libvirt.driver [-] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Instance spawned successfully.#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.545 187156 INFO nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Took 6.41 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.545 187156 DEBUG nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.576 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.581 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.638 187156 INFO nova.compute.manager [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Took 7.15 seconds to build instance.#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.662 187156 DEBUG oslo_concurrency.lockutils [None req-5a88ebea-7434-4c74-a4ef-52f585627cb6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.783 187156 DEBUG nova.network.neutron [req-13dd5048-a54f-49da-8220-f58729477133 req-cfdb1b2f-5cc5-4e76-953e-f0b65969dc80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Updated VIF entry in instance network info cache for port 60cb0054-846e-4b01-967b-f0041958e737. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.783 187156 DEBUG nova.network.neutron [req-13dd5048-a54f-49da-8220-f58729477133 req-cfdb1b2f-5cc5-4e76-953e-f0b65969dc80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Updating instance_info_cache with network_info: [{"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:43 np0005539504 nova_compute[187152]: 2025-11-29 06:54:43.813 187156 DEBUG oslo_concurrency.lockutils [req-13dd5048-a54f-49da-8220-f58729477133 req-cfdb1b2f-5cc5-4e76-953e-f0b65969dc80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b108f341-92ea-4b22-91ee-0d1001229908" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:54:44 np0005539504 nova_compute[187152]: 2025-11-29 06:54:44.187 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.247 187156 DEBUG oslo_concurrency.lockutils [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "b108f341-92ea-4b22-91ee-0d1001229908" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.248 187156 DEBUG oslo_concurrency.lockutils [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.248 187156 DEBUG oslo_concurrency.lockutils [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "b108f341-92ea-4b22-91ee-0d1001229908-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.249 187156 DEBUG oslo_concurrency.lockutils [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.249 187156 DEBUG oslo_concurrency.lockutils [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.261 187156 INFO nova.compute.manager [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Terminating instance#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.273 187156 DEBUG nova.compute.manager [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:54:45 np0005539504 kernel: tap60cb0054-84 (unregistering): left promiscuous mode
Nov 29 01:54:45 np0005539504 NetworkManager[55210]: <info>  [1764399285.2957] device (tap60cb0054-84): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.349 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:45 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:45Z|00108|binding|INFO|Releasing lport 60cb0054-846e-4b01-967b-f0041958e737 from this chassis (sb_readonly=0)
Nov 29 01:54:45 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:45Z|00109|binding|INFO|Setting lport 60cb0054-846e-4b01-967b-f0041958e737 down in Southbound
Nov 29 01:54:45 np0005539504 ovn_controller[95182]: 2025-11-29T06:54:45Z|00110|binding|INFO|Removing iface tap60cb0054-84 ovn-installed in OVS
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.357 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:bd:1a 10.100.0.14'], port_security=['fa:16:3e:76:bd:1a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b108f341-92ea-4b22-91ee-0d1001229908', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=60cb0054-846e-4b01-967b-f0041958e737) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.358 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 60cb0054-846e-4b01-967b-f0041958e737 in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba unbound from our chassis#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.359 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.360 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3c699266-2ad5-402d-9094-c6f62774201c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.361 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace which is not needed anymore#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.363 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:45 np0005539504 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000020.scope: Deactivated successfully.
Nov 29 01:54:45 np0005539504 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000020.scope: Consumed 2.271s CPU time.
Nov 29 01:54:45 np0005539504 systemd-machined[153423]: Machine qemu-17-instance-00000020 terminated.
Nov 29 01:54:45 np0005539504 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218490]: [NOTICE]   (218494) : haproxy version is 2.8.14-c23fe91
Nov 29 01:54:45 np0005539504 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218490]: [NOTICE]   (218494) : path to executable is /usr/sbin/haproxy
Nov 29 01:54:45 np0005539504 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218490]: [WARNING]  (218494) : Exiting Master process...
Nov 29 01:54:45 np0005539504 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218490]: [ALERT]    (218494) : Current worker (218496) exited with code 143 (Terminated)
Nov 29 01:54:45 np0005539504 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218490]: [WARNING]  (218494) : All workers exited. Exiting... (0)
Nov 29 01:54:45 np0005539504 systemd[1]: libpod-419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0.scope: Deactivated successfully.
Nov 29 01:54:45 np0005539504 conmon[218490]: conmon 419f74f06032d324062a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0.scope/container/memory.events
Nov 29 01:54:45 np0005539504 podman[218526]: 2025-11-29 06:54:45.51195154 +0000 UTC m=+0.053195950 container died 419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.531 187156 INFO nova.virt.libvirt.driver [-] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Instance destroyed successfully.#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.531 187156 DEBUG nova.objects.instance [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'resources' on Instance uuid b108f341-92ea-4b22-91ee-0d1001229908 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:45 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0-userdata-shm.mount: Deactivated successfully.
Nov 29 01:54:45 np0005539504 systemd[1]: var-lib-containers-storage-overlay-fc36e99bc4c87b30c9525d66c6bc4cc4245a6f702c836af95795c8fe2e6bb02e-merged.mount: Deactivated successfully.
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.545 187156 DEBUG nova.virt.libvirt.vif [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-776698232',display_name='tempest-ImagesTestJSON-server-776698232',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagestestjson-server-776698232',id=32,image_ref='087ac09b-91ea-4cd3-b62e-842ca8fac86a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:54:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-10oa4hya',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='ca75cc77-948e-4e28-a5db-b95961a337a5',image_min_disk='1',image_min_ram='0',image_owner_id='78f8ba841bbe4fdcb9d9e2237d97bf73',image_owner_project_name='tempest-ImagesTestJSON-1674785298',image_owner_user_name='tempest-ImagesTestJSON-1674785298-project-member',image_user_id='315be492c2ce4b9f8af2898e6794a256',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:54:43Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=b108f341-92ea-4b22-91ee-0d1001229908,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.546 187156 DEBUG nova.network.os_vif_util [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "60cb0054-846e-4b01-967b-f0041958e737", "address": "fa:16:3e:76:bd:1a", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cb0054-84", "ovs_interfaceid": "60cb0054-846e-4b01-967b-f0041958e737", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:45 np0005539504 podman[218526]: 2025-11-29 06:54:45.547544752 +0000 UTC m=+0.088789162 container cleanup 419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.547 187156 DEBUG nova.network.os_vif_util [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:bd:1a,bridge_name='br-int',has_traffic_filtering=True,id=60cb0054-846e-4b01-967b-f0041958e737,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cb0054-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.547 187156 DEBUG os_vif [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:bd:1a,bridge_name='br-int',has_traffic_filtering=True,id=60cb0054-846e-4b01-967b-f0041958e737,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cb0054-84') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.549 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.550 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60cb0054-84, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.551 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.552 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.555 187156 INFO os_vif [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:bd:1a,bridge_name='br-int',has_traffic_filtering=True,id=60cb0054-846e-4b01-967b-f0041958e737,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cb0054-84')#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.556 187156 INFO nova.virt.libvirt.driver [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Deleting instance files /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908_del#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.557 187156 INFO nova.virt.libvirt.driver [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Deletion of /var/lib/nova/instances/b108f341-92ea-4b22-91ee-0d1001229908_del complete#033[00m
Nov 29 01:54:45 np0005539504 systemd[1]: libpod-conmon-419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0.scope: Deactivated successfully.
Nov 29 01:54:45 np0005539504 podman[218571]: 2025-11-29 06:54:45.61547549 +0000 UTC m=+0.043624622 container remove 419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.621 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0e723eb0-973d-4663-a626-cbbe7d206d9a]: (4, ('Sat Nov 29 06:54:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0)\n419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0\nSat Nov 29 06:54:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0)\n419f74f06032d324062ab79fead382c445607af30fe2c8d418f0382d28bda4f0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.623 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d9fa1585-57a1-4266-8b9a-3848baae0242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.625 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:45 np0005539504 kernel: tap17ec2ca4-30: left promiscuous mode
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.630 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.633 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[08602be3-80be-4cf3-821f-2edc7b46761e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.635 187156 INFO nova.compute.manager [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.635 187156 DEBUG oslo.service.loopingcall [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.635 187156 DEBUG nova.compute.manager [-] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.636 187156 DEBUG nova.network.neutron [-] [instance: b108f341-92ea-4b22-91ee-0d1001229908] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.643 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.661 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[267de51a-003a-41ba-8a8f-50a477dc2910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.663 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9acfa7ee-940b-4d09-a407-705635f97b17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.677 187156 DEBUG nova.compute.manager [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Received event network-vif-plugged-60cb0054-846e-4b01-967b-f0041958e737 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.677 187156 DEBUG oslo_concurrency.lockutils [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b108f341-92ea-4b22-91ee-0d1001229908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.678 187156 DEBUG oslo_concurrency.lockutils [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.678 187156 DEBUG oslo_concurrency.lockutils [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.678 187156 DEBUG nova.compute.manager [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] No waiting events found dispatching network-vif-plugged-60cb0054-846e-4b01-967b-f0041958e737 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.678 187156 WARNING nova.compute.manager [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Received unexpected event network-vif-plugged-60cb0054-846e-4b01-967b-f0041958e737 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.679 187156 DEBUG nova.compute.manager [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Received event network-vif-unplugged-60cb0054-846e-4b01-967b-f0041958e737 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.679 187156 DEBUG oslo_concurrency.lockutils [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b108f341-92ea-4b22-91ee-0d1001229908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.679 187156 DEBUG oslo_concurrency.lockutils [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.679 187156 DEBUG oslo_concurrency.lockutils [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.679 187156 DEBUG nova.compute.manager [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] No waiting events found dispatching network-vif-unplugged-60cb0054-846e-4b01-967b-f0041958e737 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.679 187156 DEBUG nova.compute.manager [req-9a08e025-106c-44ff-92e8-e812f80ae53d req-3e509695-b36a-43ab-95e8-75f98b9c7c55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Received event network-vif-unplugged-60cb0054-846e-4b01-967b-f0041958e737 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.682 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6719337d-b830-49f3-bd98-8a1b69fe8161]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474550, 'reachable_time': 33883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218587, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:45 np0005539504 systemd[1]: run-netns-ovnmeta\x2d17ec2ca4\x2d3fa9\x2d41aa\x2d80ef\x2d35bf92d404ba.mount: Deactivated successfully.
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.689 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:54:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:54:45.689 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[182b4c48-106e-4954-84d1-c64619d04ac0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.877 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.878 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.879 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.898 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.899 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.899 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.900 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:54:45 np0005539504 nova_compute[187152]: 2025-11-29 06:54:45.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:46 np0005539504 nova_compute[187152]: 2025-11-29 06:54:46.825 187156 DEBUG nova.network.neutron [-] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:46 np0005539504 nova_compute[187152]: 2025-11-29 06:54:46.872 187156 INFO nova.compute.manager [-] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Took 1.24 seconds to deallocate network for instance.#033[00m
Nov 29 01:54:46 np0005539504 nova_compute[187152]: 2025-11-29 06:54:46.986 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399271.9845347, 2e380200-8276-4470-965f-31baa0bfd760 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:46 np0005539504 nova_compute[187152]: 2025-11-29 06:54:46.987 187156 INFO nova.compute.manager [-] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.002 187156 DEBUG oslo_concurrency.lockutils [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.003 187156 DEBUG oslo_concurrency.lockutils [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.025 187156 DEBUG nova.compute.manager [None req-94b7bcfd-e1c1-44e1-b181-0a0568e98dee - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.064 187156 DEBUG nova.compute.provider_tree [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.491 187156 DEBUG nova.scheduler.client.report [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.562 187156 DEBUG oslo_concurrency.lockutils [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.592 187156 INFO nova.scheduler.client.report [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Deleted allocations for instance b108f341-92ea-4b22-91ee-0d1001229908#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.698 187156 DEBUG oslo_concurrency.lockutils [None req-c3055c57-592c-4451-99a5-195fdc726258 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.450s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.872 187156 DEBUG nova.compute.manager [req-f0d5fdba-1919-4a42-86b6-0506fc68453b req-9ae2e11e-4e1d-4905-a45f-f7c8cb6725de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Received event network-vif-plugged-60cb0054-846e-4b01-967b-f0041958e737 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.872 187156 DEBUG oslo_concurrency.lockutils [req-f0d5fdba-1919-4a42-86b6-0506fc68453b req-9ae2e11e-4e1d-4905-a45f-f7c8cb6725de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b108f341-92ea-4b22-91ee-0d1001229908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.873 187156 DEBUG oslo_concurrency.lockutils [req-f0d5fdba-1919-4a42-86b6-0506fc68453b req-9ae2e11e-4e1d-4905-a45f-f7c8cb6725de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.873 187156 DEBUG oslo_concurrency.lockutils [req-f0d5fdba-1919-4a42-86b6-0506fc68453b req-9ae2e11e-4e1d-4905-a45f-f7c8cb6725de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b108f341-92ea-4b22-91ee-0d1001229908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.873 187156 DEBUG nova.compute.manager [req-f0d5fdba-1919-4a42-86b6-0506fc68453b req-9ae2e11e-4e1d-4905-a45f-f7c8cb6725de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] No waiting events found dispatching network-vif-plugged-60cb0054-846e-4b01-967b-f0041958e737 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.873 187156 WARNING nova.compute.manager [req-f0d5fdba-1919-4a42-86b6-0506fc68453b req-9ae2e11e-4e1d-4905-a45f-f7c8cb6725de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Received unexpected event network-vif-plugged-60cb0054-846e-4b01-967b-f0041958e737 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:54:47 np0005539504 nova_compute[187152]: 2025-11-29 06:54:47.874 187156 DEBUG nova.compute.manager [req-f0d5fdba-1919-4a42-86b6-0506fc68453b req-9ae2e11e-4e1d-4905-a45f-f7c8cb6725de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Received event network-vif-deleted-60cb0054-846e-4b01-967b-f0041958e737 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.956 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.958 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.958 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.958 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.958 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.959 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.960 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.960 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.960 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.960 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.960 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.960 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.960 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.960 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.961 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.961 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.961 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:54:47.961 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:54:48 np0005539504 podman[218588]: 2025-11-29 06:54:48.707725543 +0000 UTC m=+0.053423416 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:54:49 np0005539504 nova_compute[187152]: 2025-11-29 06:54:49.189 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:50 np0005539504 nova_compute[187152]: 2025-11-29 06:54:50.553 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:51 np0005539504 podman[218609]: 2025-11-29 06:54:51.714943157 +0000 UTC m=+0.054520336 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:54:51 np0005539504 podman[218610]: 2025-11-29 06:54:51.721919716 +0000 UTC m=+0.059267374 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc.)
Nov 29 01:54:54 np0005539504 nova_compute[187152]: 2025-11-29 06:54:54.191 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:55 np0005539504 nova_compute[187152]: 2025-11-29 06:54:55.592 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:56 np0005539504 nova_compute[187152]: 2025-11-29 06:54:56.475 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399281.4728546, 2213bf07-673d-465e-aa69-032b0bdc9ff2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:56 np0005539504 nova_compute[187152]: 2025-11-29 06:54:56.475 187156 INFO nova.compute.manager [-] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:54:56 np0005539504 nova_compute[187152]: 2025-11-29 06:54:56.506 187156 DEBUG nova.compute.manager [None req-37bd04ee-deee-4441-9eb6-30af49448854 - - - - - -] [instance: 2213bf07-673d-465e-aa69-032b0bdc9ff2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:59 np0005539504 nova_compute[187152]: 2025-11-29 06:54:59.193 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:00 np0005539504 nova_compute[187152]: 2025-11-29 06:55:00.531 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399285.528641, b108f341-92ea-4b22-91ee-0d1001229908 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:55:00 np0005539504 nova_compute[187152]: 2025-11-29 06:55:00.531 187156 INFO nova.compute.manager [-] [instance: b108f341-92ea-4b22-91ee-0d1001229908] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:55:00 np0005539504 nova_compute[187152]: 2025-11-29 06:55:00.584 187156 DEBUG nova.compute.manager [None req-d566bcf0-57d7-43bd-bfd8-e73d47200305 - - - - - -] [instance: b108f341-92ea-4b22-91ee-0d1001229908] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:55:00 np0005539504 nova_compute[187152]: 2025-11-29 06:55:00.595 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:01 np0005539504 podman[218653]: 2025-11-29 06:55:01.737463786 +0000 UTC m=+0.083214121 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:55:01 np0005539504 podman[218654]: 2025-11-29 06:55:01.740291803 +0000 UTC m=+0.082954904 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:55:04 np0005539504 nova_compute[187152]: 2025-11-29 06:55:04.195 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:05 np0005539504 nova_compute[187152]: 2025-11-29 06:55:05.632 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:06 np0005539504 podman[218704]: 2025-11-29 06:55:06.739874604 +0000 UTC m=+0.074937697 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 01:55:06 np0005539504 nova_compute[187152]: 2025-11-29 06:55:06.956 187156 DEBUG nova.compute.manager [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.154 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.155 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.186 187156 DEBUG nova.objects.instance [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'pci_requests' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.208 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.209 187156 INFO nova.compute.claims [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.210 187156 DEBUG nova.objects.instance [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'resources' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.226 187156 DEBUG nova.objects.instance [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'numa_topology' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.238 187156 DEBUG nova.objects.instance [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.300 187156 INFO nova.compute.resource_tracker [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updating resource usage from migration 3da3d3f5-2569-489b-86cb-65e3eee7704d#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.300 187156 DEBUG nova.compute.resource_tracker [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Starting to track incoming migration 3da3d3f5-2569-489b-86cb-65e3eee7704d with flavor 1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.486 187156 DEBUG nova.compute.provider_tree [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.520 187156 DEBUG nova.scheduler.client.report [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.607 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:07 np0005539504 nova_compute[187152]: 2025-11-29 06:55:07.607 187156 INFO nova.compute.manager [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Migrating#033[00m
Nov 29 01:55:09 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:55:09 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:55:09 np0005539504 systemd-logind[783]: New session 31 of user nova.
Nov 29 01:55:09 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:55:09 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:55:09 np0005539504 nova_compute[187152]: 2025-11-29 06:55:09.245 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:09 np0005539504 systemd[218728]: Queued start job for default target Main User Target.
Nov 29 01:55:09 np0005539504 systemd[218728]: Created slice User Application Slice.
Nov 29 01:55:09 np0005539504 systemd[218728]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:55:09 np0005539504 systemd[218728]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:55:09 np0005539504 systemd[218728]: Reached target Paths.
Nov 29 01:55:09 np0005539504 systemd[218728]: Reached target Timers.
Nov 29 01:55:09 np0005539504 systemd[218728]: Starting D-Bus User Message Bus Socket...
Nov 29 01:55:09 np0005539504 systemd[218728]: Starting Create User's Volatile Files and Directories...
Nov 29 01:55:09 np0005539504 systemd[218728]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:55:09 np0005539504 systemd[218728]: Reached target Sockets.
Nov 29 01:55:09 np0005539504 systemd[218728]: Finished Create User's Volatile Files and Directories.
Nov 29 01:55:09 np0005539504 systemd[218728]: Reached target Basic System.
Nov 29 01:55:09 np0005539504 systemd[218728]: Reached target Main User Target.
Nov 29 01:55:09 np0005539504 systemd[218728]: Startup finished in 178ms.
Nov 29 01:55:09 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:55:09 np0005539504 systemd[1]: Started Session 31 of User nova.
Nov 29 01:55:09 np0005539504 systemd[1]: session-31.scope: Deactivated successfully.
Nov 29 01:55:09 np0005539504 systemd-logind[783]: Session 31 logged out. Waiting for processes to exit.
Nov 29 01:55:09 np0005539504 systemd-logind[783]: Removed session 31.
Nov 29 01:55:09 np0005539504 systemd-logind[783]: New session 33 of user nova.
Nov 29 01:55:09 np0005539504 systemd[1]: Started Session 33 of User nova.
Nov 29 01:55:09 np0005539504 systemd[1]: session-33.scope: Deactivated successfully.
Nov 29 01:55:09 np0005539504 systemd-logind[783]: Session 33 logged out. Waiting for processes to exit.
Nov 29 01:55:09 np0005539504 systemd-logind[783]: Removed session 33.
Nov 29 01:55:10 np0005539504 nova_compute[187152]: 2025-11-29 06:55:10.635 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:11 np0005539504 podman[218750]: 2025-11-29 06:55:11.732674636 +0000 UTC m=+0.075755989 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 01:55:14 np0005539504 nova_compute[187152]: 2025-11-29 06:55:14.247 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:15 np0005539504 nova_compute[187152]: 2025-11-29 06:55:15.638 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:19 np0005539504 nova_compute[187152]: 2025-11-29 06:55:19.249 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:19 np0005539504 podman[218772]: 2025-11-29 06:55:19.733010927 +0000 UTC m=+0.072936112 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:55:19 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:55:19 np0005539504 systemd[218728]: Activating special unit Exit the Session...
Nov 29 01:55:19 np0005539504 systemd[218728]: Stopped target Main User Target.
Nov 29 01:55:19 np0005539504 systemd[218728]: Stopped target Basic System.
Nov 29 01:55:19 np0005539504 systemd[218728]: Stopped target Paths.
Nov 29 01:55:19 np0005539504 systemd[218728]: Stopped target Sockets.
Nov 29 01:55:19 np0005539504 systemd[218728]: Stopped target Timers.
Nov 29 01:55:19 np0005539504 systemd[218728]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:55:19 np0005539504 systemd[218728]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:55:19 np0005539504 systemd[218728]: Closed D-Bus User Message Bus Socket.
Nov 29 01:55:19 np0005539504 systemd[218728]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:55:19 np0005539504 systemd[218728]: Removed slice User Application Slice.
Nov 29 01:55:19 np0005539504 systemd[218728]: Reached target Shutdown.
Nov 29 01:55:19 np0005539504 systemd[218728]: Finished Exit the Session.
Nov 29 01:55:19 np0005539504 systemd[218728]: Reached target Exit the Session.
Nov 29 01:55:19 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:55:19 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:55:19 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:55:19 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:55:19 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:55:19 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:55:19 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:55:20 np0005539504 nova_compute[187152]: 2025-11-29 06:55:20.694 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:55:22.911 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:55:22.911 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:55:22.911 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:23 np0005539504 podman[218795]: 2025-11-29 06:55:23.978765895 +0000 UTC m=+0.046953836 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:55:24 np0005539504 podman[218798]: 2025-11-29 06:55:24.014566518 +0000 UTC m=+0.079484801 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Nov 29 01:55:24 np0005539504 systemd-logind[783]: New session 34 of user nova.
Nov 29 01:55:24 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:55:24 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:55:24 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:55:24 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:55:24 np0005539504 systemd[218846]: Queued start job for default target Main User Target.
Nov 29 01:55:24 np0005539504 systemd[218846]: Created slice User Application Slice.
Nov 29 01:55:24 np0005539504 systemd[218846]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:55:24 np0005539504 systemd[218846]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:55:24 np0005539504 systemd[218846]: Reached target Paths.
Nov 29 01:55:24 np0005539504 systemd[218846]: Reached target Timers.
Nov 29 01:55:24 np0005539504 systemd[218846]: Starting D-Bus User Message Bus Socket...
Nov 29 01:55:24 np0005539504 systemd[218846]: Starting Create User's Volatile Files and Directories...
Nov 29 01:55:24 np0005539504 nova_compute[187152]: 2025-11-29 06:55:24.250 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:24 np0005539504 systemd[218846]: Finished Create User's Volatile Files and Directories.
Nov 29 01:55:24 np0005539504 systemd[218846]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:55:24 np0005539504 systemd[218846]: Reached target Sockets.
Nov 29 01:55:24 np0005539504 systemd[218846]: Reached target Basic System.
Nov 29 01:55:24 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:55:24 np0005539504 systemd[218846]: Reached target Main User Target.
Nov 29 01:55:24 np0005539504 systemd[218846]: Startup finished in 149ms.
Nov 29 01:55:24 np0005539504 systemd[1]: Started Session 34 of User nova.
Nov 29 01:55:24 np0005539504 systemd[1]: session-34.scope: Deactivated successfully.
Nov 29 01:55:24 np0005539504 systemd-logind[783]: Session 34 logged out. Waiting for processes to exit.
Nov 29 01:55:24 np0005539504 systemd-logind[783]: Removed session 34.
Nov 29 01:55:24 np0005539504 systemd-logind[783]: New session 36 of user nova.
Nov 29 01:55:24 np0005539504 systemd[1]: Started Session 36 of User nova.
Nov 29 01:55:24 np0005539504 systemd[1]: session-36.scope: Deactivated successfully.
Nov 29 01:55:24 np0005539504 systemd-logind[783]: Session 36 logged out. Waiting for processes to exit.
Nov 29 01:55:24 np0005539504 systemd-logind[783]: Removed session 36.
Nov 29 01:55:25 np0005539504 systemd-logind[783]: New session 37 of user nova.
Nov 29 01:55:25 np0005539504 systemd[1]: Started Session 37 of User nova.
Nov 29 01:55:25 np0005539504 systemd[1]: session-37.scope: Deactivated successfully.
Nov 29 01:55:25 np0005539504 systemd-logind[783]: Session 37 logged out. Waiting for processes to exit.
Nov 29 01:55:25 np0005539504 systemd-logind[783]: Removed session 37.
Nov 29 01:55:25 np0005539504 nova_compute[187152]: 2025-11-29 06:55:25.698 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:26 np0005539504 nova_compute[187152]: 2025-11-29 06:55:26.251 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:55:26 np0005539504 nova_compute[187152]: 2025-11-29 06:55:26.252 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquired lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:55:26 np0005539504 nova_compute[187152]: 2025-11-29 06:55:26.253 187156 DEBUG nova.network.neutron [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:55:26 np0005539504 nova_compute[187152]: 2025-11-29 06:55:26.705 187156 DEBUG nova.network.neutron [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:55:27 np0005539504 nova_compute[187152]: 2025-11-29 06:55:27.932 187156 DEBUG nova.network.neutron [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:55:27 np0005539504 nova_compute[187152]: 2025-11-29 06:55:27.955 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Releasing lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.128 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.130 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.131 187156 INFO nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Creating image(s)#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.132 187156 DEBUG nova.objects.instance [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.145 187156 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.224 187156 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.226 187156 DEBUG nova.virt.disk.api [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Checking if we can resize image /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.227 187156 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.317 187156 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.319 187156 DEBUG nova.virt.disk.api [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Cannot resize image /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.342 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.343 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Ensure instance console log exists: /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.343 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.344 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.344 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.347 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.354 187156 WARNING nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.373 187156 DEBUG nova.virt.libvirt.host [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.374 187156 DEBUG nova.virt.libvirt.host [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.377 187156 DEBUG nova.virt.libvirt.host [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.377 187156 DEBUG nova.virt.libvirt.host [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.379 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.379 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.380 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.380 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.380 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.381 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.381 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.381 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.382 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.382 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.382 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.383 187156 DEBUG nova.virt.hardware [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.383 187156 DEBUG nova.objects.instance [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.410 187156 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.496 187156 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.498 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.499 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.501 187156 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.506 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <uuid>66b9235f-7cc8-40d4-877b-b690613298a4</uuid>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <name>instance-00000021</name>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <nova:name>tempest-MigrationsAdminTest-server-2086906237</nova:name>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:55:28</nova:creationTime>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:        <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:        <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <entry name="serial">66b9235f-7cc8-40d4-877b-b690613298a4</entry>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <entry name="uuid">66b9235f-7cc8-40d4-877b-b690613298a4</entry>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/console.log" append="off"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:55:28 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:55:28 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:55:28 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:55:28 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.568 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.569 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:55:28 np0005539504 nova_compute[187152]: 2025-11-29 06:55:28.570 187156 INFO nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Using config drive#033[00m
Nov 29 01:55:28 np0005539504 systemd-machined[153423]: New machine qemu-18-instance-00000021.
Nov 29 01:55:28 np0005539504 systemd[1]: Started Virtual Machine qemu-18-instance-00000021.
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.006 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399329.0058274, 66b9235f-7cc8-40d4-877b-b690613298a4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.007 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.009 187156 DEBUG nova.compute.manager [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.013 187156 INFO nova.virt.libvirt.driver [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance running successfully.#033[00m
Nov 29 01:55:29 np0005539504 virtqemud[186569]: argument unsupported: QEMU guest agent is not configured
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.016 187156 DEBUG nova.virt.libvirt.guest [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.017 187156 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.041 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.048 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.103 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.104 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399329.0072129, 66b9235f-7cc8-40d4-877b-b690613298a4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.104 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] VM Started (Lifecycle Event)#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.141 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.145 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.254 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:29 np0005539504 nova_compute[187152]: 2025-11-29 06:55:29.958 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:30 np0005539504 nova_compute[187152]: 2025-11-29 06:55:30.699 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:32 np0005539504 podman[218908]: 2025-11-29 06:55:32.725913085 +0000 UTC m=+0.061034659 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:55:32 np0005539504 podman[218909]: 2025-11-29 06:55:32.759204529 +0000 UTC m=+0.094802365 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:55:33 np0005539504 nova_compute[187152]: 2025-11-29 06:55:33.333 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:55:33.336 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:55:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:55:33.339 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:55:34 np0005539504 nova_compute[187152]: 2025-11-29 06:55:34.255 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:35 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:55:35 np0005539504 systemd[218846]: Activating special unit Exit the Session...
Nov 29 01:55:35 np0005539504 systemd[218846]: Stopped target Main User Target.
Nov 29 01:55:35 np0005539504 systemd[218846]: Stopped target Basic System.
Nov 29 01:55:35 np0005539504 systemd[218846]: Stopped target Paths.
Nov 29 01:55:35 np0005539504 systemd[218846]: Stopped target Sockets.
Nov 29 01:55:35 np0005539504 systemd[218846]: Stopped target Timers.
Nov 29 01:55:35 np0005539504 systemd[218846]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:55:35 np0005539504 systemd[218846]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:55:35 np0005539504 systemd[218846]: Closed D-Bus User Message Bus Socket.
Nov 29 01:55:35 np0005539504 systemd[218846]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:55:35 np0005539504 systemd[218846]: Removed slice User Application Slice.
Nov 29 01:55:35 np0005539504 systemd[218846]: Reached target Shutdown.
Nov 29 01:55:35 np0005539504 systemd[218846]: Finished Exit the Session.
Nov 29 01:55:35 np0005539504 systemd[218846]: Reached target Exit the Session.
Nov 29 01:55:35 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:55:35 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:55:35 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:55:35 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:55:35 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:55:35 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:55:35 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:55:35 np0005539504 nova_compute[187152]: 2025-11-29 06:55:35.701 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:37 np0005539504 podman[218957]: 2025-11-29 06:55:37.722338135 +0000 UTC m=+0.064680397 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:55:38 np0005539504 nova_compute[187152]: 2025-11-29 06:55:38.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:39 np0005539504 nova_compute[187152]: 2025-11-29 06:55:39.258 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:39 np0005539504 nova_compute[187152]: 2025-11-29 06:55:39.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:40 np0005539504 nova_compute[187152]: 2025-11-29 06:55:40.705 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:40 np0005539504 nova_compute[187152]: 2025-11-29 06:55:40.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:40 np0005539504 nova_compute[187152]: 2025-11-29 06:55:40.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:41 np0005539504 nova_compute[187152]: 2025-11-29 06:55:41.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:42 np0005539504 podman[218982]: 2025-11-29 06:55:42.718715894 +0000 UTC m=+0.061489921 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 01:55:42 np0005539504 nova_compute[187152]: 2025-11-29 06:55:42.803 187156 DEBUG nova.compute.manager [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 29 01:55:42 np0005539504 nova_compute[187152]: 2025-11-29 06:55:42.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:42 np0005539504 nova_compute[187152]: 2025-11-29 06:55:42.946 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:42 np0005539504 nova_compute[187152]: 2025-11-29 06:55:42.947 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:42 np0005539504 nova_compute[187152]: 2025-11-29 06:55:42.998 187156 DEBUG nova.objects.instance [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'pci_requests' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.017 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.018 187156 INFO nova.compute.claims [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.018 187156 DEBUG nova.objects.instance [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'resources' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.029 187156 DEBUG nova.objects.instance [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.078 187156 INFO nova.compute.resource_tracker [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating resource usage from migration 6f1aff30-664d-4ffd-b820-c05fab910a30#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.079 187156 DEBUG nova.compute.resource_tracker [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Starting to track incoming migration 6f1aff30-664d-4ffd-b820-c05fab910a30 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.177 187156 DEBUG nova.compute.provider_tree [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.199 187156 DEBUG nova.scheduler.client.report [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.224 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.225 187156 INFO nova.compute.manager [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Migrating#033[00m
Nov 29 01:55:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:55:43.342 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.977 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.978 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.978 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:43 np0005539504 nova_compute[187152]: 2025-11-29 06:55:43.978 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.059 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.122 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.124 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.187 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.260 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.359 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.361 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5581MB free_disk=73.17354965209961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.361 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.362 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.441 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Migration for instance a7c7d375-ef91-4869-987b-662d0c1de55c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.472 187156 INFO nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating resource usage from migration 6f1aff30-664d-4ffd-b820-c05fab910a30#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.473 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Starting to track incoming migration 6f1aff30-664d-4ffd-b820-c05fab910a30 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 01:55:44 np0005539504 systemd-logind[783]: New session 38 of user nova.
Nov 29 01:55:44 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:55:44 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:55:44 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:55:44 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.651 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 66b9235f-7cc8-40d4-877b-b690613298a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.699 187156 WARNING nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance a7c7d375-ef91-4869-987b-662d0c1de55c has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.699 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.699 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:55:44 np0005539504 systemd[219012]: Queued start job for default target Main User Target.
Nov 29 01:55:44 np0005539504 systemd[219012]: Created slice User Application Slice.
Nov 29 01:55:44 np0005539504 systemd[219012]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:55:44 np0005539504 systemd[219012]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:55:44 np0005539504 systemd[219012]: Reached target Paths.
Nov 29 01:55:44 np0005539504 systemd[219012]: Reached target Timers.
Nov 29 01:55:44 np0005539504 systemd[219012]: Starting D-Bus User Message Bus Socket...
Nov 29 01:55:44 np0005539504 systemd[219012]: Starting Create User's Volatile Files and Directories...
Nov 29 01:55:44 np0005539504 systemd[219012]: Finished Create User's Volatile Files and Directories.
Nov 29 01:55:44 np0005539504 systemd[219012]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:55:44 np0005539504 systemd[219012]: Reached target Sockets.
Nov 29 01:55:44 np0005539504 systemd[219012]: Reached target Basic System.
Nov 29 01:55:44 np0005539504 systemd[219012]: Reached target Main User Target.
Nov 29 01:55:44 np0005539504 systemd[219012]: Startup finished in 138ms.
Nov 29 01:55:44 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.775 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.790 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:55:44 np0005539504 systemd[1]: Started Session 38 of User nova.
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.816 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:55:44 np0005539504 nova_compute[187152]: 2025-11-29 06:55:44.817 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:44 np0005539504 systemd[1]: session-38.scope: Deactivated successfully.
Nov 29 01:55:44 np0005539504 systemd-logind[783]: Session 38 logged out. Waiting for processes to exit.
Nov 29 01:55:44 np0005539504 systemd-logind[783]: Removed session 38.
Nov 29 01:55:44 np0005539504 systemd-logind[783]: New session 40 of user nova.
Nov 29 01:55:45 np0005539504 systemd[1]: Started Session 40 of User nova.
Nov 29 01:55:45 np0005539504 systemd[1]: session-40.scope: Deactivated successfully.
Nov 29 01:55:45 np0005539504 systemd-logind[783]: Session 40 logged out. Waiting for processes to exit.
Nov 29 01:55:45 np0005539504 systemd-logind[783]: Removed session 40.
Nov 29 01:55:45 np0005539504 nova_compute[187152]: 2025-11-29 06:55:45.707 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:45 np0005539504 nova_compute[187152]: 2025-11-29 06:55:45.817 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:45 np0005539504 nova_compute[187152]: 2025-11-29 06:55:45.818 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:55:45 np0005539504 nova_compute[187152]: 2025-11-29 06:55:45.818 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:55:46 np0005539504 nova_compute[187152]: 2025-11-29 06:55:46.570 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:55:46 np0005539504 nova_compute[187152]: 2025-11-29 06:55:46.570 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:55:46 np0005539504 nova_compute[187152]: 2025-11-29 06:55:46.570 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:55:46 np0005539504 nova_compute[187152]: 2025-11-29 06:55:46.570 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:46 np0005539504 nova_compute[187152]: 2025-11-29 06:55:46.752 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:55:47 np0005539504 nova_compute[187152]: 2025-11-29 06:55:47.280 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:55:47 np0005539504 nova_compute[187152]: 2025-11-29 06:55:47.300 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:55:47 np0005539504 nova_compute[187152]: 2025-11-29 06:55:47.301 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:55:47 np0005539504 nova_compute[187152]: 2025-11-29 06:55:47.301 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:47 np0005539504 nova_compute[187152]: 2025-11-29 06:55:47.302 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:47 np0005539504 nova_compute[187152]: 2025-11-29 06:55:47.302 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:55:49 np0005539504 nova_compute[187152]: 2025-11-29 06:55:49.262 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:50 np0005539504 nova_compute[187152]: 2025-11-29 06:55:50.710 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:50 np0005539504 podman[219033]: 2025-11-29 06:55:50.742573706 +0000 UTC m=+0.077367922 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:55:54 np0005539504 nova_compute[187152]: 2025-11-29 06:55:54.264 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:54 np0005539504 podman[219054]: 2025-11-29 06:55:54.757405691 +0000 UTC m=+0.080224191 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Nov 29 01:55:54 np0005539504 podman[219053]: 2025-11-29 06:55:54.76586727 +0000 UTC m=+0.086786158 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:55:55 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:55:55 np0005539504 systemd[219012]: Activating special unit Exit the Session...
Nov 29 01:55:55 np0005539504 systemd[219012]: Stopped target Main User Target.
Nov 29 01:55:55 np0005539504 systemd[219012]: Stopped target Basic System.
Nov 29 01:55:55 np0005539504 systemd[219012]: Stopped target Paths.
Nov 29 01:55:55 np0005539504 systemd[219012]: Stopped target Sockets.
Nov 29 01:55:55 np0005539504 systemd[219012]: Stopped target Timers.
Nov 29 01:55:55 np0005539504 systemd[219012]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:55:55 np0005539504 systemd[219012]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:55:55 np0005539504 systemd[219012]: Closed D-Bus User Message Bus Socket.
Nov 29 01:55:55 np0005539504 systemd[219012]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:55:55 np0005539504 systemd[219012]: Removed slice User Application Slice.
Nov 29 01:55:55 np0005539504 systemd[219012]: Reached target Shutdown.
Nov 29 01:55:55 np0005539504 systemd[219012]: Finished Exit the Session.
Nov 29 01:55:55 np0005539504 systemd[219012]: Reached target Exit the Session.
Nov 29 01:55:55 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:55:55 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:55:55 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:55:55 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:55:55 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:55:55 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:55:55 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:55:55 np0005539504 nova_compute[187152]: 2025-11-29 06:55:55.734 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:58 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:55:58 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:55:58 np0005539504 systemd-logind[783]: New session 41 of user nova.
Nov 29 01:55:58 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:55:58 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:55:58 np0005539504 systemd[219103]: Queued start job for default target Main User Target.
Nov 29 01:55:58 np0005539504 systemd[219103]: Created slice User Application Slice.
Nov 29 01:55:58 np0005539504 systemd[219103]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:55:58 np0005539504 systemd[219103]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:55:58 np0005539504 systemd[219103]: Reached target Paths.
Nov 29 01:55:58 np0005539504 systemd[219103]: Reached target Timers.
Nov 29 01:55:58 np0005539504 systemd[219103]: Starting D-Bus User Message Bus Socket...
Nov 29 01:55:58 np0005539504 systemd[219103]: Starting Create User's Volatile Files and Directories...
Nov 29 01:55:58 np0005539504 systemd[219103]: Finished Create User's Volatile Files and Directories.
Nov 29 01:55:58 np0005539504 systemd[219103]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:55:58 np0005539504 systemd[219103]: Reached target Sockets.
Nov 29 01:55:58 np0005539504 systemd[219103]: Reached target Basic System.
Nov 29 01:55:58 np0005539504 systemd[219103]: Reached target Main User Target.
Nov 29 01:55:58 np0005539504 systemd[219103]: Startup finished in 143ms.
Nov 29 01:55:58 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:55:58 np0005539504 systemd[1]: Started Session 41 of User nova.
Nov 29 01:55:59 np0005539504 systemd[1]: session-41.scope: Deactivated successfully.
Nov 29 01:55:59 np0005539504 systemd-logind[783]: Session 41 logged out. Waiting for processes to exit.
Nov 29 01:55:59 np0005539504 systemd-logind[783]: Removed session 41.
Nov 29 01:55:59 np0005539504 nova_compute[187152]: 2025-11-29 06:55:59.267 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:59 np0005539504 systemd-logind[783]: New session 43 of user nova.
Nov 29 01:55:59 np0005539504 systemd[1]: Started Session 43 of User nova.
Nov 29 01:55:59 np0005539504 systemd-logind[783]: Session 43 logged out. Waiting for processes to exit.
Nov 29 01:55:59 np0005539504 systemd[1]: session-43.scope: Deactivated successfully.
Nov 29 01:55:59 np0005539504 systemd-logind[783]: Removed session 43.
Nov 29 01:55:59 np0005539504 systemd-logind[783]: New session 44 of user nova.
Nov 29 01:55:59 np0005539504 systemd[1]: Started Session 44 of User nova.
Nov 29 01:55:59 np0005539504 systemd[1]: session-44.scope: Deactivated successfully.
Nov 29 01:55:59 np0005539504 systemd-logind[783]: Session 44 logged out. Waiting for processes to exit.
Nov 29 01:55:59 np0005539504 systemd-logind[783]: Removed session 44.
Nov 29 01:56:00 np0005539504 nova_compute[187152]: 2025-11-29 06:56:00.296 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:00 np0005539504 nova_compute[187152]: 2025-11-29 06:56:00.296 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:00 np0005539504 nova_compute[187152]: 2025-11-29 06:56:00.296 187156 DEBUG nova.network.neutron [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:56:00 np0005539504 nova_compute[187152]: 2025-11-29 06:56:00.540 187156 DEBUG nova.network.neutron [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:56:00 np0005539504 nova_compute[187152]: 2025-11-29 06:56:00.735 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:00 np0005539504 nova_compute[187152]: 2025-11-29 06:56:00.858 187156 DEBUG nova.network.neutron [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:00 np0005539504 nova_compute[187152]: 2025-11-29 06:56:00.880 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.055 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.057 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.058 187156 INFO nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Creating image(s)#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.059 187156 DEBUG nova.objects.instance [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'trusted_certs' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.071 187156 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.129 187156 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.130 187156 DEBUG nova.virt.disk.api [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Checking if we can resize image /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.130 187156 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.183 187156 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.184 187156 DEBUG nova.virt.disk.api [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Cannot resize image /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.203 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.203 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Ensure instance console log exists: /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.204 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.204 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.204 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.206 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.211 187156 WARNING nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.220 187156 DEBUG nova.virt.libvirt.host [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.220 187156 DEBUG nova.virt.libvirt.host [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.230 187156 DEBUG nova.virt.libvirt.host [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.230 187156 DEBUG nova.virt.libvirt.host [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.232 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.232 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.233 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.233 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.233 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.234 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.234 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.234 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.234 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.235 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.235 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.235 187156 DEBUG nova.virt.hardware [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.235 187156 DEBUG nova.objects.instance [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.284 187156 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.346 187156 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.347 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.348 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.349 187156 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.352 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <uuid>a7c7d375-ef91-4869-987b-662d0c1de55c</uuid>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <name>instance-00000023</name>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <memory>196608</memory>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <nova:name>tempest-MigrationsAdminTest-server-989129995</nova:name>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:56:01</nova:creationTime>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.micro">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:        <nova:memory>192</nova:memory>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:        <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:        <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <entry name="serial">a7c7d375-ef91-4869-987b-662d0c1de55c</entry>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <entry name="uuid">a7c7d375-ef91-4869-987b-662d0c1de55c</entry>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/console.log" append="off"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:56:01 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:56:01 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:56:01 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:56:01 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.408 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.408 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.409 187156 INFO nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Using config drive#033[00m
Nov 29 01:56:01 np0005539504 systemd-machined[153423]: New machine qemu-19-instance-00000023.
Nov 29 01:56:01 np0005539504 systemd[1]: Started Virtual Machine qemu-19-instance-00000023.
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.789 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399361.78854, a7c7d375-ef91-4869-987b-662d0c1de55c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.791 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.793 187156 DEBUG nova.compute.manager [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.797 187156 INFO nova.virt.libvirt.driver [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance running successfully.#033[00m
Nov 29 01:56:01 np0005539504 virtqemud[186569]: argument unsupported: QEMU guest agent is not configured
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.801 187156 DEBUG nova.virt.libvirt.guest [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.802 187156 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.823 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.833 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.888 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.888 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399361.7898066, a7c7d375-ef91-4869-987b-662d0c1de55c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.889 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] VM Started (Lifecycle Event)#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.955 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:01 np0005539504 nova_compute[187152]: 2025-11-29 06:56:01.959 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:56:03 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:03Z|00111|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 01:56:03 np0005539504 podman[219166]: 2025-11-29 06:56:03.723756215 +0000 UTC m=+0.066324683 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:56:03 np0005539504 podman[219167]: 2025-11-29 06:56:03.787541487 +0000 UTC m=+0.129270092 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 01:56:04 np0005539504 nova_compute[187152]: 2025-11-29 06:56:04.269 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:05 np0005539504 nova_compute[187152]: 2025-11-29 06:56:05.738 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:08 np0005539504 podman[219212]: 2025-11-29 06:56:08.75864828 +0000 UTC m=+0.085946156 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:56:09 np0005539504 nova_compute[187152]: 2025-11-29 06:56:09.271 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:09 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:56:09 np0005539504 systemd[219103]: Activating special unit Exit the Session...
Nov 29 01:56:09 np0005539504 systemd[219103]: Stopped target Main User Target.
Nov 29 01:56:09 np0005539504 systemd[219103]: Stopped target Basic System.
Nov 29 01:56:09 np0005539504 systemd[219103]: Stopped target Paths.
Nov 29 01:56:09 np0005539504 systemd[219103]: Stopped target Sockets.
Nov 29 01:56:09 np0005539504 systemd[219103]: Stopped target Timers.
Nov 29 01:56:09 np0005539504 systemd[219103]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:56:09 np0005539504 systemd[219103]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:56:09 np0005539504 systemd[219103]: Closed D-Bus User Message Bus Socket.
Nov 29 01:56:09 np0005539504 systemd[219103]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:56:09 np0005539504 systemd[219103]: Removed slice User Application Slice.
Nov 29 01:56:09 np0005539504 systemd[219103]: Reached target Shutdown.
Nov 29 01:56:09 np0005539504 systemd[219103]: Finished Exit the Session.
Nov 29 01:56:09 np0005539504 systemd[219103]: Reached target Exit the Session.
Nov 29 01:56:09 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:56:09 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:56:09 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:56:09 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:56:09 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:56:09 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:56:09 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:56:10 np0005539504 nova_compute[187152]: 2025-11-29 06:56:10.742 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:13 np0005539504 podman[219249]: 2025-11-29 06:56:13.754631018 +0000 UTC m=+0.084125436 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 01:56:14 np0005539504 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 01:56:14 np0005539504 nova_compute[187152]: 2025-11-29 06:56:14.273 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:15 np0005539504 nova_compute[187152]: 2025-11-29 06:56:15.745 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:19 np0005539504 nova_compute[187152]: 2025-11-29 06:56:19.276 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:19 np0005539504 nova_compute[187152]: 2025-11-29 06:56:19.944 187156 DEBUG nova.compute.manager [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.103 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.104 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.172 187156 DEBUG nova.objects.instance [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'pci_requests' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.190 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.191 187156 INFO nova.compute.claims [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.192 187156 DEBUG nova.objects.instance [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'resources' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.208 187156 DEBUG nova.objects.instance [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.267 187156 INFO nova.compute.resource_tracker [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating resource usage from migration b7911716-661a-44a8-8b8d-33fa6a185908#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.268 187156 DEBUG nova.compute.resource_tracker [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Starting to track incoming migration b7911716-661a-44a8-8b8d-33fa6a185908 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.404 187156 DEBUG nova.compute.provider_tree [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.419 187156 DEBUG nova.scheduler.client.report [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.443 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.444 187156 INFO nova.compute.manager [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Migrating#033[00m
Nov 29 01:56:20 np0005539504 nova_compute[187152]: 2025-11-29 06:56:20.747 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:21 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:56:21 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:56:21 np0005539504 systemd-logind[783]: New session 45 of user nova.
Nov 29 01:56:21 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:56:21 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:56:21 np0005539504 podman[219271]: 2025-11-29 06:56:21.843887006 +0000 UTC m=+0.094725864 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 29 01:56:21 np0005539504 systemd[219282]: Queued start job for default target Main User Target.
Nov 29 01:56:21 np0005539504 systemd[219282]: Created slice User Application Slice.
Nov 29 01:56:21 np0005539504 systemd[219282]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:56:21 np0005539504 systemd[219282]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:56:21 np0005539504 systemd[219282]: Reached target Paths.
Nov 29 01:56:21 np0005539504 systemd[219282]: Reached target Timers.
Nov 29 01:56:21 np0005539504 systemd[219282]: Starting D-Bus User Message Bus Socket...
Nov 29 01:56:21 np0005539504 systemd[219282]: Starting Create User's Volatile Files and Directories...
Nov 29 01:56:21 np0005539504 systemd[219282]: Finished Create User's Volatile Files and Directories.
Nov 29 01:56:22 np0005539504 systemd[219282]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:56:22 np0005539504 systemd[219282]: Reached target Sockets.
Nov 29 01:56:22 np0005539504 systemd[219282]: Reached target Basic System.
Nov 29 01:56:22 np0005539504 systemd[219282]: Reached target Main User Target.
Nov 29 01:56:22 np0005539504 systemd[219282]: Startup finished in 168ms.
Nov 29 01:56:22 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:56:22 np0005539504 systemd[1]: Started Session 45 of User nova.
Nov 29 01:56:22 np0005539504 systemd[1]: session-45.scope: Deactivated successfully.
Nov 29 01:56:22 np0005539504 systemd-logind[783]: Session 45 logged out. Waiting for processes to exit.
Nov 29 01:56:22 np0005539504 systemd-logind[783]: Removed session 45.
Nov 29 01:56:22 np0005539504 systemd-logind[783]: New session 47 of user nova.
Nov 29 01:56:22 np0005539504 systemd[1]: Started Session 47 of User nova.
Nov 29 01:56:22 np0005539504 systemd[1]: session-47.scope: Deactivated successfully.
Nov 29 01:56:22 np0005539504 systemd-logind[783]: Session 47 logged out. Waiting for processes to exit.
Nov 29 01:56:22 np0005539504 systemd-logind[783]: Removed session 47.
Nov 29 01:56:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:22.912 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:22.912 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:22.913 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:24 np0005539504 nova_compute[187152]: 2025-11-29 06:56:24.278 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:25 np0005539504 podman[219316]: 2025-11-29 06:56:25.734017294 +0000 UTC m=+0.072130741 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:56:25 np0005539504 podman[219317]: 2025-11-29 06:56:25.746631926 +0000 UTC m=+0.082618395 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 01:56:25 np0005539504 nova_compute[187152]: 2025-11-29 06:56:25.753 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:29 np0005539504 nova_compute[187152]: 2025-11-29 06:56:29.280 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:30 np0005539504 nova_compute[187152]: 2025-11-29 06:56:30.756 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:32 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:56:32 np0005539504 systemd[219282]: Activating special unit Exit the Session...
Nov 29 01:56:32 np0005539504 systemd[219282]: Stopped target Main User Target.
Nov 29 01:56:32 np0005539504 systemd[219282]: Stopped target Basic System.
Nov 29 01:56:32 np0005539504 systemd[219282]: Stopped target Paths.
Nov 29 01:56:32 np0005539504 systemd[219282]: Stopped target Sockets.
Nov 29 01:56:32 np0005539504 systemd[219282]: Stopped target Timers.
Nov 29 01:56:32 np0005539504 systemd[219282]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:56:32 np0005539504 systemd[219282]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:56:32 np0005539504 systemd[219282]: Closed D-Bus User Message Bus Socket.
Nov 29 01:56:32 np0005539504 systemd[219282]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:56:32 np0005539504 systemd[219282]: Removed slice User Application Slice.
Nov 29 01:56:32 np0005539504 systemd[219282]: Reached target Shutdown.
Nov 29 01:56:32 np0005539504 systemd[219282]: Finished Exit the Session.
Nov 29 01:56:32 np0005539504 systemd[219282]: Reached target Exit the Session.
Nov 29 01:56:32 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:56:32 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:56:32 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:56:32 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:56:32 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:56:32 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:56:32 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:56:34 np0005539504 nova_compute[187152]: 2025-11-29 06:56:34.281 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:34 np0005539504 podman[219360]: 2025-11-29 06:56:34.739662523 +0000 UTC m=+0.070025933 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:56:34 np0005539504 podman[219361]: 2025-11-29 06:56:34.774430298 +0000 UTC m=+0.102579457 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:56:35 np0005539504 nova_compute[187152]: 2025-11-29 06:56:35.759 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:36 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:56:36 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:56:36 np0005539504 systemd-logind[783]: New session 48 of user nova.
Nov 29 01:56:36 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:56:36 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:56:36 np0005539504 systemd[219414]: Queued start job for default target Main User Target.
Nov 29 01:56:36 np0005539504 systemd[219414]: Created slice User Application Slice.
Nov 29 01:56:36 np0005539504 systemd[219414]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:56:36 np0005539504 systemd[219414]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:56:36 np0005539504 systemd[219414]: Reached target Paths.
Nov 29 01:56:36 np0005539504 systemd[219414]: Reached target Timers.
Nov 29 01:56:36 np0005539504 systemd[219414]: Starting D-Bus User Message Bus Socket...
Nov 29 01:56:36 np0005539504 systemd[219414]: Starting Create User's Volatile Files and Directories...
Nov 29 01:56:36 np0005539504 systemd[219414]: Finished Create User's Volatile Files and Directories.
Nov 29 01:56:36 np0005539504 systemd[219414]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:56:36 np0005539504 systemd[219414]: Reached target Sockets.
Nov 29 01:56:36 np0005539504 systemd[219414]: Reached target Basic System.
Nov 29 01:56:36 np0005539504 systemd[219414]: Reached target Main User Target.
Nov 29 01:56:36 np0005539504 systemd[219414]: Startup finished in 141ms.
Nov 29 01:56:36 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:56:36 np0005539504 systemd[1]: Started Session 48 of User nova.
Nov 29 01:56:36 np0005539504 systemd-logind[783]: Session 48 logged out. Waiting for processes to exit.
Nov 29 01:56:36 np0005539504 systemd[1]: session-48.scope: Deactivated successfully.
Nov 29 01:56:36 np0005539504 systemd-logind[783]: Removed session 48.
Nov 29 01:56:36 np0005539504 systemd-logind[783]: New session 50 of user nova.
Nov 29 01:56:36 np0005539504 systemd[1]: Started Session 50 of User nova.
Nov 29 01:56:37 np0005539504 systemd[1]: session-50.scope: Deactivated successfully.
Nov 29 01:56:37 np0005539504 systemd-logind[783]: Session 50 logged out. Waiting for processes to exit.
Nov 29 01:56:37 np0005539504 systemd-logind[783]: Removed session 50.
Nov 29 01:56:37 np0005539504 systemd-logind[783]: New session 51 of user nova.
Nov 29 01:56:37 np0005539504 systemd[1]: Started Session 51 of User nova.
Nov 29 01:56:37 np0005539504 systemd[1]: session-51.scope: Deactivated successfully.
Nov 29 01:56:37 np0005539504 systemd-logind[783]: Session 51 logged out. Waiting for processes to exit.
Nov 29 01:56:37 np0005539504 systemd-logind[783]: Removed session 51.
Nov 29 01:56:37 np0005539504 nova_compute[187152]: 2025-11-29 06:56:37.778 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:37 np0005539504 nova_compute[187152]: 2025-11-29 06:56:37.781 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:37 np0005539504 nova_compute[187152]: 2025-11-29 06:56:37.782 187156 DEBUG nova.network.neutron [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:56:37 np0005539504 nova_compute[187152]: 2025-11-29 06:56:37.946 187156 DEBUG nova.network.neutron [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.466 187156 DEBUG nova.network.neutron [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.480 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.620 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.623 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.624 187156 INFO nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Creating image(s)#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.625 187156 DEBUG nova.objects.instance [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.643 187156 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.725 187156 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.727 187156 DEBUG nova.virt.disk.api [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Checking if we can resize image /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.727 187156 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.819 187156 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.820 187156 DEBUG nova.virt.disk.api [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Cannot resize image /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.838 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.839 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Ensure instance console log exists: /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.840 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.840 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.841 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.844 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.852 187156 WARNING nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.861 187156 DEBUG nova.virt.libvirt.host [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.862 187156 DEBUG nova.virt.libvirt.host [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.866 187156 DEBUG nova.virt.libvirt.host [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.867 187156 DEBUG nova.virt.libvirt.host [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.869 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.869 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.870 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.870 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.871 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.871 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.871 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.871 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.872 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.872 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.872 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.873 187156 DEBUG nova.virt.hardware [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.873 187156 DEBUG nova.objects.instance [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.891 187156 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.988 187156 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.989 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.990 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.991 187156 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:38 np0005539504 nova_compute[187152]: 2025-11-29 06:56:38.995 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <uuid>6aebe65a-3191-4d58-acfd-8d663b9b0a8e</uuid>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <name>instance-00000024</name>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <memory>196608</memory>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <nova:name>tempest-MigrationsAdminTest-server-1402593290</nova:name>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:56:38</nova:creationTime>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.micro">
Nov 29 01:56:38 np0005539504 nova_compute[187152]:        <nova:memory>192</nova:memory>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:        <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:        <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <entry name="serial">6aebe65a-3191-4d58-acfd-8d663b9b0a8e</entry>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <entry name="uuid">6aebe65a-3191-4d58-acfd-8d663b9b0a8e</entry>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:56:38 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:56:38 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:56:39 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:56:39 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/console.log" append="off"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:56:39 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:56:39 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:56:39 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:56:39 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:56:39 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.074 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.075 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.075 187156 INFO nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Using config drive#033[00m
Nov 29 01:56:39 np0005539504 podman[219450]: 2025-11-29 06:56:39.10130135 +0000 UTC m=+0.066556469 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:56:39 np0005539504 systemd-machined[153423]: New machine qemu-20-instance-00000024.
Nov 29 01:56:39 np0005539504 systemd[1]: Started Virtual Machine qemu-20-instance-00000024.
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.283 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.727 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399399.7267988, 6aebe65a-3191-4d58-acfd-8d663b9b0a8e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.729 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.730 187156 DEBUG nova.compute.manager [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.736 187156 INFO nova.virt.libvirt.driver [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance running successfully.#033[00m
Nov 29 01:56:39 np0005539504 virtqemud[186569]: argument unsupported: QEMU guest agent is not configured
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.741 187156 DEBUG nova.virt.libvirt.guest [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.741 187156 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.765 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.775 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.836 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.837 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399399.7282214, 6aebe65a-3191-4d58-acfd-8d663b9b0a8e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.837 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] VM Started (Lifecycle Event)#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.858 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:39 np0005539504 nova_compute[187152]: 2025-11-29 06:56:39.863 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.368 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "6c18039e-ddd3-49b6-8323-00aca3672fd8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.368 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.386 187156 DEBUG nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.579 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.580 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.587 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.587 187156 INFO nova.compute.claims [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.769 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.771 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "fac6a6d5-8640-43cd-9270-01d80282ca11" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.772 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.797 187156 DEBUG nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.880 187156 DEBUG nova.compute.provider_tree [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.891 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.894 187156 DEBUG nova.scheduler.client.report [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.915 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.916 187156 DEBUG nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.919 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.925 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.926 187156 INFO nova.compute.claims [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:40 np0005539504 nova_compute[187152]: 2025-11-29 06:56:40.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.018 187156 DEBUG nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.018 187156 DEBUG nova.network.neutron [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.046 187156 INFO nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.066 187156 DEBUG nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.198 187156 DEBUG nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.199 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.200 187156 INFO nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Creating image(s)#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.201 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "/var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.201 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "/var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.202 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "/var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.222 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.242 187156 DEBUG nova.policy [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.284 187156 DEBUG nova.compute.provider_tree [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.292 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.292 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.293 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.304 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.325 187156 DEBUG nova.scheduler.client.report [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.348 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.349 187156 DEBUG nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.364 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.364 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.399 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.400 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.400 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.422 187156 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.423 187156 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.423 187156 DEBUG nova.network.neutron [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.427 187156 DEBUG nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.427 187156 DEBUG nova.network.neutron [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.448 187156 INFO nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.458 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.459 187156 DEBUG nova.virt.disk.api [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Checking if we can resize image /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.459 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.475 187156 DEBUG nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.514 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.515 187156 DEBUG nova.virt.disk.api [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Cannot resize image /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.515 187156 DEBUG nova.objects.instance [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 6c18039e-ddd3-49b6-8323-00aca3672fd8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.537 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.538 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Ensure instance console log exists: /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.538 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.538 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.539 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.607 187156 DEBUG nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.608 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.608 187156 INFO nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Creating image(s)
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.609 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "/var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.609 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "/var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.610 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "/var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.624 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.655 187156 DEBUG nova.network.neutron [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.705 187156 DEBUG nova.policy [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.712 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.713 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.714 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.726 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.781 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.783 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.827 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.828 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.829 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.884 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.886 187156 DEBUG nova.virt.disk.api [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Checking if we can resize image /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.886 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.944 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.945 187156 DEBUG nova.virt.disk.api [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Cannot resize image /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.946 187156 DEBUG nova.objects.instance [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'migration_context' on Instance uuid fac6a6d5-8640-43cd-9270-01d80282ca11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.959 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.960 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Ensure instance console log exists: /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.961 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.961 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.961 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.964 187156 DEBUG nova.network.neutron [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 01:56:41 np0005539504 nova_compute[187152]: 2025-11-29 06:56:41.985 187156 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.004 187156 DEBUG nova.virt.libvirt.driver [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Creating tmpfile /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/tmponlhr7kf to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Nov 29 01:56:42 np0005539504 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000024.scope: Deactivated successfully.
Nov 29 01:56:42 np0005539504 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000024.scope: Consumed 2.856s CPU time.
Nov 29 01:56:42 np0005539504 systemd-machined[153423]: Machine qemu-20-instance-00000024 terminated.
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.153 187156 DEBUG nova.network.neutron [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Successfully created port: 58d29b48-4b4d-4014-93d0-3ecea2472a3e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.257 187156 INFO nova.virt.libvirt.driver [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance destroyed successfully.
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.258 187156 DEBUG nova.objects.instance [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'resources' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.269 187156 INFO nova.virt.libvirt.driver [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Deleting instance files /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_del
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.277 187156 INFO nova.virt.libvirt.driver [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Deletion of /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_del complete
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.354 187156 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.355 187156 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.374 187156 DEBUG nova.objects.instance [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'migration_context' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.426 187156 DEBUG nova.network.neutron [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Successfully created port: 13869d54-0ea7-412c-b676-18a5cb75a059 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.506 187156 DEBUG nova.compute.provider_tree [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.523 187156 DEBUG nova.scheduler.client.report [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.589 187156 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 01:56:42 np0005539504 nova_compute[187152]: 2025-11-29 06:56:42.958 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.008 187156 DEBUG nova.network.neutron [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Successfully updated port: 58d29b48-4b4d-4014-93d0-3ecea2472a3e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.028 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "refresh_cache-6c18039e-ddd3-49b6-8323-00aca3672fd8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.028 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquired lock "refresh_cache-6c18039e-ddd3-49b6-8323-00aca3672fd8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.028 187156 DEBUG nova.network.neutron [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.142 187156 DEBUG nova.compute.manager [req-b3af1002-ba05-4929-a9f9-319b0ed741e5 req-76a0c430-0fd2-4f45-b155-b19a4b00e91c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received event network-changed-58d29b48-4b4d-4014-93d0-3ecea2472a3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.142 187156 DEBUG nova.compute.manager [req-b3af1002-ba05-4929-a9f9-319b0ed741e5 req-76a0c430-0fd2-4f45-b155-b19a4b00e91c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Refreshing instance network info cache due to event network-changed-58d29b48-4b4d-4014-93d0-3ecea2472a3e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.142 187156 DEBUG oslo_concurrency.lockutils [req-b3af1002-ba05-4929-a9f9-319b0ed741e5 req-76a0c430-0fd2-4f45-b155-b19a4b00e91c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-6c18039e-ddd3-49b6-8323-00aca3672fd8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.271 187156 DEBUG nova.network.neutron [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.571 187156 DEBUG nova.network.neutron [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Successfully updated port: 13869d54-0ea7-412c-b676-18a5cb75a059 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.589 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.589 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquired lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.590 187156 DEBUG nova.network.neutron [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.713 187156 DEBUG nova.compute.manager [req-b97bacea-e79a-4b27-865e-0739d9b034c9 req-4999260d-6631-4e1d-aa4c-2916e5281fe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received event network-changed-13869d54-0ea7-412c-b676-18a5cb75a059 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.713 187156 DEBUG nova.compute.manager [req-b97bacea-e79a-4b27-865e-0739d9b034c9 req-4999260d-6631-4e1d-aa4c-2916e5281fe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Refreshing instance network info cache due to event network-changed-13869d54-0ea7-412c-b676-18a5cb75a059. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.714 187156 DEBUG oslo_concurrency.lockutils [req-b97bacea-e79a-4b27-865e-0739d9b034c9 req-4999260d-6631-4e1d-aa4c-2916e5281fe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.810 187156 DEBUG nova.network.neutron [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 01:56:43 np0005539504 nova_compute[187152]: 2025-11-29 06:56:43.958 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.284 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.383 187156 DEBUG nova.network.neutron [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Updating instance_info_cache with network_info: [{"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.404 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Releasing lock "refresh_cache-6c18039e-ddd3-49b6-8323-00aca3672fd8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.405 187156 DEBUG nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Instance network_info: |[{"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.405 187156 DEBUG oslo_concurrency.lockutils [req-b3af1002-ba05-4929-a9f9-319b0ed741e5 req-76a0c430-0fd2-4f45-b155-b19a4b00e91c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-6c18039e-ddd3-49b6-8323-00aca3672fd8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.405 187156 DEBUG nova.network.neutron [req-b3af1002-ba05-4929-a9f9-319b0ed741e5 req-76a0c430-0fd2-4f45-b155-b19a4b00e91c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Refreshing network info cache for port 58d29b48-4b4d-4014-93d0-3ecea2472a3e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.409 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Start _get_guest_xml network_info=[{"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.412 187156 WARNING nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.421 187156 DEBUG nova.virt.libvirt.host [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.422 187156 DEBUG nova.virt.libvirt.host [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.431 187156 DEBUG nova.virt.libvirt.host [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.432 187156 DEBUG nova.virt.libvirt.host [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.433 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.433 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.433 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.434 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.434 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.434 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.434 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.434 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.435 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.435 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.435 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.435 187156 DEBUG nova.virt.hardware [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.440 187156 DEBUG nova.virt.libvirt.vif [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:56:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-2103374378',display_name='tempest-ServerActionsTestOtherA-server-2103374378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-2103374378',id=37,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFRevGxjsP9FW7SRDkRRpVnU4KRGjdG8wIqosLqm4cPhV5Ico8MnUjtpEPPC4qcZ56HVkRqnO8GdidXOybS7oqkeGA17vnM2tYQ+0MqhSrzE91pVD9ZLc4wlyl+Nziiouw==',key_name='tempest-keypair-671368078',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-nnkzgy00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest-ServerActionsTestOtherA-229564135-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:56:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='812d926ee4ed4159b2e88b7a69990423',uuid=6c18039e-ddd3-49b6-8323-00aca3672fd8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.440 187156 DEBUG nova.network.os_vif_util [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.441 187156 DEBUG nova.network.os_vif_util [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:5e:37,bridge_name='br-int',has_traffic_filtering=True,id=58d29b48-4b4d-4014-93d0-3ecea2472a3e,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58d29b48-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.442 187156 DEBUG nova.objects.instance [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6c18039e-ddd3-49b6-8323-00aca3672fd8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.470 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <uuid>6c18039e-ddd3-49b6-8323-00aca3672fd8</uuid>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <name>instance-00000025</name>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestOtherA-server-2103374378</nova:name>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:56:44</nova:creationTime>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:        <nova:user uuid="812d926ee4ed4159b2e88b7a69990423">tempest-ServerActionsTestOtherA-229564135-project-member</nova:user>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:        <nova:project uuid="4362be0b90a64d63b2294bbc495486d3">tempest-ServerActionsTestOtherA-229564135</nova:project>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:        <nova:port uuid="58d29b48-4b4d-4014-93d0-3ecea2472a3e">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <entry name="serial">6c18039e-ddd3-49b6-8323-00aca3672fd8</entry>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <entry name="uuid">6c18039e-ddd3-49b6-8323-00aca3672fd8</entry>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.config"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:33:5e:37"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <target dev="tap58d29b48-4b"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/console.log" append="off"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:56:44 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:56:44 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:56:44 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:56:44 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.471 187156 DEBUG nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Preparing to wait for external event network-vif-plugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.471 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.471 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.471 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.472 187156 DEBUG nova.virt.libvirt.vif [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:56:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-2103374378',display_name='tempest-ServerActionsTestOtherA-server-2103374378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-2103374378',id=37,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFRevGxjsP9FW7SRDkRRpVnU4KRGjdG8wIqosLqm4cPhV5Ico8MnUjtpEPPC4qcZ56HVkRqnO8GdidXOybS7oqkeGA17vnM2tYQ+0MqhSrzE91pVD9ZLc4wlyl+Nziiouw==',key_name='tempest-keypair-671368078',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-nnkzgy00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest-ServerActionsTestOtherA-229564135-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:56:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='812d926ee4ed4159b2e88b7a69990423',uuid=6c18039e-ddd3-49b6-8323-00aca3672fd8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.472 187156 DEBUG nova.network.os_vif_util [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.473 187156 DEBUG nova.network.os_vif_util [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:5e:37,bridge_name='br-int',has_traffic_filtering=True,id=58d29b48-4b4d-4014-93d0-3ecea2472a3e,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58d29b48-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.473 187156 DEBUG os_vif [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:5e:37,bridge_name='br-int',has_traffic_filtering=True,id=58d29b48-4b4d-4014-93d0-3ecea2472a3e,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58d29b48-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.474 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.474 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.475 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.479 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.480 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58d29b48-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.481 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap58d29b48-4b, col_values=(('external_ids', {'iface-id': '58d29b48-4b4d-4014-93d0-3ecea2472a3e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:5e:37', 'vm-uuid': '6c18039e-ddd3-49b6-8323-00aca3672fd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.484 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:44 np0005539504 NetworkManager[55210]: <info>  [1764399404.4858] manager: (tap58d29b48-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.487 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.492 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.493 187156 INFO os_vif [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:5e:37,bridge_name='br-int',has_traffic_filtering=True,id=58d29b48-4b4d-4014-93d0-3ecea2472a3e,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58d29b48-4b')#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.550 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.550 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.550 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No VIF found with MAC fa:16:3e:33:5e:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.551 187156 INFO nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Using config drive#033[00m
Nov 29 01:56:44 np0005539504 podman[219550]: 2025-11-29 06:56:44.606761208 +0000 UTC m=+0.071494083 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.973 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 01:56:44 np0005539504 nova_compute[187152]: 2025-11-29 06:56:44.973 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.080 187156 INFO nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Creating config drive at /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.config#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.089 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfrh4o_dt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.112 187156 DEBUG nova.network.neutron [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Updating instance_info_cache with network_info: [{"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.130 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.131 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.131 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.131 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.137 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Releasing lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.137 187156 DEBUG nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Instance network_info: |[{"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.138 187156 DEBUG oslo_concurrency.lockutils [req-b97bacea-e79a-4b27-865e-0739d9b034c9 req-4999260d-6631-4e1d-aa4c-2916e5281fe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.139 187156 DEBUG nova.network.neutron [req-b97bacea-e79a-4b27-865e-0739d9b034c9 req-4999260d-6631-4e1d-aa4c-2916e5281fe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Refreshing network info cache for port 13869d54-0ea7-412c-b676-18a5cb75a059 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.142 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Start _get_guest_xml network_info=[{"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.153 187156 WARNING nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.161 187156 DEBUG nova.virt.libvirt.host [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.162 187156 DEBUG nova.virt.libvirt.host [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.165 187156 DEBUG nova.virt.libvirt.host [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.165 187156 DEBUG nova.virt.libvirt.host [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.166 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.166 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.167 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.167 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.167 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.167 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.168 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.168 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.168 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.168 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.168 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.169 187156 DEBUG nova.virt.hardware [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.172 187156 DEBUG nova.virt.libvirt.vif [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-419063069',display_name='tempest-SecurityGroupsTestJSON-server-419063069',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-419063069',id=38,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32234968781646cf869d42134e62b91c',ramdisk_id='',reservation_id='r-70qqewkf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-1623643057',owner_user_name='tempest-SecurityGroupsTestJSON-1623643057-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:56:41Z,user_data=None,user_id='b509e6a04cd147779a714856e3cd95ab',uuid=fac6a6d5-8640-43cd-9270-01d80282ca11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.172 187156 DEBUG nova.network.os_vif_util [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converting VIF {"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.173 187156 DEBUG nova.network.os_vif_util [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:43:95,bridge_name='br-int',has_traffic_filtering=True,id=13869d54-0ea7-412c-b676-18a5cb75a059,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13869d54-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.174 187156 DEBUG nova.objects.instance [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'pci_devices' on Instance uuid fac6a6d5-8640-43cd-9270-01d80282ca11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.187 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <uuid>fac6a6d5-8640-43cd-9270-01d80282ca11</uuid>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <name>instance-00000026</name>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <nova:name>tempest-SecurityGroupsTestJSON-server-419063069</nova:name>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:56:45</nova:creationTime>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:        <nova:user uuid="b509e6a04cd147779a714856e3cd95ab">tempest-SecurityGroupsTestJSON-1623643057-project-member</nova:user>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:        <nova:project uuid="32234968781646cf869d42134e62b91c">tempest-SecurityGroupsTestJSON-1623643057</nova:project>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:        <nova:port uuid="13869d54-0ea7-412c-b676-18a5cb75a059">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <entry name="serial">fac6a6d5-8640-43cd-9270-01d80282ca11</entry>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <entry name="uuid">fac6a6d5-8640-43cd-9270-01d80282ca11</entry>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk.config"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:0f:43:95"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <target dev="tap13869d54-0e"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    </interface>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/console.log" append="off"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:56:45 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:56:45 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:56:45 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:56:45 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.189 187156 DEBUG nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Preparing to wait for external event network-vif-plugged-13869d54-0ea7-412c-b676-18a5cb75a059 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.189 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.189 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.189 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.190 187156 DEBUG nova.virt.libvirt.vif [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-419063069',display_name='tempest-SecurityGroupsTestJSON-server-419063069',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-419063069',id=38,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32234968781646cf869d42134e62b91c',ramdisk_id='',reservation_id='r-70qqewkf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-1623643057',owner_user_name='tempest-SecurityGroupsTestJSON-1623643057-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:56:41Z,user_data=None,user_id='b509e6a04cd147779a714856e3cd95ab',uuid=fac6a6d5-8640-43cd-9270-01d80282ca11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.190 187156 DEBUG nova.network.os_vif_util [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converting VIF {"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.191 187156 DEBUG nova.network.os_vif_util [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:43:95,bridge_name='br-int',has_traffic_filtering=True,id=13869d54-0ea7-412c-b676-18a5cb75a059,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13869d54-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.191 187156 DEBUG os_vif [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:43:95,bridge_name='br-int',has_traffic_filtering=True,id=13869d54-0ea7-412c-b676-18a5cb75a059,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13869d54-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.192 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.192 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.193 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.195 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.196 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13869d54-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.196 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap13869d54-0e, col_values=(('external_ids', {'iface-id': '13869d54-0ea7-412c-b676-18a5cb75a059', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:43:95', 'vm-uuid': 'fac6a6d5-8640-43cd-9270-01d80282ca11'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.198 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 NetworkManager[55210]: <info>  [1764399405.1994] manager: (tap13869d54-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.200 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.209 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.212 187156 INFO os_vif [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:43:95,bridge_name='br-int',has_traffic_filtering=True,id=13869d54-0ea7-412c-b676-18a5cb75a059,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13869d54-0e')#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.216 187156 DEBUG oslo_concurrency.processutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfrh4o_dt" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:45 np0005539504 NetworkManager[55210]: <info>  [1764399405.2796] manager: (tap58d29b48-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Nov 29 01:56:45 np0005539504 kernel: tap58d29b48-4b: entered promiscuous mode
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.284 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:45Z|00112|binding|INFO|Claiming lport 58d29b48-4b4d-4014-93d0-3ecea2472a3e for this chassis.
Nov 29 01:56:45 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:45Z|00113|binding|INFO|58d29b48-4b4d-4014-93d0-3ecea2472a3e: Claiming fa:16:3e:33:5e:37 10.100.0.6
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.291 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.291 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.292 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] No VIF found with MAC fa:16:3e:0f:43:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.292 187156 INFO nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Using config drive#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.299 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:5e:37 10.100.0.6'], port_security=['fa:16:3e:33:5e:37 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db691b6b-17b7-42a9-9fd2-162233da0513', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4362be0b90a64d63b2294bbc495486d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '09ce75c2-edf6-4d0e-b148-55edc758c529', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03ee1f45-6435-43da-9a98-5273904b0bb0, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=58d29b48-4b4d-4014-93d0-3ecea2472a3e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.300 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 58d29b48-4b4d-4014-93d0-3ecea2472a3e in datapath db691b6b-17b7-42a9-9fd2-162233da0513 bound to our chassis#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.302 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network db691b6b-17b7-42a9-9fd2-162233da0513#033[00m
Nov 29 01:56:45 np0005539504 systemd-udevd[219593]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.314 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[72440ce5-e6bb-48ce-8866-83de129ac332]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.315 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdb691b6b-11 in ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.318 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdb691b6b-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.319 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4263b3d9-e0d0-43fb-b496-5b1af781d88e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.319 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[45248948-18ce-4714-9717-beb1b1af9963]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 NetworkManager[55210]: <info>  [1764399405.3245] device (tap58d29b48-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:56:45 np0005539504 NetworkManager[55210]: <info>  [1764399405.3254] device (tap58d29b48-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:56:45 np0005539504 systemd-machined[153423]: New machine qemu-21-instance-00000025.
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.333 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[47c62927-3896-446c-81ba-e6048e2d476e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.336 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.348 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 systemd[1]: Started Virtual Machine qemu-21-instance-00000025.
Nov 29 01:56:45 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:45Z|00114|binding|INFO|Setting lport 58d29b48-4b4d-4014-93d0-3ecea2472a3e ovn-installed in OVS
Nov 29 01:56:45 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:45Z|00115|binding|INFO|Setting lport 58d29b48-4b4d-4014-93d0-3ecea2472a3e up in Southbound
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.355 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.357 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[29a7164b-ca55-4c2d-bcbe-8bd8addad880]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.396 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c3d17715-0e51-402d-9686-466b0820736c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 NetworkManager[55210]: <info>  [1764399405.4060] manager: (tapdb691b6b-10): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Nov 29 01:56:45 np0005539504 systemd-udevd[219597]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.405 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a76b537a-b716-432a-a3fe-5e6a25700521]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.443 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[a4890ba1-cefe-416c-b1bf-ce62c041c220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.447 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[545c093a-a4d7-4ab2-be88-9ee49769f7e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 NetworkManager[55210]: <info>  [1764399405.4741] device (tapdb691b6b-10): carrier: link connected
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.480 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[81154c26-6f51-475a-bc62-5fe576a67c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.503 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[37d1ac49-2e5e-4f89-8fa5-f9713fb63f77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb691b6b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:ad:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486835, 'reachable_time': 30212, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219632, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.523 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b239d694-9e9c-4ed0-95cc-610a20d2e4a7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe61:ad90'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486835, 'tstamp': 486835}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219640, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.548 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4dae08bd-5cfa-47d1-a064-e9617e603f7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb691b6b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:ad:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486835, 'reachable_time': 30212, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219641, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.584 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[63cca2df-e17c-42c0-9888-c8e036685a17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.593 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399405.5928404, 6c18039e-ddd3-49b6-8323-00aca3672fd8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.594 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] VM Started (Lifecycle Event)#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.618 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.623 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399405.5959964, 6c18039e-ddd3-49b6-8323-00aca3672fd8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.623 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.641 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.644 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.654 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0842094d-79f4-432a-acd2-2d28b693934d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.656 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb691b6b-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.657 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.657 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb691b6b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.659 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 NetworkManager[55210]: <info>  [1764399405.6601] manager: (tapdb691b6b-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Nov 29 01:56:45 np0005539504 kernel: tapdb691b6b-10: entered promiscuous mode
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.663 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.664 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.665 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdb691b6b-10, col_values=(('external_ids', {'iface-id': '4035feb9-29a5-4ae9-8490-a44f1379821c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.666 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:45Z|00116|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.679 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.684 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.685 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.686 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e5a2f201-e5cc-44d0-811b-72a95c0ce723]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.687 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-db691b6b-17b7-42a9-9fd2-162233da0513
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID db691b6b-17b7-42a9-9fd2-162233da0513
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.687 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'env', 'PROCESS_TAG=haproxy-db691b6b-17b7-42a9-9fd2-162233da0513', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/db691b6b-17b7-42a9-9fd2-162233da0513.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.781 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:45.781 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.932 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.948 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.948 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.948 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.954 187156 INFO nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Creating config drive at /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk.config#033[00m
Nov 29 01:56:45 np0005539504 nova_compute[187152]: 2025-11-29 06:56:45.959 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_mmc__b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.003 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.004 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.004 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.004 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.044 187156 DEBUG nova.compute.manager [req-ea97b098-17ca-4102-9500-a5836c93f635 req-19480a1d-b30b-4e72-b6d4-963f5beac571 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received event network-vif-plugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.045 187156 DEBUG oslo_concurrency.lockutils [req-ea97b098-17ca-4102-9500-a5836c93f635 req-19480a1d-b30b-4e72-b6d4-963f5beac571 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.045 187156 DEBUG oslo_concurrency.lockutils [req-ea97b098-17ca-4102-9500-a5836c93f635 req-19480a1d-b30b-4e72-b6d4-963f5beac571 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.045 187156 DEBUG oslo_concurrency.lockutils [req-ea97b098-17ca-4102-9500-a5836c93f635 req-19480a1d-b30b-4e72-b6d4-963f5beac571 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.045 187156 DEBUG nova.compute.manager [req-ea97b098-17ca-4102-9500-a5836c93f635 req-19480a1d-b30b-4e72-b6d4-963f5beac571 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Processing event network-vif-plugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.046 187156 DEBUG nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.050 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399406.050338, 6c18039e-ddd3-49b6-8323-00aca3672fd8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.051 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.053 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.056 187156 INFO nova.virt.libvirt.driver [-] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Instance spawned successfully.#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.057 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.087 187156 DEBUG oslo_concurrency.processutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_mmc__b" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.101 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.113 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.118 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.119 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.119 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.120 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.120 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.121 187156 DEBUG nova.virt.libvirt.driver [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:46 np0005539504 podman[219682]: 2025-11-29 06:56:46.124756605 +0000 UTC m=+0.060303659 container create fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 01:56:46 np0005539504 NetworkManager[55210]: <info>  [1764399406.1422] manager: (tap13869d54-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Nov 29 01:56:46 np0005539504 kernel: tap13869d54-0e: entered promiscuous mode
Nov 29 01:56:46 np0005539504 systemd-udevd[219620]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:56:46 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:46Z|00117|binding|INFO|Claiming lport 13869d54-0ea7-412c-b676-18a5cb75a059 for this chassis.
Nov 29 01:56:46 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:46Z|00118|binding|INFO|13869d54-0ea7-412c-b676-18a5cb75a059: Claiming fa:16:3e:0f:43:95 10.100.0.8
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.140 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:46 np0005539504 NetworkManager[55210]: <info>  [1764399406.1571] device (tap13869d54-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:56:46 np0005539504 NetworkManager[55210]: <info>  [1764399406.1580] device (tap13869d54-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.160 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:43:95 10.100.0.8'], port_security=['fa:16:3e:0f:43:95 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09d1caf9-4b04-433c-8535-2cd6d44437db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32234968781646cf869d42134e62b91c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1fb16d9c-331a-41c7-a6da-9b1479dbf50c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46d714e3-e817-4d0b-99e9-4cc2314001af, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=13869d54-0ea7-412c-b676-18a5cb75a059) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.177 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.179 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:56:46 np0005539504 podman[219682]: 2025-11-29 06:56:46.093183777 +0000 UTC m=+0.028730851 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.204 187156 INFO nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Took 5.01 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.205 187156 DEBUG nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.211 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:46 np0005539504 systemd[1]: Started libpod-conmon-fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb.scope.
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.221 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:46 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:46Z|00119|binding|INFO|Setting lport 13869d54-0ea7-412c-b676-18a5cb75a059 ovn-installed in OVS
Nov 29 01:56:46 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:46Z|00120|binding|INFO|Setting lport 13869d54-0ea7-412c-b676-18a5cb75a059 up in Southbound
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.223 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:46 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.257 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:46 np0005539504 systemd-machined[153423]: New machine qemu-22-instance-00000026.
Nov 29 01:56:46 np0005539504 systemd[1]: Started Virtual Machine qemu-22-instance-00000026.
Nov 29 01:56:46 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/832f21a6ed560dce8d91edbc256c26ac563e750278627b35104fd2ef2522adc1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:56:46 np0005539504 podman[219682]: 2025-11-29 06:56:46.277514455 +0000 UTC m=+0.213061529 container init fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 01:56:46 np0005539504 podman[219682]: 2025-11-29 06:56:46.288125162 +0000 UTC m=+0.223672236 container start fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.307 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:46 np0005539504 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[219711]: [NOTICE]   (219724) : New worker (219730) forked
Nov 29 01:56:46 np0005539504 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[219711]: [NOTICE]   (219724) : Loading success.
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.334 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.363 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.363 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 13869d54-0ea7-412c-b676-18a5cb75a059 in datapath 09d1caf9-4b04-433c-8535-2cd6d44437db unbound from our chassis#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.365 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09d1caf9-4b04-433c-8535-2cd6d44437db#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.375 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[76a71ce6-000f-4181-a24a-860ca1ebd3dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.376 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09d1caf9-41 in ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.379 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09d1caf9-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.379 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[aa6e41e2-b89e-49f6-8806-a55a360bf94f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.380 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0abae229-6ff4-4152-a502-58be2b8756eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.387 187156 DEBUG nova.network.neutron [req-b3af1002-ba05-4929-a9f9-319b0ed741e5 req-76a0c430-0fd2-4f45-b155-b19a4b00e91c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Updated VIF entry in instance network info cache for port 58d29b48-4b4d-4014-93d0-3ecea2472a3e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.388 187156 DEBUG nova.network.neutron [req-b3af1002-ba05-4929-a9f9-319b0ed741e5 req-76a0c430-0fd2-4f45-b155-b19a4b00e91c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Updating instance_info_cache with network_info: [{"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.397 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba91145-f72c-43bb-8acb-54df0c2535e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.412 187156 DEBUG oslo_concurrency.lockutils [req-b3af1002-ba05-4929-a9f9-319b0ed741e5 req-76a0c430-0fd2-4f45-b155-b19a4b00e91c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-6c18039e-ddd3-49b6-8323-00aca3672fd8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.416 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.417 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.425 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb3a2e0-4636-4c53-9af7-ee493f065166]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.460 187156 INFO nova.compute.manager [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Took 6.00 seconds to build instance.#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.464 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[05f43063-c545-4612-96f0-0eb474c68960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 NetworkManager[55210]: <info>  [1764399406.4733] manager: (tap09d1caf9-40): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.475 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[001abba5-7b1c-4c95-a224-0a3c41190afc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.487 187156 DEBUG oslo_concurrency.lockutils [None req-61746322-badc-4616-8379-93ad6cb9cd9d 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.495 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.501 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.519 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f2182229-a38a-4373-b3c3-ad2593cdd8be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.533 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[912ac292-5129-4286-96bb-3aef5a44a639]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 NetworkManager[55210]: <info>  [1764399406.5644] device (tap09d1caf9-40): carrier: link connected
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.572 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c962d0ae-7352-4f02-ab15-096ff2bfe322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.579 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.580 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.594 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9bbdd2f3-e5af-46b7-b62e-fb4ae5a0615c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09d1caf9-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:d9:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486944, 'reachable_time': 26070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219760, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.610 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b8f57f99-80b0-481b-ae9d-89c20ba38f10]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea3:d9b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486944, 'tstamp': 486944}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219762, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.629 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[01a1b8ae-2d63-4766-a4a1-215c2c654eb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09d1caf9-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:d9:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486944, 'reachable_time': 26070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219763, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.646 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.661 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.662 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c9d1de69-d9c6-4019-86fd-81f288418e6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.737 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[714209fe-7983-4357-9185-1ed3d2b3a42a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.739 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09d1caf9-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.739 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.739 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09d1caf9-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.740 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:46 np0005539504 NetworkManager[55210]: <info>  [1764399406.7426] manager: (tap09d1caf9-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Nov 29 01:56:46 np0005539504 kernel: tap09d1caf9-40: entered promiscuous mode
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.743 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.744 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09d1caf9-40, col_values=(('external_ids', {'iface-id': '4919a746-8c41-4e1b-93e2-17dfe2a5b063'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:46 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:46Z|00121|binding|INFO|Releasing lport 4919a746-8c41-4e1b-93e2-17dfe2a5b063 from this chassis (sb_readonly=0)
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.760 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09d1caf9-4b04-433c-8535-2cd6d44437db.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09d1caf9-4b04-433c-8535-2cd6d44437db.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.761 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[59e8454c-5e8a-44b4-8476-03d9d4c7a1b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.761 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-09d1caf9-4b04-433c-8535-2cd6d44437db
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/09d1caf9-4b04-433c-8535-2cd6d44437db.pid.haproxy
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 09d1caf9-4b04-433c-8535-2cd6d44437db
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:56:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:46.762 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'env', 'PROCESS_TAG=haproxy-09d1caf9-4b04-433c-8535-2cd6d44437db', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09d1caf9-4b04-433c-8535-2cd6d44437db.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.770 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.812 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.847 187156 DEBUG nova.compute.manager [req-85463d37-ea79-4e37-89a7-44fa742138d6 req-b7854e50-9c8c-4086-ac80-d43734cc8a04 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received event network-vif-plugged-13869d54-0ea7-412c-b676-18a5cb75a059 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.853 187156 DEBUG oslo_concurrency.lockutils [req-85463d37-ea79-4e37-89a7-44fa742138d6 req-b7854e50-9c8c-4086-ac80-d43734cc8a04 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.854 187156 DEBUG oslo_concurrency.lockutils [req-85463d37-ea79-4e37-89a7-44fa742138d6 req-b7854e50-9c8c-4086-ac80-d43734cc8a04 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.854 187156 DEBUG oslo_concurrency.lockutils [req-85463d37-ea79-4e37-89a7-44fa742138d6 req-b7854e50-9c8c-4086-ac80-d43734cc8a04 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.855 187156 DEBUG nova.compute.manager [req-85463d37-ea79-4e37-89a7-44fa742138d6 req-b7854e50-9c8c-4086-ac80-d43734cc8a04 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Processing event network-vif-plugged-13869d54-0ea7-412c-b676-18a5cb75a059 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.968 187156 DEBUG nova.network.neutron [req-b97bacea-e79a-4b27-865e-0739d9b034c9 req-4999260d-6631-4e1d-aa4c-2916e5281fe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Updated VIF entry in instance network info cache for port 13869d54-0ea7-412c-b676-18a5cb75a059. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.970 187156 DEBUG nova.network.neutron [req-b97bacea-e79a-4b27-865e-0739d9b034c9 req-4999260d-6631-4e1d-aa4c-2916e5281fe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Updating instance_info_cache with network_info: [{"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:46 np0005539504 nova_compute[187152]: 2025-11-29 06:56:46.993 187156 DEBUG oslo_concurrency.lockutils [req-b97bacea-e79a-4b27-865e-0739d9b034c9 req-4999260d-6631-4e1d-aa4c-2916e5281fe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.076 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.078 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5351MB free_disk=73.14259719848633GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.078 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.078 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.153 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 66b9235f-7cc8-40d4-877b-b690613298a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.153 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance a7c7d375-ef91-4869-987b-662d0c1de55c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.153 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 6c18039e-ddd3-49b6-8323-00aca3672fd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.154 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance fac6a6d5-8640-43cd-9270-01d80282ca11 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.154 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.154 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=1088MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:56:47 np0005539504 podman[219806]: 2025-11-29 06:56:47.216920764 +0000 UTC m=+0.053671899 container create 755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.256 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:56:47 np0005539504 systemd[1]: Started libpod-conmon-755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af.scope.
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.263 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399407.2558234, fac6a6d5-8640-43cd-9270-01d80282ca11 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.264 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] VM Started (Lifecycle Event)
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.266 187156 DEBUG nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.277 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.280 187156 INFO nova.virt.libvirt.driver [-] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Instance spawned successfully.
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.281 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 01:56:47 np0005539504 podman[219806]: 2025-11-29 06:56:47.188988945 +0000 UTC m=+0.025740160 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.289 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.294 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.301 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.305 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.305 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.306 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.306 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.306 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.307 187156 DEBUG nova.virt.libvirt.driver [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:56:47 np0005539504 systemd[1]: Started libcrun container.
Nov 29 01:56:47 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b52f9026e97650dbc64c1bfe6bd16aaf25e29416fe7f291f7f6018e6d018709a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.322 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.323 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.324 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.324 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 01:56:47 np0005539504 podman[219806]: 2025-11-29 06:56:47.333898421 +0000 UTC m=+0.170649596 container init 755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.341 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.341 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399407.256222, fac6a6d5-8640-43cd-9270-01d80282ca11 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.341 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] VM Paused (Lifecycle Event)
Nov 29 01:56:47 np0005539504 podman[219806]: 2025-11-29 06:56:47.342874446 +0000 UTC m=+0.179625591 container start 755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.344 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.373 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.377 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399407.269476, fac6a6d5-8640-43cd-9270-01d80282ca11 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.377 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] VM Resumed (Lifecycle Event)
Nov 29 01:56:47 np0005539504 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[219826]: [NOTICE]   (219830) : New worker (219832) forked
Nov 29 01:56:47 np0005539504 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[219826]: [NOTICE]   (219830) : Loading success.
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.406 187156 INFO nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Took 5.80 seconds to spawn the instance on the hypervisor.
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.406 187156 DEBUG nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.409 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.420 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.459 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.519 187156 INFO nova.compute.manager [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Took 6.66 seconds to build instance.
Nov 29 01:56:47 np0005539504 nova_compute[187152]: 2025-11-29 06:56:47.566 187156 DEBUG oslo_concurrency.lockutils [None req-573780d6-72fb-42ce-96fb-8aec0e67772e b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:56:47 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:56:47 np0005539504 systemd[219414]: Activating special unit Exit the Session...
Nov 29 01:56:47 np0005539504 systemd[219414]: Stopped target Main User Target.
Nov 29 01:56:47 np0005539504 systemd[219414]: Stopped target Basic System.
Nov 29 01:56:47 np0005539504 systemd[219414]: Stopped target Paths.
Nov 29 01:56:47 np0005539504 systemd[219414]: Stopped target Sockets.
Nov 29 01:56:47 np0005539504 systemd[219414]: Stopped target Timers.
Nov 29 01:56:47 np0005539504 systemd[219414]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:56:47 np0005539504 systemd[219414]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:56:47 np0005539504 systemd[219414]: Closed D-Bus User Message Bus Socket.
Nov 29 01:56:47 np0005539504 systemd[219414]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:56:47 np0005539504 systemd[219414]: Removed slice User Application Slice.
Nov 29 01:56:47 np0005539504 systemd[219414]: Reached target Shutdown.
Nov 29 01:56:47 np0005539504 systemd[219414]: Finished Exit the Session.
Nov 29 01:56:47 np0005539504 systemd[219414]: Reached target Exit the Session.
Nov 29 01:56:47 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:56:47 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:56:47 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:56:47 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:56:47 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:56:47 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:56:47 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:47.961 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000025', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4362be0b90a64d63b2294bbc495486d3', 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'hostId': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:47.986 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}42b7a977f68c13f5f34ed9a4a6321013432cd7e1ab36f19fb3a3541c74bf8d1b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.245 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 29 Nov 2025 06:56:48 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-309445ca-8ffa-45a2-9d2e-77e081ae9fa1 x-openstack-request-id: req-309445ca-8ffa-45a2-9d2e-77e081ae9fa1 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.246 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}, {"id": "e29df891-dca5-4a1c-9258-dc512a46956f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.246 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-309445ca-8ffa-45a2-9d2e-77e081ae9fa1 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.250 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}42b7a977f68c13f5f34ed9a4a6321013432cd7e1ab36f19fb3a3541c74bf8d1b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.331 187156 DEBUG nova.compute.manager [req-0a0827e2-b1be-41ea-9db1-40f23a699e5a req-57a49dd5-bb13-4963-ba28-d07280d93c9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received event network-vif-plugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.332 187156 DEBUG oslo_concurrency.lockutils [req-0a0827e2-b1be-41ea-9db1-40f23a699e5a req-57a49dd5-bb13-4963-ba28-d07280d93c9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.333 187156 DEBUG oslo_concurrency.lockutils [req-0a0827e2-b1be-41ea-9db1-40f23a699e5a req-57a49dd5-bb13-4963-ba28-d07280d93c9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.333 187156 DEBUG oslo_concurrency.lockutils [req-0a0827e2-b1be-41ea-9db1-40f23a699e5a req-57a49dd5-bb13-4963-ba28-d07280d93c9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.334 187156 DEBUG nova.compute.manager [req-0a0827e2-b1be-41ea-9db1-40f23a699e5a req-57a49dd5-bb13-4963-ba28-d07280d93c9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] No waiting events found dispatching network-vif-plugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.334 187156 WARNING nova.compute.manager [req-0a0827e2-b1be-41ea-9db1-40f23a699e5a req-57a49dd5-bb13-4963-ba28-d07280d93c9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received unexpected event network-vif-plugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e for instance with vm_state active and task_state None.
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.363 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.363 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.363 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.390 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Sat, 29 Nov 2025 06:56:48 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-2fef3f30-7435-4b43-ad50-cafa2be88063 x-openstack-request-id: req-2fef3f30-7435-4b43-ad50-cafa2be88063 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.391 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "e29df891-dca5-4a1c-9258-dc512a46956f", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.391 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f used request id req-2fef3f30-7435-4b43-ad50-cafa2be88063 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.394 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'name': 'tempest-MigrationsAdminTest-server-989129995', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000023', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '890f94a625b342fdb17128922403c925', 'user_id': '53ee944c04484336b9b14d84235a62b8', 'hostId': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.398 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'name': 'tempest-MigrationsAdminTest-server-2086906237', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000021', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '890f94a625b342fdb17128922403c925', 'user_id': '53ee944c04484336b9b14d84235a62b8', 'hostId': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.400 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000026', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '32234968781646cf869d42134e62b91c', 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'hostId': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.401 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.401 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.402 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-2103374378>, <NovaLikeServer: tempest-MigrationsAdminTest-server-989129995>, <NovaLikeServer: tempest-MigrationsAdminTest-server-2086906237>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-419063069>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-2103374378>, <NovaLikeServer: tempest-MigrationsAdminTest-server-989129995>, <NovaLikeServer: tempest-MigrationsAdminTest-server-2086906237>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-419063069>]
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.402 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.407 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6c18039e-ddd3-49b6-8323-00aca3672fd8 / tap58d29b48-4b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.407 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.414 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for fac6a6d5-8640-43cd-9270-01d80282ca11 / tap13869d54-0e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.415 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17abf3e9-e3e2-4e50-b966-7d5904b9d6dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.402832', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '934e4010-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': '75de2c29d6523d6435b023386b311b8d1252f7ebed14a251c97275eedd738118'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.402832', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '934f5540-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': '686d24a2a7b000608fe24f2c050cb72e39d69a448824587aaa78149608505b74'}]}, 'timestamp': '2025-11-29 06:56:48.415561', '_unique_id': 'ac8dbd1864114e2c98ed650ee40cbc28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.418 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.420 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.447 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.read.latency volume: 225877174 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.447 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.read.latency volume: 508913 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.476 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.read.latency volume: 250876100 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.477 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.read.latency volume: 23072856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.507 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.read.latency volume: 354559118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.517 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.read.latency volume: 23304617 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.543 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.read.latency volume: 144395464 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.543 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.read.latency volume: 650498 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbeddef6-db70-494b-8bdf-d8537f996e06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 225877174, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-vda', 'timestamp': '2025-11-29T06:56:48.420195', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '935446fe-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': '79e673c9c671a629390e272ee1d61ae85ee0db7f5467ebe6626ba0dadcbefebe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 508913, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': 
None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-sda', 'timestamp': '2025-11-29T06:56:48.420195', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9354590a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': '3ebafbcfa65da8a71786c70096ae4704e1c561b31def3a5c92b073e17288bc81'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 250876100, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-vda', 'timestamp': '2025-11-29T06:56:48.420195', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9358ca08-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': '0ea2183efd37da9fb0c0bfbe699427ea8c877337cc5a9cbf540b426fa1a9d12f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23072856, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-sda', 'timestamp': '2025-11-29T06:56:48.420195', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9358de76-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': 'cc766d1eb30087736bbd79884525c7161391badd7aee5b2dd9c7940326c2620d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 354559118, 'user_id': '53ee944c04484336b9b14d84235a62b8', 
'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-vda', 'timestamp': '2025-11-29T06:56:48.420195', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '935d873c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '218d9a71b27d64a921567ff8e4e3488725aa0f3adc42ac7a09be8835f5a16ec8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23304617, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-sda', 'timestamp': '2025-11-29T06:56:48.420195', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '935eed5c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': 'f919904b7c9ea43c567a45e8ad64895b46cf55f72ed8813165936e6c1b5cc080'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 144395464, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-vda', 'timestamp': '2025-11-29T06:56:48.420195', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9362ebf0-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': '6f3493ba6028016221ad33f4fc66b642ce9bafaedf715296b767c37fc6e81eeb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 650498, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 
'fac6a6d5-8640-43cd-9270-01d80282ca11-sda', 'timestamp': '2025-11-29T06:56:48.420195', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9362fcbc-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': '612f9761ef608ed81743f4e9c12d4487c35cfa3d4019e6d2b8027fe36419b091'}]}, 'timestamp': '2025-11-29 06:56:48.544266', '_unique_id': 'b6b5ea6f6de647b1b759d210c542da0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.546 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.558 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.559 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.567 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.568 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.576 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.577 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.586 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.587 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18641aa1-8375-457d-b46c-176e6e827500', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-vda', 'timestamp': '2025-11-29T06:56:48.547082', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9365442c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.481789761, 'message_signature': '09a2a9cd51ef7ab5dd8ad716d52b29875ca2f2fb8168f46c5f726b4439ed714f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 
'6c18039e-ddd3-49b6-8323-00aca3672fd8-sda', 'timestamp': '2025-11-29T06:56:48.547082', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93655034-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.481789761, 'message_signature': '1e5a0d2a4180188a93be8e3e5e545b7abce1d0101cb82dbdf79bd0352484fbe2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-vda', 'timestamp': '2025-11-29T06:56:48.547082', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9366ac22-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.494161997, 'message_signature': 'a9b3d3dcad93c2ae62127f22dd3dad6ea53e43dfec4624348ada19520d785d98'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-sda', 'timestamp': '2025-11-29T06:56:48.547082', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9366b866-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.494161997, 'message_signature': '436fe5eda36ad0c4ec40156970a0aafbae3aa1c03d434ad5bddc6527ff1c03e3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': 
'890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-vda', 'timestamp': '2025-11-29T06:56:48.547082', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9368057c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.503322785, 'message_signature': 'c8ffd180f12633ea604fc9d8beb9363610216609108297e99e4df4de71920257'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-sda', 'timestamp': '2025-11-29T06:56:48.547082', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'me
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: '936810b2-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.503322785, 'message_signature': 'ed41b84733ebf36908b9a65efd89c6250e2ef60725c84481cadd2e58cd809364'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-vda', 'timestamp': '2025-11-29T06:56:48.547082', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93699176-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.512151916, 'message_signature': '66dfc64d924d7215a9edc1cd9e96c172462ae08f7c52674a8fed46a712452ce8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-sda', 'timestamp': '2025-11-29T06:56:48.547082', 'resource_metadata': {'display_name': 
'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9369a012-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.512151916, 'message_signature': 'ec425c600903d4fa0b227fa8d4abbe40f5aab9c7e219e44d6268a334f34d5fba'}]}, 'timestamp': '2025-11-29 06:56:48.587732', '_unique_id': '633be5c0d3804660a0e1e697ca9d1001'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.590 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.590 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.590 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65461bc7-0588-4146-b6ab-af0424d8b706', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.590392', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '936a166e-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': '95a4e88e03ae5e04efbd6da6d03263d5bb412f6b7760816be49992deb0751415'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.590392', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '936a2280-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': '69b8b5626e4a5ef3a3f32eaed1b6477dc6b3c89694e4fb92b28ccf8c9e628d4b'}]}, 'timestamp': '2025-11-29 06:56:48.591069', '_unique_id': 'bcb19739e44e4002bc49b48068471afb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.591 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.592 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.592 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.593 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.593 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.read.requests volume: 1206 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.593 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.594 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.read.requests volume: 1206 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.594 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.594 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.594 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee6b6b48-477f-4996-b87c-acd86e369f37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-vda', 'timestamp': '2025-11-29T06:56:48.592838', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '936a73e8-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': '7d5e3bb196bceb62fbfaba484d2c1448bec0924a6312ea16303fe6c8da5ad0f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': 
None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-sda', 'timestamp': '2025-11-29T06:56:48.592838', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '936a8158-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': 'c835a18784a8a85b54baf6039a8e521639fa91a55ab7408c886f28fb2feb2f11'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1206, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-vda', 'timestamp': '2025-11-29T06:56:48.592838', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '936a8dba-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': '8ad80d9d864f0f380c7faa8045e627c8023ee0476d2d9e16e530d6380f9016e1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-sda', 'timestamp': '2025-11-29T06:56:48.592838', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '936a98e6-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': 'fbecaa9f5a2a4c21509b3d50be44c45ae692654fd721c3268c01912b3e9aa1a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1206, 'user_id': '53ee944c04484336b9b14d84235a62b8', 
'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-vda', 'timestamp': '2025-11-29T06:56:48.592838', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '936aa5c0-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '810db99f9fc28087cb527facfc10a22c4c2aac8b549427f8c6b12249bb408d9b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-sda', 'timestamp': '2025-11-29T06:56:48.592838', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6dd
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: _mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '936ab1aa-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '9efbfd8bf62dc2b4e945c03c82c93ce13dd6b8b110b2e5b8411975045ca0ab17'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-vda', 'timestamp': '2025-11-29T06:56:48.592838', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '936abc54-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': 'd4ff013ed0d42173e946f7f7b7c3792516a63b02d07eb76b2d35045a1ffc18c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 
'fac6a6d5-8640-43cd-9270-01d80282ca11-sda', 'timestamp': '2025-11-29T06:56:48.592838', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '936ac69a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': 'ac0ba6f4bc7d7ed245043d598559646a0a2273899aca6ca39ab9d9f821a032eb'}]}, 'timestamp': '2025-11-29 06:56:48.595310', '_unique_id': '477dee8305194c54b958fdb3a6ced269'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.597 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.597 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.597 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.597 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.allocation volume: 30085120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.598 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.598 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.allocation volume: 30085120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.598 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.598 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.599 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1fc864e-23b9-4cd7-94c6-22130ccad202', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-vda', 'timestamp': '2025-11-29T06:56:48.597122', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '936b1b68-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.481789761, 'message_signature': 'a14c61f44f35144e4eaee5a200f0e4b89f174f0e626c849762909ed7cb9a7c0e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 
'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-sda', 'timestamp': '2025-11-29T06:56:48.597122', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '936b2a68-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.481789761, 'message_signature': '067560b6cf6c0bd5bfa69f2f85f911add761b5b1a482ab6bdd67695c7fa4d6a9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30085120, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-vda', 'timestamp': '2025-11-29T06:56:48.597122', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '936b3512-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.494161997, 'message_signature': 'cd9422a622317c211520917eb8b096d5cc92ff8ec8caf0d11de490968bd4034e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-sda', 'timestamp': '2025-11-29T06:56:48.597122', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '936b3f76-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.494161997, 'message_signature': '3b46c906b4162c60506447a8406782eb16c96ba02823fc62bb2722c5193bf034'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30085120, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 
'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-vda', 'timestamp': '2025-11-29T06:56:48.597122', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '936b4d90-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.503322785, 'message_signature': 'b0fa7afdff445717e8addabb4f22ca48211178a16735e7d453a6ada08d81d84c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-sda', 'timestamp': '2025-11-29T06:56:48.597122', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: k_name': 'sda'}, 'message_id': '936b586c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.503322785, 'message_signature': '7a7a48330dd37a941cec3e3a3ec1d3f0be271667dc0a06811c7bee5a68afaa9f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-vda', 'timestamp': '2025-11-29T06:56:48.597122', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '936b6320-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.512151916, 'message_signature': 'fbe07d801c342d60912d4e7cf2f16780ccc67535d237aceed97d82d4ed6b5f46'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-sda', 'timestamp': '2025-11-29T06:56:48.597122', 
'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '936b6fd2-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.512151916, 'message_signature': '524fea35493ab8cf41dec86606546dbaba3b04855ac0660fc0174ce266d0b33d'}]}, 'timestamp': '2025-11-29 06:56:48.599597', '_unique_id': '74126e9ddaac42b9bfeb5131451d59d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.601 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.601 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-2103374378>, <NovaLikeServer: tempest-MigrationsAdminTest-server-989129995>, <NovaLikeServer: tempest-MigrationsAdminTest-server-2086906237>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-419063069>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-2103374378>, <NovaLikeServer: tempest-MigrationsAdminTest-server-989129995>, <NovaLikeServer: tempest-MigrationsAdminTest-server-2086906237>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-419063069>]
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.602 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.602 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aed1048-617f-4556-b191-c660ee8ec2cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.602020', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '936bdab2-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': '9ac330b7592ee7dafa05c3d0e088bcbc7c7f40652f18142cdf135828c62dda80'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.602020', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '936be994-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': '0431125e286cb8abfa0c8a7059f6f67f0d2c8c161e00efe06c0a8f5c630be9d2'}]}, 'timestamp': '2025-11-29 06:56:48.602729', '_unique_id': 'afd1054a641941e28525b7bc7917edba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.603 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.604 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.624 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.624 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 6c18039e-ddd3-49b6-8323-00aca3672fd8: ceilometer.compute.pollsters.NoVolumeException
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.639 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/memory.usage volume: 43.05078125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.655 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/memory.usage volume: 40.62890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.671 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.672 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance fac6a6d5-8640-43cd-9270-01d80282ca11: ceilometer.compute.pollsters.NoVolumeException
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '548885d2-263f-4206-b359-a0653aca2955', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.05078125, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'timestamp': '2025-11-29T06:56:48.604535', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '93719088-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.573853302, 'message_signature': 'ed99c39c32c1170247b72653e759cb413dbd7a5eedf19963af983b5e475afdc0'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.62890625, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 
'66b9235f-7cc8-40d4-877b-b690613298a4', 'timestamp': '2025-11-29T06:56:48.604535', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '937402be-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.589530467, 'message_signature': 'e622e82e30ea3faa1a89a73b6b9bf0ef80281c0e892fa0f353364198683b2855'}]}, 'timestamp': '2025-11-29 06:56:48.672612', '_unique_id': 'df3b6ec643324f62a5aa3e15914f478e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.674 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.675 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.676 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64f02fc2-3e40-487e-8c33-f7b5e86691c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.675657', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '93771a80-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': 'b153e15f23da2c9b8644b9cd1b46bd7a0a1b1b7b684227e969bce5fc4a0192f6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.675657', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '93772c32-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': 'e45f3834dd52896da98c9b98506872723b287d07aeb303c0a9f6b5cf379621da'}]}, 'timestamp': '2025-11-29 06:56:48.676549', '_unique_id': 'dfb0d45e778f42b692f8644240f1da41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.677 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.678 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.679 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.679 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.write.latency volume: 23542109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.679 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.680 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.write.latency volume: 41837395 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.680 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.681 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.681 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '570375ff-6346-4ce5-91c7-eb32cd1975eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-vda', 'timestamp': '2025-11-29T06:56:48.678745', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937790dc-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': '9f84c60b0cc43337e794af8125342e72dfb7f0fa40b41a0d821fc7b50fc72105'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 
'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-sda', 'timestamp': '2025-11-29T06:56:48.678745', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93779fa0-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': '697ffaabcdf41897493966450cc114b7d611cad7d85b40bc505b606dc2cabc47'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23542109, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-vda', 'timestamp': '2025-11-29T06:56:48.678745', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9377af40-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': '5482ddf0772ec665da520676dba846eefb3c5d0b7d1e74c3c2b933683e6455ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-sda', 'timestamp': '2025-11-29T06:56:48.678745', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9377bbb6-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': '1ee60e25c01e207eb553e037842382374f58e09313f778cec255fa94c72fbd19'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41837395, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': 
None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-vda', 'timestamp': '2025-11-29T06:56:48.678745', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9377ca20-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '119d030f6850ea32c83a71f7e6cb570e7f0f44edc6b5a0e8bf884ad3bb0de9d7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-sda', 'timestamp': '2025-11-29T06:56:48.678745', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': N
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: hemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9377e190-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '6814c787fb0ec28275459944a0ff1e007d1243f2bebe561e48e8ed9844d925e5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-vda', 'timestamp': '2025-11-29T06:56:48.678745', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9377ee7e-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': 'ed0b7311670c2b3957c4e21a045df621528d65f721ee6ce3f4a71f57437e9763'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-sda', 'timestamp': 
'2025-11-29T06:56:48.678745', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93780008-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': 'ed3b5db7316293605680c5a1b894dafea71488170946ee42ea2ec63ef407e71e'}]}, 'timestamp': '2025-11-29 06:56:48.681933', '_unique_id': '1b820d54b159434ba7a54bdee8908b59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.684 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.684 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.685 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.read.bytes volume: 32020480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.685 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.686 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.read.bytes volume: 32020480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.686 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.686 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.687 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc51c9e1-8c1b-4625-93e8-846f22cea561', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-vda', 'timestamp': '2025-11-29T06:56:48.684220', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93786fd4-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': '52110b47d199205117cb84a20f1f9896464edec64d3da239aa8be1447a5d3b91'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 
'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-sda', 'timestamp': '2025-11-29T06:56:48.684220', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93788316-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': 'dad562c4ab807d0e81e10e6d195f7799833e218d577b228b279acef046e6d7f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32020480, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-vda', 'timestamp': '2025-11-29T06:56:48.684220', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93789040-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': 'c497f5fff5d33cd3a1926b97c5522cab8a8e2a649c551cb339793177e03fdee3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-sda', 'timestamp': '2025-11-29T06:56:48.684220', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9378a45e-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': '4ade802e6c7d80c4611f1e73bbec80ff1fd61c1ef32ad0f78fff510f84682ee2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32020480, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 
'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-vda', 'timestamp': '2025-11-29T06:56:48.684220', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9378b0ac-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '90c20802688628810f80dfc8be7aa775f298595264d0050a27b41c1ed651071e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-sda', 'timestamp': '2025-11-29T06:56:48.684220', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: ral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9378bd72-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '9e5daa3ca2c24a0efc661dadb6e404a52f57f1eed6ea8ede0f18c283ec6f7258'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-vda', 'timestamp': '2025-11-29T06:56:48.684220', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9378c88a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': '0d3e9f9bae3ef7e4b80593b8ebd5c959877fd7f2578738bc8f12936c213637f4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-sda', 'timestamp': 
'2025-11-29T06:56:48.684220', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9378d6d6-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': '77ee855bc0d2a2aeca2c382988250a48c69e0b8094827db5eff79707a8364201'}]}, 'timestamp': '2025-11-29 06:56:48.687439', '_unique_id': '51fbb6236ffc464698218fa1d0ead9d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.689 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.689 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61edb836-92c0-4c8b-baea-48de2f012c02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.689487', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '9379343c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': '0b2010fa5cbf8acdcf71c760e62d4091387cc9a37a2e1be644c1bfbff9214eff'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.689487', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '9379408a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': '85b26c5a5eb88866e8d24c15b5ebff5647a63c752c6d3c502ccafb5010c5d1e4'}]}, 'timestamp': '2025-11-29 06:56:48.690203', '_unique_id': 'e097e5ef86bc43cab96f17bd09333f51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.690 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.692 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.692 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-2103374378>, <NovaLikeServer: tempest-MigrationsAdminTest-server-989129995>, <NovaLikeServer: tempest-MigrationsAdminTest-server-2086906237>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-419063069>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-2103374378>, <NovaLikeServer: tempest-MigrationsAdminTest-server-989129995>, <NovaLikeServer: tempest-MigrationsAdminTest-server-2086906237>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-419063069>]
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.692 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.692 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.693 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da21520d-1221-428c-9921-ce0e8ac3f4a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.692808', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '9379b5ba-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': '2d4b5f3af03980c89594ac0fc23a8b41cdcccbee9ea28ba247eab0cc4400bc90'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.692808', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '9379c5aa-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': '046c7c04902fd83575eb99917c60762b194639cf038807643c7d5d95547ae7fe'}]}, 'timestamp': '2025-11-29 06:56:48.693547', '_unique_id': '9f25734cb6994046a74c17de4733f6b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.694 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.695 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.695 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.696 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.write.bytes volume: 274432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.696 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.696 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.write.bytes volume: 335872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.696 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.697 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.697 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f585b98-efeb-4d24-9683-9fc614db8057', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-vda', 'timestamp': '2025-11-29T06:56:48.695440', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937a1c76-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': 'a205dfd63f8ce919a723ca31ea9d2cb75690a0c02f5aaf1826326f7c49557f6c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 
'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-sda', 'timestamp': '2025-11-29T06:56:48.695440', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937a27de-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': '5e4c5492f0568db600305de87ec71d5654a6dcf7dd1adb088e57c0276c9c741d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274432, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-vda', 'timestamp': '2025-11-29T06:56:48.695440', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937a3292-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': '861c70b3f7c7e1f5c9f9d53f2b94fc9d452ea154e8a9132c9841377d69ac41f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-sda', 'timestamp': '2025-11-29T06:56:48.695440', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937a4278-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': 'f81cad5a11d75d4e563d3707ee3f3f20564a6aa959a8e5725e986d39c1a3652c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 335872, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 
'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-vda', 'timestamp': '2025-11-29T06:56:48.695440', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937a4d4a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '79eef62e02d4a930f3397f8d4efc39461ebed8ac970091d41d1b0f76b93e974d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-sda', 'timestamp': '2025-11-29T06:56:48.695440', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': '
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: gb': 1, 'disk_name': 'sda'}, 'message_id': '937a57b8-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '8bde2f1c9c51d8a6a4e1d8186c79c027de542df3d56eb7c0c92ca325ddb09b1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-vda', 'timestamp': '2025-11-29T06:56:48.695440', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937a65a0-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': 'e7391018bf4c87fdf599bb502bbf88e4f456b29b90581c6ba5b6efe3a1234d45'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-sda', 'timestamp': '2025-11-29T06:56:48.695440', 
'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937a7090-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': '46932f3dbdcb6b552da990de9b7e2b0ebce103c1b7771d7c374a6be8b6bde1f8'}]}, 'timestamp': '2025-11-29 06:56:48.697900', '_unique_id': 'de1b9afc0e0f4b4c938be4a9e9681dcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.700 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.700 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.700 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af432066-5361-48dc-80eb-62f8da540a98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.700133', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '937ad4ea-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': 'bce022c387d6428e04cd640abd746a028c9071cd4e5c614a6653f3b3184362fb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.700133', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '937ae8e0-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': 'f859af59021087a0d2fadd45b4e4550341e3f44aedee7f73c66c1022a5bb5225'}]}, 'timestamp': '2025-11-29 06:56:48.701076', '_unique_id': '908578993c88460d9de51f559fc693ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.702 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.703 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.703 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.704 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.write.requests volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.704 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.704 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.write.requests volume: 41 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.705 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.705 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.705 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b085f66b-aab8-4023-9c0f-b932f9cb5670', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-vda', 'timestamp': '2025-11-29T06:56:48.703280', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937b50aa-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': '8af8c0e37faf5c6c61e8cfb8caf01640b22ffeb3c3d638631d0d0c0ead96f66c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': 
None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-sda', 'timestamp': '2025-11-29T06:56:48.703280', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937b60fe-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.354881963, 'message_signature': 'c1efbf254805f104e7aaf3d845dc72ad3de6b1b6594efea2bcdcaf7184e55bd4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 30, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-vda', 'timestamp': '2025-11-29T06:56:48.703280', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937b6ec8-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': 'a77e3d12b6ab0a36dbaffa7ac2b20ae351421d506ae0af1ad5fb9c6d8b59a2fc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-sda', 'timestamp': '2025-11-29T06:56:48.703280', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937b7f12-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.383055278, 'message_signature': '52c89d311d10a4751d085e118e6e730835fa306d577ed0a4b49fd3800b66ca08'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 41, 'user_id': '53ee944c04484336b9b14d84235a62b8', 
'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-vda', 'timestamp': '2025-11-29T06:56:48.703280', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937b8bce-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '61c5ab00a8548beb89b2779a90e2b05a1ca313ea46caf6c959b45d666172cb8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-sda', 'timestamp': '2025-11-29T06:56:48.703280', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: : 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937b9790-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.412621692, 'message_signature': '1931bd61cc4155ce28ecb44210fed513d39519a1eff158678972d02c1f229177'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-vda', 'timestamp': '2025-11-29T06:56:48.703280', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937ba6fe-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': '2123b59f74c38ed09d69d0ed1a09a7179830001cca6d75993940d9048b477288'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 
'fac6a6d5-8640-43cd-9270-01d80282ca11-sda', 'timestamp': '2025-11-29T06:56:48.703280', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937bb392-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.452281979, 'message_signature': '99ef9026b74c67ac948520df57645778d09bfc75f06c4fca2a81d6e0d320e51b'}]}, 'timestamp': '2025-11-29 06:56:48.706188', '_unique_id': 'dac56156e09d41d29235a1340689b242'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.708 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.708 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/cpu volume: 2420000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.709 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/cpu volume: 11900000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.709 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/cpu volume: 12180000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.709 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/cpu volume: 1270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c27e277-16e5-4a22-ac50-0c372ca464f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2420000000, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'timestamp': '2025-11-29T06:56:48.708813', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '937c2782-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.558649139, 'message_signature': '2b4752c82ff2506e9a48d1f2e6a5cb9c7ca036a61891f52eef54e3ef0d27f7aa'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11900000000, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 
'a7c7d375-ef91-4869-987b-662d0c1de55c', 'timestamp': '2025-11-29T06:56:48.708813', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '937c3420-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.573853302, 'message_signature': '3a63c22a74d7e1b8fa07215ec7d9f495cecfb26a9090997922c45e7266e1dade'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12180000000, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'timestamp': '2025-11-29T06:56:48.708813', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '937c4140-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.589530467, 'message_signature': '8ea8c0386420f1fe8b32bd22d83fdf10eb8cbfd105022b043acb7732fd8502d5'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1270000000, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'timestamp': '2025-11-29T06:56:48.708813', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '937c50ae-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.606382855, 'message_signature': '2a5bd60e7e96bc35aed61e76734e3c65b888dec97426f1bfd40ad435e81b1d59'}]}, 'timestamp': '2025-11-29 06:56:48.710218', '_unique_id': 'e27fdd9d82f14cbab32806b10176db3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.711 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.712 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.712 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.712 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.713 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.713 12 DEBUG ceilometer.compute.pollsters [-] a7c7d375-ef91-4869-987b-662d0c1de55c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.713 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.713 12 DEBUG ceilometer.compute.pollsters [-] 66b9235f-7cc8-40d4-877b-b690613298a4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.713 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.714 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28959fb5-238f-47bb-b92b-191b56c75e6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-vda', 'timestamp': '2025-11-29T06:56:48.712291', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937cb0bc-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.481789761, 'message_signature': '9f9a6cb26b8dc29246ffcff6849d07bd880957df88604f53b841608c819b9f75'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 
'resource_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8-sda', 'timestamp': '2025-11-29T06:56:48.712291', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'instance-00000025', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937cbdb4-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.481789761, 'message_signature': '44d604311ce10f897bf28aa52d4c7babfaff58c173eed854739844129dd0b179'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-vda', 'timestamp': '2025-11-29T06:56:48.712291', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937cc8ea-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.494161997, 'message_signature': 'e9a26100bb1a3abb49f0feecb8029e7e442ea6744708cceb2f4166abb1c6c266'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c-sda', 'timestamp': '2025-11-29T06:56:48.712291', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-989129995', 'name': 'instance-00000023', 'instance_id': 'a7c7d375-ef91-4869-987b-662d0c1de55c', 'instance_type': 'm1.micro', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937cd114-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.494161997, 'message_signature': '8fbb050f356b07484470d739f1e66c77387b20da71ba21b3b5ed9ca1f6470f56'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 
'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-vda', 'timestamp': '2025-11-29T06:56:48.712291', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937cda74-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.503322785, 'message_signature': 'caf6a4d49f936203c64469284c722b7660581b1c5592ddd95ea4b768a55607c6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '66b9235f-7cc8-40d4-877b-b690613298a4-sda', 'timestamp': '2025-11-29T06:56:48.712291', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-2086906237', 'name': 'instance-00000021', 'instance_id': '66b9235f-7cc8-40d4-877b-b690613298a4', 'instance_type': 'm1.nano', 'host': '5183370ecfd17e4e27441e12a4edc726ec022b91dff4acbd327f259f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_typ
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: me': 'sda'}, 'message_id': '937ce294-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.503322785, 'message_signature': 'bd77d76445f8d9b1862e514b1d1e8737fb733cca259ba1ecd398fb4ab78490be'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-vda', 'timestamp': '2025-11-29T06:56:48.712291', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '937ced2a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.512151916, 'message_signature': 'edf7238943a1ef29e32123ac3eedfde6f2251af1021f28645bfbb5a13dfdc5d1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11-sda', 'timestamp': '2025-11-29T06:56:48.712291', 
'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'instance-00000026', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '937cf57c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.512151916, 'message_signature': '5b5728d4d57a40a765a7ca2049cc9966defeb55b00404d01a21676bbb1a6574c'}]}, 'timestamp': '2025-11-29 06:56:48.714399', '_unique_id': '3d4c4031d4ac45b194c7e8d35dd9e5d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.715 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c32f181-fae4-4e65-bc73-03ce52833dbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.715844', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '937d399c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': '9822034dfb256bafeffc2da57d0a3acce3fb0d805bb993b6bfd9a50d4f625f75'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.715844', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '937d431a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': 'c1dbca1494eff367b597507e0d6fb72c68f46b286ef31778f804b5842f52b63c'}]}, 'timestamp': '2025-11-29 06:56:48.716367', '_unique_id': '7dbfa49bd3854c77bbf4324522a1e4f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.716 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.717 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.717 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.717 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-2103374378>, <NovaLikeServer: tempest-MigrationsAdminTest-server-989129995>, <NovaLikeServer: tempest-MigrationsAdminTest-server-2086906237>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-419063069>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-2103374378>, <NovaLikeServer: tempest-MigrationsAdminTest-server-989129995>, <NovaLikeServer: tempest-MigrationsAdminTest-server-2086906237>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-419063069>]
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.717 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0e62a1c-ce88-47ed-97eb-495220e0511f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.717837', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '937d8500-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': 'e1f978fdcb943af69ecfa9e3a060ac368c2f3560897f08bc0d90a8e764ff2d36'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.717837', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '937d900e-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': 'b8a0c82c44a5068f69198ece8e703fa36485d5196061b98b13b2c63a2f81b40c'}]}, 'timestamp': '2025-11-29 06:56:48.718338', '_unique_id': '6f3a0e474bd54e24b9bf0631a750f6a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.718 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.719 12 DEBUG ceilometer.compute.pollsters [-] 6c18039e-ddd3-49b6-8323-00aca3672fd8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.719 12 DEBUG ceilometer.compute.pollsters [-] fac6a6d5-8640-43cd-9270-01d80282ca11/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5cdd657-8e95-4375-b952-a21219459f61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_name': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_name': None, 'resource_id': 'instance-00000025-6c18039e-ddd3-49b6-8323-00aca3672fd8-tap58d29b48-4b', 'timestamp': '2025-11-29T06:56:48.719660', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-2103374378', 'name': 'tap58d29b48-4b', 'instance_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'instance_type': 'm1.nano', 'host': '5f18a7eedbf8b397e2f8d4722ab930b84d4b3cf30c8a2f23df32a8d0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:5e:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58d29b48-4b'}, 'message_id': '937dcccc-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.337508941, 'message_signature': 'f6cab132ba96115f8c719829a41c8e7620ada5715d686a43672b913005b87d31'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_name': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_name': None, 'resource_id': 'instance-00000026-fac6a6d5-8640-43cd-9270-01d80282ca11-tap13869d54-0e', 'timestamp': '2025-11-29T06:56:48.719660', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-419063069', 'name': 'tap13869d54-0e', 'instance_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'instance_type': 'm1.nano', 'host': '34b62d380805b7b38064cfa6eea651d2eca250b1418920ff150504ed', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0f:43:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap13869d54-0e'}, 'message_id': '937dd528-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4871.347737729, 'message_signature': '5bd297df0d3bdffe48d9849096b3613fec0fab8c6a427d1285b2cad7f05fee7d'}]}, 'timestamp': '2025-11-29 06:56:48.720154', '_unique_id': '61293a84780645b096021d1a56d7add1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:56:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:56:48.720 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:56:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 06:56:48.545 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 01:56:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 06:56:48.589 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 01:56:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 06:56:48.596 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 01:56:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 06:56:48.600 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 01:56:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 06:56:48.682 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 01:56:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 06:56:48.688 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 01:56:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 06:56:48.698 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.962 187156 DEBUG nova.compute.manager [req-5a2299a8-9f07-4250-a3db-e70ae5b46b2a req-bd804de3-bb89-4a12-88c0-b26a483e01dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received event network-vif-plugged-13869d54-0ea7-412c-b676-18a5cb75a059 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.962 187156 DEBUG oslo_concurrency.lockutils [req-5a2299a8-9f07-4250-a3db-e70ae5b46b2a req-bd804de3-bb89-4a12-88c0-b26a483e01dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.963 187156 DEBUG oslo_concurrency.lockutils [req-5a2299a8-9f07-4250-a3db-e70ae5b46b2a req-bd804de3-bb89-4a12-88c0-b26a483e01dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.963 187156 DEBUG oslo_concurrency.lockutils [req-5a2299a8-9f07-4250-a3db-e70ae5b46b2a req-bd804de3-bb89-4a12-88c0-b26a483e01dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.963 187156 DEBUG nova.compute.manager [req-5a2299a8-9f07-4250-a3db-e70ae5b46b2a req-bd804de3-bb89-4a12-88c0-b26a483e01dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] No waiting events found dispatching network-vif-plugged-13869d54-0ea7-412c-b676-18a5cb75a059 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:56:48 np0005539504 nova_compute[187152]: 2025-11-29 06:56:48.963 187156 WARNING nova.compute.manager [req-5a2299a8-9f07-4250-a3db-e70ae5b46b2a req-bd804de3-bb89-4a12-88c0-b26a483e01dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received unexpected event network-vif-plugged-13869d54-0ea7-412c-b676-18a5cb75a059 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:56:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 06:56:48.707 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 01:56:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 06:56:48.715 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 01:56:49 np0005539504 nova_compute[187152]: 2025-11-29 06:56:49.288 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:50 np0005539504 nova_compute[187152]: 2025-11-29 06:56:50.199 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:50 np0005539504 NetworkManager[55210]: <info>  [1764399410.6074] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Nov 29 01:56:50 np0005539504 NetworkManager[55210]: <info>  [1764399410.6089] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 29 01:56:50 np0005539504 nova_compute[187152]: 2025-11-29 06:56:50.608 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:50 np0005539504 nova_compute[187152]: 2025-11-29 06:56:50.682 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:50 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:50Z|00122|binding|INFO|Releasing lport 4919a746-8c41-4e1b-93e2-17dfe2a5b063 from this chassis (sb_readonly=0)
Nov 29 01:56:50 np0005539504 ovn_controller[95182]: 2025-11-29T06:56:50Z|00123|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:56:50 np0005539504 nova_compute[187152]: 2025-11-29 06:56:50.738 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:56:52.367 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:52 np0005539504 podman[219843]: 2025-11-29 06:56:52.722792933 +0000 UTC m=+0.063772233 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:56:52 np0005539504 nova_compute[187152]: 2025-11-29 06:56:52.823 187156 DEBUG nova.compute.manager [req-d2cf08eb-bbba-444e-a802-ca986b3cb26d req-edb1020f-3734-490f-aaca-07abf69c0b52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received event network-changed-13869d54-0ea7-412c-b676-18a5cb75a059 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:56:52 np0005539504 nova_compute[187152]: 2025-11-29 06:56:52.823 187156 DEBUG nova.compute.manager [req-d2cf08eb-bbba-444e-a802-ca986b3cb26d req-edb1020f-3734-490f-aaca-07abf69c0b52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Refreshing instance network info cache due to event network-changed-13869d54-0ea7-412c-b676-18a5cb75a059. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:56:52 np0005539504 nova_compute[187152]: 2025-11-29 06:56:52.824 187156 DEBUG oslo_concurrency.lockutils [req-d2cf08eb-bbba-444e-a802-ca986b3cb26d req-edb1020f-3734-490f-aaca-07abf69c0b52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:52 np0005539504 nova_compute[187152]: 2025-11-29 06:56:52.824 187156 DEBUG oslo_concurrency.lockutils [req-d2cf08eb-bbba-444e-a802-ca986b3cb26d req-edb1020f-3734-490f-aaca-07abf69c0b52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:52 np0005539504 nova_compute[187152]: 2025-11-29 06:56:52.824 187156 DEBUG nova.network.neutron [req-d2cf08eb-bbba-444e-a802-ca986b3cb26d req-edb1020f-3734-490f-aaca-07abf69c0b52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Refreshing network info cache for port 13869d54-0ea7-412c-b676-18a5cb75a059 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:56:53 np0005539504 nova_compute[187152]: 2025-11-29 06:56:53.152 187156 DEBUG nova.compute.manager [req-b0259e37-5863-43ae-9a7f-b90809f63faf req-9fe4e11c-0597-43e1-b68b-400a7826379b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received event network-changed-58d29b48-4b4d-4014-93d0-3ecea2472a3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:56:53 np0005539504 nova_compute[187152]: 2025-11-29 06:56:53.153 187156 DEBUG nova.compute.manager [req-b0259e37-5863-43ae-9a7f-b90809f63faf req-9fe4e11c-0597-43e1-b68b-400a7826379b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Refreshing instance network info cache due to event network-changed-58d29b48-4b4d-4014-93d0-3ecea2472a3e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:56:53 np0005539504 nova_compute[187152]: 2025-11-29 06:56:53.154 187156 DEBUG oslo_concurrency.lockutils [req-b0259e37-5863-43ae-9a7f-b90809f63faf req-9fe4e11c-0597-43e1-b68b-400a7826379b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-6c18039e-ddd3-49b6-8323-00aca3672fd8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:53 np0005539504 nova_compute[187152]: 2025-11-29 06:56:53.154 187156 DEBUG oslo_concurrency.lockutils [req-b0259e37-5863-43ae-9a7f-b90809f63faf req-9fe4e11c-0597-43e1-b68b-400a7826379b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-6c18039e-ddd3-49b6-8323-00aca3672fd8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:53 np0005539504 nova_compute[187152]: 2025-11-29 06:56:53.155 187156 DEBUG nova.network.neutron [req-b0259e37-5863-43ae-9a7f-b90809f63faf req-9fe4e11c-0597-43e1-b68b-400a7826379b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Refreshing network info cache for port 58d29b48-4b4d-4014-93d0-3ecea2472a3e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:56:54 np0005539504 nova_compute[187152]: 2025-11-29 06:56:54.288 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:54 np0005539504 nova_compute[187152]: 2025-11-29 06:56:54.662 187156 DEBUG nova.network.neutron [req-b0259e37-5863-43ae-9a7f-b90809f63faf req-9fe4e11c-0597-43e1-b68b-400a7826379b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Updated VIF entry in instance network info cache for port 58d29b48-4b4d-4014-93d0-3ecea2472a3e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:56:54 np0005539504 nova_compute[187152]: 2025-11-29 06:56:54.663 187156 DEBUG nova.network.neutron [req-b0259e37-5863-43ae-9a7f-b90809f63faf req-9fe4e11c-0597-43e1-b68b-400a7826379b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Updating instance_info_cache with network_info: [{"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:54 np0005539504 nova_compute[187152]: 2025-11-29 06:56:54.854 187156 DEBUG oslo_concurrency.lockutils [req-b0259e37-5863-43ae-9a7f-b90809f63faf req-9fe4e11c-0597-43e1-b68b-400a7826379b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-6c18039e-ddd3-49b6-8323-00aca3672fd8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:54 np0005539504 nova_compute[187152]: 2025-11-29 06:56:54.912 187156 DEBUG nova.compute.manager [req-f194d696-75af-45ad-9f3f-b037429a1dcd req-fab1b079-1494-4f7d-ada9-005c55070d9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received event network-changed-13869d54-0ea7-412c-b676-18a5cb75a059 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:56:54 np0005539504 nova_compute[187152]: 2025-11-29 06:56:54.913 187156 DEBUG nova.compute.manager [req-f194d696-75af-45ad-9f3f-b037429a1dcd req-fab1b079-1494-4f7d-ada9-005c55070d9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Refreshing instance network info cache due to event network-changed-13869d54-0ea7-412c-b676-18a5cb75a059. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:56:54 np0005539504 nova_compute[187152]: 2025-11-29 06:56:54.914 187156 DEBUG oslo_concurrency.lockutils [req-f194d696-75af-45ad-9f3f-b037429a1dcd req-fab1b079-1494-4f7d-ada9-005c55070d9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:55 np0005539504 nova_compute[187152]: 2025-11-29 06:56:55.202 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:55 np0005539504 nova_compute[187152]: 2025-11-29 06:56:55.295 187156 DEBUG nova.network.neutron [req-d2cf08eb-bbba-444e-a802-ca986b3cb26d req-edb1020f-3734-490f-aaca-07abf69c0b52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Updated VIF entry in instance network info cache for port 13869d54-0ea7-412c-b676-18a5cb75a059. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:56:55 np0005539504 nova_compute[187152]: 2025-11-29 06:56:55.297 187156 DEBUG nova.network.neutron [req-d2cf08eb-bbba-444e-a802-ca986b3cb26d req-edb1020f-3734-490f-aaca-07abf69c0b52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Updating instance_info_cache with network_info: [{"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:55 np0005539504 nova_compute[187152]: 2025-11-29 06:56:55.804 187156 DEBUG oslo_concurrency.lockutils [req-d2cf08eb-bbba-444e-a802-ca986b3cb26d req-edb1020f-3734-490f-aaca-07abf69c0b52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:55 np0005539504 nova_compute[187152]: 2025-11-29 06:56:55.806 187156 DEBUG oslo_concurrency.lockutils [req-f194d696-75af-45ad-9f3f-b037429a1dcd req-fab1b079-1494-4f7d-ada9-005c55070d9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:55 np0005539504 nova_compute[187152]: 2025-11-29 06:56:55.806 187156 DEBUG nova.network.neutron [req-f194d696-75af-45ad-9f3f-b037429a1dcd req-fab1b079-1494-4f7d-ada9-005c55070d9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Refreshing network info cache for port 13869d54-0ea7-412c-b676-18a5cb75a059 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:56:56 np0005539504 podman[219862]: 2025-11-29 06:56:56.731938354 +0000 UTC m=+0.062942341 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:56:56 np0005539504 podman[219863]: 2025-11-29 06:56:56.747115996 +0000 UTC m=+0.073092237 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 29 01:56:57 np0005539504 nova_compute[187152]: 2025-11-29 06:56:57.257 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399402.2554219, 6aebe65a-3191-4d58-acfd-8d663b9b0a8e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:57 np0005539504 nova_compute[187152]: 2025-11-29 06:56:57.258 187156 INFO nova.compute.manager [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:56:58 np0005539504 nova_compute[187152]: 2025-11-29 06:56:58.134 187156 DEBUG nova.compute.manager [None req-2c58a8ae-f6d6-423f-ac2e-22d15567875f - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:59 np0005539504 nova_compute[187152]: 2025-11-29 06:56:59.292 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:59 np0005539504 nova_compute[187152]: 2025-11-29 06:56:59.423 187156 DEBUG nova.network.neutron [req-f194d696-75af-45ad-9f3f-b037429a1dcd req-fab1b079-1494-4f7d-ada9-005c55070d9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Updated VIF entry in instance network info cache for port 13869d54-0ea7-412c-b676-18a5cb75a059. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:56:59 np0005539504 nova_compute[187152]: 2025-11-29 06:56:59.424 187156 DEBUG nova.network.neutron [req-f194d696-75af-45ad-9f3f-b037429a1dcd req-fab1b079-1494-4f7d-ada9-005c55070d9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Updating instance_info_cache with network_info: [{"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:59 np0005539504 nova_compute[187152]: 2025-11-29 06:56:59.512 187156 DEBUG oslo_concurrency.lockutils [req-f194d696-75af-45ad-9f3f-b037429a1dcd req-fab1b079-1494-4f7d-ada9-005c55070d9d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-fac6a6d5-8640-43cd-9270-01d80282ca11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:00 np0005539504 nova_compute[187152]: 2025-11-29 06:57:00.206 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:01Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:5e:37 10.100.0.6
Nov 29 01:57:01 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:01Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:5e:37 10.100.0.6
Nov 29 01:57:03 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:03Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0f:43:95 10.100.0.8
Nov 29 01:57:03 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:03Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0f:43:95 10.100.0.8
Nov 29 01:57:04 np0005539504 nova_compute[187152]: 2025-11-29 06:57:04.293 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539504 nova_compute[187152]: 2025-11-29 06:57:05.209 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539504 podman[219942]: 2025-11-29 06:57:05.745389267 +0000 UTC m=+0.068937973 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:57:05 np0005539504 podman[219943]: 2025-11-29 06:57:05.816278573 +0000 UTC m=+0.139820839 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 01:57:09 np0005539504 nova_compute[187152]: 2025-11-29 06:57:09.296 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:09 np0005539504 podman[219991]: 2025-11-29 06:57:09.731053328 +0000 UTC m=+0.072516501 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:57:10 np0005539504 nova_compute[187152]: 2025-11-29 06:57:10.221 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:13 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:13Z|00124|binding|INFO|Releasing lport 4919a746-8c41-4e1b-93e2-17dfe2a5b063 from this chassis (sb_readonly=0)
Nov 29 01:57:13 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:13Z|00125|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:57:13 np0005539504 nova_compute[187152]: 2025-11-29 06:57:13.731 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:14 np0005539504 nova_compute[187152]: 2025-11-29 06:57:14.298 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:14 np0005539504 podman[220012]: 2025-11-29 06:57:14.732557885 +0000 UTC m=+0.066833037 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 29 01:57:15 np0005539504 nova_compute[187152]: 2025-11-29 06:57:15.226 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:19 np0005539504 nova_compute[187152]: 2025-11-29 06:57:19.299 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:20 np0005539504 nova_compute[187152]: 2025-11-29 06:57:20.228 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:21 np0005539504 nova_compute[187152]: 2025-11-29 06:57:21.132 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:22.912 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:22.913 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:22.914 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:23 np0005539504 podman[220047]: 2025-11-29 06:57:23.736916278 +0000 UTC m=+0.071864476 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:57:24 np0005539504 nova_compute[187152]: 2025-11-29 06:57:24.302 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:25 np0005539504 nova_compute[187152]: 2025-11-29 06:57:25.231 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:27 np0005539504 nova_compute[187152]: 2025-11-29 06:57:27.140 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:27 np0005539504 podman[220068]: 2025-11-29 06:57:27.713546369 +0000 UTC m=+0.053492438 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:57:27 np0005539504 podman[220069]: 2025-11-29 06:57:27.763046089 +0000 UTC m=+0.093412909 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350)
Nov 29 01:57:29 np0005539504 nova_compute[187152]: 2025-11-29 06:57:29.303 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:30 np0005539504 nova_compute[187152]: 2025-11-29 06:57:30.233 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:34 np0005539504 nova_compute[187152]: 2025-11-29 06:57:34.306 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.236 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.830 187156 DEBUG oslo_concurrency.lockutils [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "fac6a6d5-8640-43cd-9270-01d80282ca11" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.831 187156 DEBUG oslo_concurrency.lockutils [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.831 187156 DEBUG oslo_concurrency.lockutils [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.832 187156 DEBUG oslo_concurrency.lockutils [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.832 187156 DEBUG oslo_concurrency.lockutils [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.846 187156 INFO nova.compute.manager [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Terminating instance#033[00m
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.858 187156 DEBUG nova.compute.manager [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:57:35 np0005539504 kernel: tap13869d54-0e (unregistering): left promiscuous mode
Nov 29 01:57:35 np0005539504 NetworkManager[55210]: <info>  [1764399455.8925] device (tap13869d54-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.905 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:35 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:35Z|00126|binding|INFO|Releasing lport 13869d54-0ea7-412c-b676-18a5cb75a059 from this chassis (sb_readonly=0)
Nov 29 01:57:35 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:35Z|00127|binding|INFO|Setting lport 13869d54-0ea7-412c-b676-18a5cb75a059 down in Southbound
Nov 29 01:57:35 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:35Z|00128|binding|INFO|Removing iface tap13869d54-0e ovn-installed in OVS
Nov 29 01:57:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:35.917 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:43:95 10.100.0.8'], port_security=['fa:16:3e:0f:43:95 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'fac6a6d5-8640-43cd-9270-01d80282ca11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09d1caf9-4b04-433c-8535-2cd6d44437db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32234968781646cf869d42134e62b91c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1fb16d9c-331a-41c7-a6da-9b1479dbf50c 79674fae-6d66-4004-b6b7-1c2cffc0c7be e04c2377-e75c-4df2-b711-128f063a06c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46d714e3-e817-4d0b-99e9-4cc2314001af, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=13869d54-0ea7-412c-b676-18a5cb75a059) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:57:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:35.920 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 13869d54-0ea7-412c-b676-18a5cb75a059 in datapath 09d1caf9-4b04-433c-8535-2cd6d44437db unbound from our chassis
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.920 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:35.923 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09d1caf9-4b04-433c-8535-2cd6d44437db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 01:57:35 np0005539504 nova_compute[187152]: 2025-11-29 06:57:35.926 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:35.925 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec68b72-1c82-46f1-a884-873dab93bee4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:35.927 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db namespace which is not needed anymore
Nov 29 01:57:35 np0005539504 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000026.scope: Deactivated successfully.
Nov 29 01:57:35 np0005539504 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000026.scope: Consumed 16.164s CPU time.
Nov 29 01:57:35 np0005539504 systemd-machined[153423]: Machine qemu-22-instance-00000026 terminated.
Nov 29 01:57:35 np0005539504 podman[220112]: 2025-11-29 06:57:35.972420556 +0000 UTC m=+0.060746037 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:57:36 np0005539504 podman[220115]: 2025-11-29 06:57:36.011198765 +0000 UTC m=+0.094996323 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:57:36 np0005539504 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[219826]: [NOTICE]   (219830) : haproxy version is 2.8.14-c23fe91
Nov 29 01:57:36 np0005539504 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[219826]: [NOTICE]   (219830) : path to executable is /usr/sbin/haproxy
Nov 29 01:57:36 np0005539504 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[219826]: [WARNING]  (219830) : Exiting Master process...
Nov 29 01:57:36 np0005539504 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[219826]: [WARNING]  (219830) : Exiting Master process...
Nov 29 01:57:36 np0005539504 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[219826]: [ALERT]    (219830) : Current worker (219832) exited with code 143 (Terminated)
Nov 29 01:57:36 np0005539504 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[219826]: [WARNING]  (219830) : All workers exited. Exiting... (0)
Nov 29 01:57:36 np0005539504 systemd[1]: libpod-755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af.scope: Deactivated successfully.
Nov 29 01:57:36 np0005539504 conmon[219826]: conmon 755d76af2dee4988c53f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af.scope/container/memory.events
Nov 29 01:57:36 np0005539504 podman[220187]: 2025-11-29 06:57:36.063224353 +0000 UTC m=+0.041800052 container died 755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:57:36 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af-userdata-shm.mount: Deactivated successfully.
Nov 29 01:57:36 np0005539504 systemd[1]: var-lib-containers-storage-overlay-b52f9026e97650dbc64c1bfe6bd16aaf25e29416fe7f291f7f6018e6d018709a-merged.mount: Deactivated successfully.
Nov 29 01:57:36 np0005539504 podman[220187]: 2025-11-29 06:57:36.107645426 +0000 UTC m=+0.086221125 container cleanup 755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:57:36 np0005539504 systemd[1]: libpod-conmon-755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af.scope: Deactivated successfully.
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.126 187156 INFO nova.virt.libvirt.driver [-] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Instance destroyed successfully.
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.127 187156 DEBUG nova.objects.instance [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'resources' on Instance uuid fac6a6d5-8640-43cd-9270-01d80282ca11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.143 187156 DEBUG nova.virt.libvirt.vif [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-419063069',display_name='tempest-SecurityGroupsTestJSON-server-419063069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-419063069',id=38,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:56:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32234968781646cf869d42134e62b91c',ramdisk_id='',reservation_id='r-70qqewkf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-1623643057',owner_user_name='tempest-SecurityGroupsTestJSON-1623643057-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:56:47Z,user_data=None,user_id='b509e6a04cd147779a714856e3cd95ab',uuid=fac6a6d5-8640-43cd-9270-01d80282ca11,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.143 187156 DEBUG nova.network.os_vif_util [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converting VIF {"id": "13869d54-0ea7-412c-b676-18a5cb75a059", "address": "fa:16:3e:0f:43:95", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13869d54-0e", "ovs_interfaceid": "13869d54-0ea7-412c-b676-18a5cb75a059", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.144 187156 DEBUG nova.network.os_vif_util [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0f:43:95,bridge_name='br-int',has_traffic_filtering=True,id=13869d54-0ea7-412c-b676-18a5cb75a059,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13869d54-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.144 187156 DEBUG os_vif [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:43:95,bridge_name='br-int',has_traffic_filtering=True,id=13869d54-0ea7-412c-b676-18a5cb75a059,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13869d54-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.147 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.148 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13869d54-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.149 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.150 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.154 187156 INFO os_vif [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:43:95,bridge_name='br-int',has_traffic_filtering=True,id=13869d54-0ea7-412c-b676-18a5cb75a059,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13869d54-0e')
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.154 187156 INFO nova.virt.libvirt.driver [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Deleting instance files /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11_del
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.155 187156 INFO nova.virt.libvirt.driver [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Deletion of /var/lib/nova/instances/fac6a6d5-8640-43cd-9270-01d80282ca11_del complete
Nov 29 01:57:36 np0005539504 podman[220234]: 2025-11-29 06:57:36.174646739 +0000 UTC m=+0.045356458 container remove 755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 01:57:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:36.179 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[61e232fe-0361-4ad2-8916-1e3524440cfc]: (4, ('Sat Nov 29 06:57:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db (755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af)\n755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af\nSat Nov 29 06:57:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db (755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af)\n755d76af2dee4988c53fd528a79b0b4c323a1fa157d26649cf703d93500581af\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:36.181 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[75664196-9663-4409-b242-184b0547b637]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:36.182 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09d1caf9-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.184 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:36 np0005539504 kernel: tap09d1caf9-40: left promiscuous mode
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.195 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:36.198 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8250e43b-afde-4949-8f0d-c82a866214e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:36.210 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fb275e17-c799-41f3-a48b-c0e8eb492681]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:36.212 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f02511da-e7f8-42e2-a77c-e0dbb90d3756]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:36.230 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[71d18ff4-e471-49c4-94d3-d6de6c11cc62]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486933, 'reachable_time': 29283, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220248, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:36.233 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 01:57:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:36.233 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[5d90cb18-d2a0-4ad5-b36d-ef074ca9a81e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:36 np0005539504 systemd[1]: run-netns-ovnmeta\x2d09d1caf9\x2d4b04\x2d433c\x2d8535\x2d2cd6d44437db.mount: Deactivated successfully.
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.242 187156 INFO nova.compute.manager [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.243 187156 DEBUG oslo.service.loopingcall [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.243 187156 DEBUG nova.compute.manager [-] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.243 187156 DEBUG nova.network.neutron [-] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.416 187156 DEBUG nova.compute.manager [req-d8202337-4008-4702-b344-f39bfa264a72 req-c14dcb85-42be-47d4-bd60-0080753b5ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received event network-vif-unplugged-13869d54-0ea7-412c-b676-18a5cb75a059 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.417 187156 DEBUG oslo_concurrency.lockutils [req-d8202337-4008-4702-b344-f39bfa264a72 req-c14dcb85-42be-47d4-bd60-0080753b5ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.417 187156 DEBUG oslo_concurrency.lockutils [req-d8202337-4008-4702-b344-f39bfa264a72 req-c14dcb85-42be-47d4-bd60-0080753b5ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.417 187156 DEBUG oslo_concurrency.lockutils [req-d8202337-4008-4702-b344-f39bfa264a72 req-c14dcb85-42be-47d4-bd60-0080753b5ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.418 187156 DEBUG nova.compute.manager [req-d8202337-4008-4702-b344-f39bfa264a72 req-c14dcb85-42be-47d4-bd60-0080753b5ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] No waiting events found dispatching network-vif-unplugged-13869d54-0ea7-412c-b676-18a5cb75a059 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 01:57:36 np0005539504 nova_compute[187152]: 2025-11-29 06:57:36.418 187156 DEBUG nova.compute.manager [req-d8202337-4008-4702-b344-f39bfa264a72 req-c14dcb85-42be-47d4-bd60-0080753b5ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received event network-vif-unplugged-13869d54-0ea7-412c-b676-18a5cb75a059 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 01:57:37 np0005539504 nova_compute[187152]: 2025-11-29 06:57:37.445 187156 DEBUG nova.network.neutron [-] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:37 np0005539504 nova_compute[187152]: 2025-11-29 06:57:37.467 187156 INFO nova.compute.manager [-] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Took 1.22 seconds to deallocate network for instance.#033[00m
Nov 29 01:57:37 np0005539504 nova_compute[187152]: 2025-11-29 06:57:37.537 187156 DEBUG oslo_concurrency.lockutils [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:37 np0005539504 nova_compute[187152]: 2025-11-29 06:57:37.537 187156 DEBUG oslo_concurrency.lockutils [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:37 np0005539504 nova_compute[187152]: 2025-11-29 06:57:37.630 187156 DEBUG nova.compute.provider_tree [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:37 np0005539504 nova_compute[187152]: 2025-11-29 06:57:37.642 187156 DEBUG nova.scheduler.client.report [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:37 np0005539504 nova_compute[187152]: 2025-11-29 06:57:37.660 187156 DEBUG oslo_concurrency.lockutils [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:37 np0005539504 nova_compute[187152]: 2025-11-29 06:57:37.690 187156 INFO nova.scheduler.client.report [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Deleted allocations for instance fac6a6d5-8640-43cd-9270-01d80282ca11#033[00m
Nov 29 01:57:37 np0005539504 nova_compute[187152]: 2025-11-29 06:57:37.759 187156 DEBUG oslo_concurrency.lockutils [None req-69a8fb4a-0ebd-4212-ad46-3dd5a680043a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:38 np0005539504 nova_compute[187152]: 2025-11-29 06:57:38.487 187156 DEBUG nova.compute.manager [req-e2c32bb7-d38a-4830-97e6-4f89509401b5 req-84905c29-cea7-4317-8eb9-4acbb480b254 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received event network-vif-plugged-13869d54-0ea7-412c-b676-18a5cb75a059 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:38 np0005539504 nova_compute[187152]: 2025-11-29 06:57:38.487 187156 DEBUG oslo_concurrency.lockutils [req-e2c32bb7-d38a-4830-97e6-4f89509401b5 req-84905c29-cea7-4317-8eb9-4acbb480b254 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:38 np0005539504 nova_compute[187152]: 2025-11-29 06:57:38.488 187156 DEBUG oslo_concurrency.lockutils [req-e2c32bb7-d38a-4830-97e6-4f89509401b5 req-84905c29-cea7-4317-8eb9-4acbb480b254 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:38 np0005539504 nova_compute[187152]: 2025-11-29 06:57:38.488 187156 DEBUG oslo_concurrency.lockutils [req-e2c32bb7-d38a-4830-97e6-4f89509401b5 req-84905c29-cea7-4317-8eb9-4acbb480b254 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fac6a6d5-8640-43cd-9270-01d80282ca11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:38 np0005539504 nova_compute[187152]: 2025-11-29 06:57:38.488 187156 DEBUG nova.compute.manager [req-e2c32bb7-d38a-4830-97e6-4f89509401b5 req-84905c29-cea7-4317-8eb9-4acbb480b254 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] No waiting events found dispatching network-vif-plugged-13869d54-0ea7-412c-b676-18a5cb75a059 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:38 np0005539504 nova_compute[187152]: 2025-11-29 06:57:38.488 187156 WARNING nova.compute.manager [req-e2c32bb7-d38a-4830-97e6-4f89509401b5 req-84905c29-cea7-4317-8eb9-4acbb480b254 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received unexpected event network-vif-plugged-13869d54-0ea7-412c-b676-18a5cb75a059 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:57:38 np0005539504 nova_compute[187152]: 2025-11-29 06:57:38.489 187156 DEBUG nova.compute.manager [req-e2c32bb7-d38a-4830-97e6-4f89509401b5 req-84905c29-cea7-4317-8eb9-4acbb480b254 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Received event network-vif-deleted-13869d54-0ea7-412c-b676-18a5cb75a059 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:39 np0005539504 nova_compute[187152]: 2025-11-29 06:57:39.314 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:40 np0005539504 podman[220249]: 2025-11-29 06:57:40.746299659 +0000 UTC m=+0.066944923 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:57:40 np0005539504 nova_compute[187152]: 2025-11-29 06:57:40.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:41 np0005539504 nova_compute[187152]: 2025-11-29 06:57:41.149 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:41 np0005539504 nova_compute[187152]: 2025-11-29 06:57:41.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:41 np0005539504 nova_compute[187152]: 2025-11-29 06:57:41.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:42 np0005539504 nova_compute[187152]: 2025-11-29 06:57:42.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:43 np0005539504 nova_compute[187152]: 2025-11-29 06:57:43.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:44 np0005539504 nova_compute[187152]: 2025-11-29 06:57:44.316 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:45 np0005539504 podman[220270]: 2025-11-29 06:57:45.723506859 +0000 UTC m=+0.070446098 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:57:45 np0005539504 nova_compute[187152]: 2025-11-29 06:57:45.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:45 np0005539504 nova_compute[187152]: 2025-11-29 06:57:45.969 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:45 np0005539504 nova_compute[187152]: 2025-11-29 06:57:45.969 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:45 np0005539504 nova_compute[187152]: 2025-11-29 06:57:45.969 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:45 np0005539504 nova_compute[187152]: 2025-11-29 06:57:45.969 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.064 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.143 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.144 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:46 np0005539504 ovn_controller[95182]: 2025-11-29T06:57:46Z|00129|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.163 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.194 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.226 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.232 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.303 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.304 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.360 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.367 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.423 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.424 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.482 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.656 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.658 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5251MB free_disk=73.11521530151367GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.659 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.659 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.743 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 66b9235f-7cc8-40d4-877b-b690613298a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.744 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance a7c7d375-ef91-4869-987b-662d0c1de55c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.744 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 6c18039e-ddd3-49b6-8323-00aca3672fd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.744 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.745 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.817 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.835 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.880 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:57:46 np0005539504 nova_compute[187152]: 2025-11-29 06:57:46.881 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:47.845 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:47.847 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:57:47 np0005539504 nova_compute[187152]: 2025-11-29 06:57:47.846 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:57:47.848 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:47 np0005539504 nova_compute[187152]: 2025-11-29 06:57:47.881 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:47 np0005539504 nova_compute[187152]: 2025-11-29 06:57:47.881 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:57:48 np0005539504 nova_compute[187152]: 2025-11-29 06:57:48.414 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:48 np0005539504 nova_compute[187152]: 2025-11-29 06:57:48.415 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:48 np0005539504 nova_compute[187152]: 2025-11-29 06:57:48.415 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:57:48 np0005539504 nova_compute[187152]: 2025-11-29 06:57:48.552 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:57:49 np0005539504 nova_compute[187152]: 2025-11-29 06:57:49.005 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:49 np0005539504 nova_compute[187152]: 2025-11-29 06:57:49.027 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:49 np0005539504 nova_compute[187152]: 2025-11-29 06:57:49.028 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:57:49 np0005539504 nova_compute[187152]: 2025-11-29 06:57:49.028 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:49 np0005539504 nova_compute[187152]: 2025-11-29 06:57:49.029 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:49 np0005539504 nova_compute[187152]: 2025-11-29 06:57:49.029 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:57:49 np0005539504 nova_compute[187152]: 2025-11-29 06:57:49.079 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:49 np0005539504 nova_compute[187152]: 2025-11-29 06:57:49.320 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:51 np0005539504 nova_compute[187152]: 2025-11-29 06:57:51.125 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399456.123817, fac6a6d5-8640-43cd-9270-01d80282ca11 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:51 np0005539504 nova_compute[187152]: 2025-11-29 06:57:51.125 187156 INFO nova.compute.manager [-] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:57:51 np0005539504 nova_compute[187152]: 2025-11-29 06:57:51.149 187156 DEBUG nova.compute.manager [None req-50239354-41b8-4d0f-ae68-b7d3b6ad134c - - - - - -] [instance: fac6a6d5-8640-43cd-9270-01d80282ca11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:51 np0005539504 nova_compute[187152]: 2025-11-29 06:57:51.165 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:54 np0005539504 nova_compute[187152]: 2025-11-29 06:57:54.323 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:54 np0005539504 podman[220318]: 2025-11-29 06:57:54.750763007 +0000 UTC m=+0.091529908 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:57:54 np0005539504 nova_compute[187152]: 2025-11-29 06:57:54.988 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.413 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "a7c7d375-ef91-4869-987b-662d0c1de55c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.414 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.414 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "a7c7d375-ef91-4869-987b-662d0c1de55c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.414 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.414 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.427 187156 INFO nova.compute.manager [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Terminating instance#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.440 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.440 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.440 187156 DEBUG nova.network.neutron [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:57:55 np0005539504 nova_compute[187152]: 2025-11-29 06:57:55.902 187156 DEBUG nova.network.neutron [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.167 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.220 187156 DEBUG nova.network.neutron [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.236 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.237 187156 DEBUG nova.compute.manager [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:57:56 np0005539504 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000023.scope: Deactivated successfully.
Nov 29 01:57:56 np0005539504 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000023.scope: Consumed 17.004s CPU time.
Nov 29 01:57:56 np0005539504 systemd-machined[153423]: Machine qemu-19-instance-00000023 terminated.
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.503 187156 INFO nova.virt.libvirt.driver [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance destroyed successfully.#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.505 187156 DEBUG nova.objects.instance [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'resources' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.530 187156 INFO nova.virt.libvirt.driver [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Deleting instance files /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_del#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.539 187156 INFO nova.virt.libvirt.driver [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Deletion of /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_del complete#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.683 187156 INFO nova.compute.manager [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.684 187156 DEBUG oslo.service.loopingcall [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.684 187156 DEBUG nova.compute.manager [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.684 187156 DEBUG nova.network.neutron [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.860 187156 DEBUG nova.network.neutron [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.871 187156 DEBUG nova.network.neutron [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.890 187156 INFO nova.compute.manager [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Took 0.21 seconds to deallocate network for instance.#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.985 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:56 np0005539504 nova_compute[187152]: 2025-11-29 06:57:56.986 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:57 np0005539504 nova_compute[187152]: 2025-11-29 06:57:57.090 187156 DEBUG nova.compute.provider_tree [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:57 np0005539504 nova_compute[187152]: 2025-11-29 06:57:57.104 187156 DEBUG nova.scheduler.client.report [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:57 np0005539504 nova_compute[187152]: 2025-11-29 06:57:57.126 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:57 np0005539504 nova_compute[187152]: 2025-11-29 06:57:57.160 187156 INFO nova.scheduler.client.report [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Deleted allocations for instance a7c7d375-ef91-4869-987b-662d0c1de55c#033[00m
Nov 29 01:57:57 np0005539504 nova_compute[187152]: 2025-11-29 06:57:57.293 187156 DEBUG oslo_concurrency.lockutils [None req-fbab6098-62de-4f24-aa2c-c5c616aa757a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.879s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.325 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.426 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "66b9235f-7cc8-40d4-877b-b690613298a4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.427 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.427 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "66b9235f-7cc8-40d4-877b-b690613298a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.427 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.428 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.439 187156 INFO nova.compute.manager [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Terminating instance#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.449 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.450 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.450 187156 DEBUG nova.network.neutron [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.715 187156 DEBUG nova.network.neutron [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:57:58 np0005539504 podman[220348]: 2025-11-29 06:57:58.726394773 +0000 UTC m=+0.064331293 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:57:58 np0005539504 podman[220349]: 2025-11-29 06:57:58.739318662 +0000 UTC m=+0.068301200 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 01:57:58 np0005539504 nova_compute[187152]: 2025-11-29 06:57:58.994 187156 DEBUG nova.network.neutron [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.012 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.013 187156 DEBUG nova.compute.manager [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:57:59 np0005539504 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000021.scope: Deactivated successfully.
Nov 29 01:57:59 np0005539504 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000021.scope: Consumed 18.714s CPU time.
Nov 29 01:57:59 np0005539504 systemd-machined[153423]: Machine qemu-18-instance-00000021 terminated.
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.276 187156 INFO nova.virt.libvirt.driver [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance destroyed successfully.#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.277 187156 DEBUG nova.objects.instance [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'resources' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.303 187156 INFO nova.virt.libvirt.driver [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Deleting instance files /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_del#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.306 187156 INFO nova.virt.libvirt.driver [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Deletion of /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_del complete#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.326 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.410 187156 INFO nova.compute.manager [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.410 187156 DEBUG oslo.service.loopingcall [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.411 187156 DEBUG nova.compute.manager [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:57:59 np0005539504 nova_compute[187152]: 2025-11-29 06:57:59.411 187156 DEBUG nova.network.neutron [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:58:00 np0005539504 nova_compute[187152]: 2025-11-29 06:58:00.449 187156 DEBUG nova.network.neutron [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.170 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.196 187156 DEBUG nova.network.neutron [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.234 187156 INFO nova.compute.manager [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Took 1.82 seconds to deallocate network for instance.#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.320 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.320 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.385 187156 DEBUG nova.compute.provider_tree [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.409 187156 DEBUG nova.scheduler.client.report [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.428 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.454 187156 INFO nova.scheduler.client.report [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Deleted allocations for instance 66b9235f-7cc8-40d4-877b-b690613298a4#033[00m
Nov 29 01:58:01 np0005539504 nova_compute[187152]: 2025-11-29 06:58:01.552 187156 DEBUG oslo_concurrency.lockutils [None req-062d6d94-8bc6-4505-85c0-3c82803f77fc 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:04 np0005539504 nova_compute[187152]: 2025-11-29 06:58:04.328 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:06 np0005539504 nova_compute[187152]: 2025-11-29 06:58:06.172 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:06 np0005539504 podman[220405]: 2025-11-29 06:58:06.726304118 +0000 UTC m=+0.065631377 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:58:06 np0005539504 podman[220406]: 2025-11-29 06:58:06.784132994 +0000 UTC m=+0.116784103 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 01:58:09 np0005539504 nova_compute[187152]: 2025-11-29 06:58:09.330 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:11 np0005539504 nova_compute[187152]: 2025-11-29 06:58:11.175 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:11 np0005539504 nova_compute[187152]: 2025-11-29 06:58:11.500 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399476.498236, a7c7d375-ef91-4869-987b-662d0c1de55c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:11 np0005539504 nova_compute[187152]: 2025-11-29 06:58:11.501 187156 INFO nova.compute.manager [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:58:11 np0005539504 nova_compute[187152]: 2025-11-29 06:58:11.527 187156 DEBUG nova.compute.manager [None req-575997ea-e960-4290-b25c-90f65ad81bab - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:11 np0005539504 podman[220456]: 2025-11-29 06:58:11.733398945 +0000 UTC m=+0.070823638 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:58:14 np0005539504 nova_compute[187152]: 2025-11-29 06:58:14.275 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399479.272744, 66b9235f-7cc8-40d4-877b-b690613298a4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:14 np0005539504 nova_compute[187152]: 2025-11-29 06:58:14.275 187156 INFO nova.compute.manager [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:58:14 np0005539504 nova_compute[187152]: 2025-11-29 06:58:14.305 187156 DEBUG nova.compute.manager [None req-62ef8967-f616-437b-8c43-b332b74e3f73 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:14 np0005539504 nova_compute[187152]: 2025-11-29 06:58:14.332 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:16 np0005539504 nova_compute[187152]: 2025-11-29 06:58:16.178 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:16 np0005539504 podman[220477]: 2025-11-29 06:58:16.723288005 +0000 UTC m=+0.061148185 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 29 01:58:19 np0005539504 nova_compute[187152]: 2025-11-29 06:58:19.334 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:21 np0005539504 nova_compute[187152]: 2025-11-29 06:58:21.181 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:22.913 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:22.914 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:22.916 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:24 np0005539504 nova_compute[187152]: 2025-11-29 06:58:24.336 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:25 np0005539504 podman[220498]: 2025-11-29 06:58:25.720701475 +0000 UTC m=+0.053964101 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:58:26 np0005539504 nova_compute[187152]: 2025-11-29 06:58:26.183 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:27 np0005539504 ovn_controller[95182]: 2025-11-29T06:58:27Z|00130|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:58:28 np0005539504 nova_compute[187152]: 2025-11-29 06:58:28.030 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:29 np0005539504 nova_compute[187152]: 2025-11-29 06:58:29.339 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:29 np0005539504 nova_compute[187152]: 2025-11-29 06:58:29.654 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquiring lock "0fcd1c0b-29f4-4451-9bef-8dea4570678e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:29 np0005539504 nova_compute[187152]: 2025-11-29 06:58:29.655 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "0fcd1c0b-29f4-4451-9bef-8dea4570678e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:29 np0005539504 nova_compute[187152]: 2025-11-29 06:58:29.694 187156 DEBUG nova.compute.manager [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:58:29 np0005539504 podman[220517]: 2025-11-29 06:58:29.718211411 +0000 UTC m=+0.059951873 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:58:29 np0005539504 podman[220518]: 2025-11-29 06:58:29.736002303 +0000 UTC m=+0.071288631 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6)
Nov 29 01:58:29 np0005539504 nova_compute[187152]: 2025-11-29 06:58:29.816 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:29 np0005539504 nova_compute[187152]: 2025-11-29 06:58:29.818 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:29 np0005539504 nova_compute[187152]: 2025-11-29 06:58:29.832 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:58:29 np0005539504 nova_compute[187152]: 2025-11-29 06:58:29.833 187156 INFO nova.compute.claims [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:58:29 np0005539504 nova_compute[187152]: 2025-11-29 06:58:29.988 187156 DEBUG nova.scheduler.client.report [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.019 187156 DEBUG nova.scheduler.client.report [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.020 187156 DEBUG nova.compute.provider_tree [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.045 187156 DEBUG nova.scheduler.client.report [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.070 187156 DEBUG nova.scheduler.client.report [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.169 187156 DEBUG nova.compute.provider_tree [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.194 187156 DEBUG nova.scheduler.client.report [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.229 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.231 187156 DEBUG nova.compute.manager [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.299 187156 DEBUG nova.compute.manager [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.320 187156 INFO nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.352 187156 DEBUG nova.compute.manager [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.525 187156 DEBUG nova.compute.manager [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.528 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.529 187156 INFO nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Creating image(s)#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.530 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquiring lock "/var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.531 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "/var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.532 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "/var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.566 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.660 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.661 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.662 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.677 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.732 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.733 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.775 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.776 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.777 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.836 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.837 187156 DEBUG nova.virt.disk.api [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Checking if we can resize image /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.838 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.904 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.905 187156 DEBUG nova.virt.disk.api [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Cannot resize image /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.906 187156 DEBUG nova.objects.instance [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lazy-loading 'migration_context' on Instance uuid 0fcd1c0b-29f4-4451-9bef-8dea4570678e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.919 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.920 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Ensure instance console log exists: /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.921 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.921 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.921 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.923 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.932 187156 WARNING nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.938 187156 DEBUG nova.virt.libvirt.host [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.939 187156 DEBUG nova.virt.libvirt.host [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.945 187156 DEBUG nova.virt.libvirt.host [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.945 187156 DEBUG nova.virt.libvirt.host [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.949 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.950 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.951 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.951 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.951 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.952 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.952 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.953 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.953 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.953 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.954 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.954 187156 DEBUG nova.virt.hardware [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.959 187156 DEBUG nova.objects.instance [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0fcd1c0b-29f4-4451-9bef-8dea4570678e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:58:30 np0005539504 nova_compute[187152]: 2025-11-29 06:58:30.972 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <uuid>0fcd1c0b-29f4-4451-9bef-8dea4570678e</uuid>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <name>instance-0000002e</name>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerDiagnosticsV248Test-server-558365222</nova:name>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:58:30</nova:creationTime>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:        <nova:user uuid="797e6873bac8436499ebeac55b50a12a">tempest-ServerDiagnosticsV248Test-997453337-project-member</nova:user>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:        <nova:project uuid="7d8df8d3efa34120ae55e1082eb12e50">tempest-ServerDiagnosticsV248Test-997453337</nova:project>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <entry name="serial">0fcd1c0b-29f4-4451-9bef-8dea4570678e</entry>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <entry name="uuid">0fcd1c0b-29f4-4451-9bef-8dea4570678e</entry>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.config"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/console.log" append="off"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:58:30 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:58:30 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:58:30 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:58:30 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:58:31 np0005539504 nova_compute[187152]: 2025-11-29 06:58:31.031 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:58:31 np0005539504 nova_compute[187152]: 2025-11-29 06:58:31.032 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:58:31 np0005539504 nova_compute[187152]: 2025-11-29 06:58:31.033 187156 INFO nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Using config drive#033[00m
Nov 29 01:58:31 np0005539504 nova_compute[187152]: 2025-11-29 06:58:31.185 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:31 np0005539504 nova_compute[187152]: 2025-11-29 06:58:31.387 187156 INFO nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Creating config drive at /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.config#033[00m
Nov 29 01:58:31 np0005539504 nova_compute[187152]: 2025-11-29 06:58:31.393 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp58wt7jl9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:31 np0005539504 nova_compute[187152]: 2025-11-29 06:58:31.522 187156 DEBUG oslo_concurrency.processutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp58wt7jl9" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:31 np0005539504 systemd-machined[153423]: New machine qemu-23-instance-0000002e.
Nov 29 01:58:31 np0005539504 systemd[1]: Started Virtual Machine qemu-23-instance-0000002e.
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.176 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399512.1753047, 0fcd1c0b-29f4-4451-9bef-8dea4570678e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.177 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.179 187156 DEBUG nova.compute.manager [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.179 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.184 187156 INFO nova.virt.libvirt.driver [-] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Instance spawned successfully.#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.185 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.215 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.221 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.225 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.225 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.225 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.226 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.226 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.226 187156 DEBUG nova.virt.libvirt.driver [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.252 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.252 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399512.1767519, 0fcd1c0b-29f4-4451-9bef-8dea4570678e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.252 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] VM Started (Lifecycle Event)#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.280 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.283 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.319 187156 INFO nova.compute.manager [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Took 1.79 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.319 187156 DEBUG nova.compute.manager [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.323 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.428 187156 INFO nova.compute.manager [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Took 2.67 seconds to build instance.#033[00m
Nov 29 01:58:32 np0005539504 nova_compute[187152]: 2025-11-29 06:58:32.448 187156 DEBUG oslo_concurrency.lockutils [None req-4020ed92-e98d-4b13-adfb-be16ed6542bb 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "0fcd1c0b-29f4-4451-9bef-8dea4570678e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:34 np0005539504 nova_compute[187152]: 2025-11-29 06:58:34.321 187156 DEBUG nova.compute.manager [None req-37247e92-e66a-4556-b7c9-558ddb63c011 338655d3fc8345a58b781bcd1fbd2a28 3ccf59a5cab0438ca4f0506e49513e87 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:34 np0005539504 nova_compute[187152]: 2025-11-29 06:58:34.325 187156 INFO nova.compute.manager [None req-37247e92-e66a-4556-b7c9-558ddb63c011 338655d3fc8345a58b781bcd1fbd2a28 3ccf59a5cab0438ca4f0506e49513e87 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Retrieving diagnostics#033[00m
Nov 29 01:58:34 np0005539504 nova_compute[187152]: 2025-11-29 06:58:34.341 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:36 np0005539504 nova_compute[187152]: 2025-11-29 06:58:36.188 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.274 187156 DEBUG oslo_concurrency.lockutils [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "6c18039e-ddd3-49b6-8323-00aca3672fd8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.275 187156 DEBUG oslo_concurrency.lockutils [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.275 187156 DEBUG oslo_concurrency.lockutils [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.275 187156 DEBUG oslo_concurrency.lockutils [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.276 187156 DEBUG oslo_concurrency.lockutils [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.289 187156 INFO nova.compute.manager [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Terminating instance#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.306 187156 DEBUG nova.compute.manager [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:58:37 np0005539504 kernel: tap58d29b48-4b (unregistering): left promiscuous mode
Nov 29 01:58:37 np0005539504 NetworkManager[55210]: <info>  [1764399517.3402] device (tap58d29b48-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:58:37 np0005539504 ovn_controller[95182]: 2025-11-29T06:58:37Z|00131|binding|INFO|Releasing lport 58d29b48-4b4d-4014-93d0-3ecea2472a3e from this chassis (sb_readonly=0)
Nov 29 01:58:37 np0005539504 ovn_controller[95182]: 2025-11-29T06:58:37Z|00132|binding|INFO|Setting lport 58d29b48-4b4d-4014-93d0-3ecea2472a3e down in Southbound
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.364 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 ovn_controller[95182]: 2025-11-29T06:58:37Z|00133|binding|INFO|Removing iface tap58d29b48-4b ovn-installed in OVS
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.370 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.388 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000025.scope: Deactivated successfully.
Nov 29 01:58:37 np0005539504 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000025.scope: Consumed 19.585s CPU time.
Nov 29 01:58:37 np0005539504 systemd-machined[153423]: Machine qemu-21-instance-00000025 terminated.
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.416 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:5e:37 10.100.0.6'], port_security=['fa:16:3e:33:5e:37 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6c18039e-ddd3-49b6-8323-00aca3672fd8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db691b6b-17b7-42a9-9fd2-162233da0513', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4362be0b90a64d63b2294bbc495486d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09ce75c2-edf6-4d0e-b148-55edc758c529', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03ee1f45-6435-43da-9a98-5273904b0bb0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=58d29b48-4b4d-4014-93d0-3ecea2472a3e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.419 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 58d29b48-4b4d-4014-93d0-3ecea2472a3e in datapath db691b6b-17b7-42a9-9fd2-162233da0513 unbound from our chassis#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.422 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network db691b6b-17b7-42a9-9fd2-162233da0513, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.425 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[143d1bb0-0bb3-4275-a1d0-002521197abd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.426 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 namespace which is not needed anymore#033[00m
Nov 29 01:58:37 np0005539504 podman[220606]: 2025-11-29 06:58:37.474367249 +0000 UTC m=+0.074786035 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:58:37 np0005539504 podman[220608]: 2025-11-29 06:58:37.506514969 +0000 UTC m=+0.108205520 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.537 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.542 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[219711]: [NOTICE]   (219724) : haproxy version is 2.8.14-c23fe91
Nov 29 01:58:37 np0005539504 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[219711]: [NOTICE]   (219724) : path to executable is /usr/sbin/haproxy
Nov 29 01:58:37 np0005539504 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[219711]: [WARNING]  (219724) : Exiting Master process...
Nov 29 01:58:37 np0005539504 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[219711]: [ALERT]    (219724) : Current worker (219730) exited with code 143 (Terminated)
Nov 29 01:58:37 np0005539504 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[219711]: [WARNING]  (219724) : All workers exited. Exiting... (0)
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.587 187156 INFO nova.virt.libvirt.driver [-] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Instance destroyed successfully.#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.588 187156 DEBUG nova.objects.instance [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'resources' on Instance uuid 6c18039e-ddd3-49b6-8323-00aca3672fd8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:58:37 np0005539504 systemd[1]: libpod-fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb.scope: Deactivated successfully.
Nov 29 01:58:37 np0005539504 podman[220674]: 2025-11-29 06:58:37.594890021 +0000 UTC m=+0.072172914 container died fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 01:58:37 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb-userdata-shm.mount: Deactivated successfully.
Nov 29 01:58:37 np0005539504 systemd[1]: var-lib-containers-storage-overlay-832f21a6ed560dce8d91edbc256c26ac563e750278627b35104fd2ef2522adc1-merged.mount: Deactivated successfully.
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.638 187156 DEBUG nova.virt.libvirt.vif [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:56:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-2103374378',display_name='tempest-ServerActionsTestOtherA-server-2103374378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-2103374378',id=37,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFRevGxjsP9FW7SRDkRRpVnU4KRGjdG8wIqosLqm4cPhV5Ico8MnUjtpEPPC4qcZ56HVkRqnO8GdidXOybS7oqkeGA17vnM2tYQ+0MqhSrzE91pVD9ZLc4wlyl+Nziiouw==',key_name='tempest-keypair-671368078',keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:56:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-nnkzgy00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest-ServerActionsTestOtherA-229564135-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:56:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='812d926ee4ed4159b2e88b7a69990423',uuid=6c18039e-ddd3-49b6-8323-00aca3672fd8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.639 187156 DEBUG nova.network.os_vif_util [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "address": "fa:16:3e:33:5e:37", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58d29b48-4b", "ovs_interfaceid": "58d29b48-4b4d-4014-93d0-3ecea2472a3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.641 187156 DEBUG nova.network.os_vif_util [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:5e:37,bridge_name='br-int',has_traffic_filtering=True,id=58d29b48-4b4d-4014-93d0-3ecea2472a3e,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58d29b48-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.641 187156 DEBUG os_vif [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:5e:37,bridge_name='br-int',has_traffic_filtering=True,id=58d29b48-4b4d-4014-93d0-3ecea2472a3e,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58d29b48-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.644 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58d29b48-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.646 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.648 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 podman[220674]: 2025-11-29 06:58:37.653336644 +0000 UTC m=+0.130619547 container cleanup fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.655 187156 INFO os_vif [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:5e:37,bridge_name='br-int',has_traffic_filtering=True,id=58d29b48-4b4d-4014-93d0-3ecea2472a3e,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58d29b48-4b')#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.656 187156 INFO nova.virt.libvirt.driver [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Deleting instance files /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8_del#033[00m
Nov 29 01:58:37 np0005539504 systemd[1]: libpod-conmon-fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb.scope: Deactivated successfully.
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.664 187156 INFO nova.virt.libvirt.driver [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Deletion of /var/lib/nova/instances/6c18039e-ddd3-49b6-8323-00aca3672fd8_del complete#033[00m
Nov 29 01:58:37 np0005539504 podman[220716]: 2025-11-29 06:58:37.728609312 +0000 UTC m=+0.050809787 container remove fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.734 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[83ae7855-9dd3-4acd-a011-a1528816bd74]: (4, ('Sat Nov 29 06:58:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 (fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb)\nfee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb\nSat Nov 29 06:58:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 (fee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb)\nfee56adef7d0b9adbe99517f0d2a8fe1ce8ea6ec617082abd83feceff0f355fb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.736 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad2b48c-40e4-4237-8ab3-88d71ad94c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.737 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb691b6b-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.739 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 kernel: tapdb691b6b-10: left promiscuous mode
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.753 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.758 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[17f2c90d-670b-41a1-861d-b65e91d8ab44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.771 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d9de9115-d5be-48da-9f6a-2baf111d31d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.773 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[221d0d08-f24c-4502-be90-e5546720d15a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.791 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[149ce7a0-a957-48e2-866b-78b58c998ae8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486826, 'reachable_time': 37646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220731, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539504 systemd[1]: run-netns-ovnmeta\x2ddb691b6b\x2d17b7\x2d42a9\x2d9fd2\x2d162233da0513.mount: Deactivated successfully.
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.798 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:58:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:37.799 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6d2c34-b156-4eec-94f9-5482646abace]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.856 187156 INFO nova.compute.manager [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Took 0.55 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.857 187156 DEBUG oslo.service.loopingcall [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.858 187156 DEBUG nova.compute.manager [-] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:58:37 np0005539504 nova_compute[187152]: 2025-11-29 06:58:37.858 187156 DEBUG nova.network.neutron [-] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:58:38 np0005539504 nova_compute[187152]: 2025-11-29 06:58:38.469 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:38 np0005539504 nova_compute[187152]: 2025-11-29 06:58:38.548 187156 DEBUG nova.compute.manager [req-c706e0c0-4169-41a9-99bf-cbb5a6793d0a req-c06eac35-67dd-484a-abf4-b982a7cf849c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received event network-vif-unplugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:38 np0005539504 nova_compute[187152]: 2025-11-29 06:58:38.549 187156 DEBUG oslo_concurrency.lockutils [req-c706e0c0-4169-41a9-99bf-cbb5a6793d0a req-c06eac35-67dd-484a-abf4-b982a7cf849c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:38 np0005539504 nova_compute[187152]: 2025-11-29 06:58:38.550 187156 DEBUG oslo_concurrency.lockutils [req-c706e0c0-4169-41a9-99bf-cbb5a6793d0a req-c06eac35-67dd-484a-abf4-b982a7cf849c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:38 np0005539504 nova_compute[187152]: 2025-11-29 06:58:38.550 187156 DEBUG oslo_concurrency.lockutils [req-c706e0c0-4169-41a9-99bf-cbb5a6793d0a req-c06eac35-67dd-484a-abf4-b982a7cf849c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:38 np0005539504 nova_compute[187152]: 2025-11-29 06:58:38.550 187156 DEBUG nova.compute.manager [req-c706e0c0-4169-41a9-99bf-cbb5a6793d0a req-c06eac35-67dd-484a-abf4-b982a7cf849c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] No waiting events found dispatching network-vif-unplugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:58:38 np0005539504 nova_compute[187152]: 2025-11-29 06:58:38.551 187156 DEBUG nova.compute.manager [req-c706e0c0-4169-41a9-99bf-cbb5a6793d0a req-c06eac35-67dd-484a-abf4-b982a7cf849c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received event network-vif-unplugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:58:39 np0005539504 nova_compute[187152]: 2025-11-29 06:58:39.344 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.850 187156 DEBUG nova.compute.manager [req-60d949ca-3d62-4323-9b61-91433150f563 req-b3c865ba-190b-48b6-8124-dd0de8fb86f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received event network-vif-deleted-58d29b48-4b4d-4014-93d0-3ecea2472a3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.852 187156 INFO nova.compute.manager [req-60d949ca-3d62-4323-9b61-91433150f563 req-b3c865ba-190b-48b6-8124-dd0de8fb86f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Neutron deleted interface 58d29b48-4b4d-4014-93d0-3ecea2472a3e; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.852 187156 DEBUG nova.network.neutron [req-60d949ca-3d62-4323-9b61-91433150f563 req-b3c865ba-190b-48b6-8124-dd0de8fb86f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.856 187156 DEBUG nova.compute.manager [req-7499e60e-162a-4343-87f4-9c1c0755ac33 req-2d393e06-002b-4aa5-aa2e-0051d65c6c45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received event network-vif-plugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.856 187156 DEBUG oslo_concurrency.lockutils [req-7499e60e-162a-4343-87f4-9c1c0755ac33 req-2d393e06-002b-4aa5-aa2e-0051d65c6c45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.857 187156 DEBUG oslo_concurrency.lockutils [req-7499e60e-162a-4343-87f4-9c1c0755ac33 req-2d393e06-002b-4aa5-aa2e-0051d65c6c45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.857 187156 DEBUG oslo_concurrency.lockutils [req-7499e60e-162a-4343-87f4-9c1c0755ac33 req-2d393e06-002b-4aa5-aa2e-0051d65c6c45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.858 187156 DEBUG nova.compute.manager [req-7499e60e-162a-4343-87f4-9c1c0755ac33 req-2d393e06-002b-4aa5-aa2e-0051d65c6c45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] No waiting events found dispatching network-vif-plugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.858 187156 WARNING nova.compute.manager [req-7499e60e-162a-4343-87f4-9c1c0755ac33 req-2d393e06-002b-4aa5-aa2e-0051d65c6c45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Received unexpected event network-vif-plugged-58d29b48-4b4d-4014-93d0-3ecea2472a3e for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.861 187156 DEBUG nova.network.neutron [-] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.930 187156 DEBUG nova.compute.manager [req-60d949ca-3d62-4323-9b61-91433150f563 req-b3c865ba-190b-48b6-8124-dd0de8fb86f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Detach interface failed, port_id=58d29b48-4b4d-4014-93d0-3ecea2472a3e, reason: Instance 6c18039e-ddd3-49b6-8323-00aca3672fd8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 01:58:40 np0005539504 nova_compute[187152]: 2025-11-29 06:58:40.935 187156 INFO nova.compute.manager [-] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Took 3.08 seconds to deallocate network for instance.#033[00m
Nov 29 01:58:41 np0005539504 nova_compute[187152]: 2025-11-29 06:58:41.129 187156 DEBUG oslo_concurrency.lockutils [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:41 np0005539504 nova_compute[187152]: 2025-11-29 06:58:41.130 187156 DEBUG oslo_concurrency.lockutils [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:41 np0005539504 nova_compute[187152]: 2025-11-29 06:58:41.272 187156 DEBUG nova.compute.provider_tree [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:41 np0005539504 nova_compute[187152]: 2025-11-29 06:58:41.288 187156 DEBUG nova.scheduler.client.report [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:41 np0005539504 nova_compute[187152]: 2025-11-29 06:58:41.335 187156 DEBUG oslo_concurrency.lockutils [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:41 np0005539504 nova_compute[187152]: 2025-11-29 06:58:41.383 187156 INFO nova.scheduler.client.report [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Deleted allocations for instance 6c18039e-ddd3-49b6-8323-00aca3672fd8#033[00m
Nov 29 01:58:41 np0005539504 nova_compute[187152]: 2025-11-29 06:58:41.485 187156 DEBUG oslo_concurrency.lockutils [None req-26bf57ca-0746-4ae4-8659-8e5d94a1bc69 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "6c18039e-ddd3-49b6-8323-00aca3672fd8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:41 np0005539504 nova_compute[187152]: 2025-11-29 06:58:41.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:41 np0005539504 nova_compute[187152]: 2025-11-29 06:58:41.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:42 np0005539504 nova_compute[187152]: 2025-11-29 06:58:42.696 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:42 np0005539504 podman[220732]: 2025-11-29 06:58:42.776639147 +0000 UTC m=+0.067389155 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 01:58:42 np0005539504 nova_compute[187152]: 2025-11-29 06:58:42.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:43 np0005539504 nova_compute[187152]: 2025-11-29 06:58:43.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:43 np0005539504 nova_compute[187152]: 2025-11-29 06:58:43.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:44 np0005539504 nova_compute[187152]: 2025-11-29 06:58:44.346 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:44 np0005539504 nova_compute[187152]: 2025-11-29 06:58:44.595 187156 DEBUG nova.compute.manager [None req-3b1c1432-1944-4e7b-9ff5-10c756c3a611 338655d3fc8345a58b781bcd1fbd2a28 3ccf59a5cab0438ca4f0506e49513e87 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:44 np0005539504 nova_compute[187152]: 2025-11-29 06:58:44.601 187156 INFO nova.compute.manager [None req-3b1c1432-1944-4e7b-9ff5-10c756c3a611 338655d3fc8345a58b781bcd1fbd2a28 3ccf59a5cab0438ca4f0506e49513e87 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Retrieving diagnostics#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.072 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquiring lock "0fcd1c0b-29f4-4451-9bef-8dea4570678e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.073 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "0fcd1c0b-29f4-4451-9bef-8dea4570678e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.074 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquiring lock "0fcd1c0b-29f4-4451-9bef-8dea4570678e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.074 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "0fcd1c0b-29f4-4451-9bef-8dea4570678e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.074 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "0fcd1c0b-29f4-4451-9bef-8dea4570678e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.088 187156 INFO nova.compute.manager [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Terminating instance#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.100 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquiring lock "refresh_cache-0fcd1c0b-29f4-4451-9bef-8dea4570678e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.101 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquired lock "refresh_cache-0fcd1c0b-29f4-4451-9bef-8dea4570678e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.102 187156 DEBUG nova.network.neutron [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.478 187156 DEBUG nova.network.neutron [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:58:45 np0005539504 nova_compute[187152]: 2025-11-29 06:58:45.522 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:46 np0005539504 nova_compute[187152]: 2025-11-29 06:58:46.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:47 np0005539504 nova_compute[187152]: 2025-11-29 06:58:47.017 187156 DEBUG nova.network.neutron [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:47 np0005539504 nova_compute[187152]: 2025-11-29 06:58:47.700 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:47 np0005539504 podman[220771]: 2025-11-29 06:58:47.723093863 +0000 UTC m=+0.064526608 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.962 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7d8df8d3efa34120ae55e1082eb12e50', 'user_id': '797e6873bac8436499ebeac55b50a12a', 'hostId': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.963 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.965 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.976 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.977 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bf2e907-0193-4e2c-a992-f1e9f7f0aac3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-vda', 'timestamp': '2025-11-29T06:58:47.965614', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9304a6-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.90029021, 'message_signature': '8eda33303cf18c999a85a67cca910b109280d2fea746dec9d068b7920a57b2d7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 
'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-sda', 'timestamp': '2025-11-29T06:58:47.965614', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da931a40-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.90029021, 'message_signature': 'e0df5fc08e12f672d649f72616101b289035d690b92c143f6af58534560b28ca'}]}, 'timestamp': '2025-11-29 06:58:47.977812', '_unique_id': '6cbf7a42700448589628421df22cbb57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.980 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.982 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.982 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.982 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-558365222>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-558365222>]
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.982 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 01:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:47.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.009 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.010 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab5793f0-203c-4c31-87f3-8def611444b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-vda', 'timestamp': '2025-11-29T06:58:47.983175', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da980c30-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': 'deb4fc8377f489a243a6a214479e801702679ab37ef4833397caec328ed66b16'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 
'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-sda', 'timestamp': '2025-11-29T06:58:47.983175', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da981e64-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': 'e44f14a6630aa6d9ab30d67293ab712fdc511bd1d3a4ea37ae58e8207594d17d'}]}, 'timestamp': '2025-11-29 06:58:48.010664', '_unique_id': 'b8733776ff08418a957cdee18ab47f2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.011 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.012 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.013 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-558365222>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-558365222>]
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.013 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.write.requests volume: 299 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.013 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f405414c-f5e2-4456-a0e2-400fdb6865e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 299, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-vda', 'timestamp': '2025-11-29T06:58:48.013488', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da989a06-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': 'd166e56993ac3dd3b88ef32888805acaf6c8d7da7cf6401085b3b53c425a28c4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 
'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-sda', 'timestamp': '2025-11-29T06:58:48.013488', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da98a5d2-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': '2f3617185b3f57fafa9e533b03cfdb670f85db46aa9be404b0a3aff69a92a557'}]}, 'timestamp': '2025-11-29 06:58:48.014146', '_unique_id': '36af9dcb7927446891042fd4c826ec62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.014 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.016 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.read.bytes volume: 29104640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.017 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa6b3e88-8056-4cac-816f-3883c2d6b740', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29104640, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-vda', 'timestamp': '2025-11-29T06:58:48.016565', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da991742-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': '4fb71250c384da995a2666401c80569bab238ee66f1e7a01bc5b62df9272b241'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 
'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-sda', 'timestamp': '2025-11-29T06:58:48.016565', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da992caa-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': '086807a58a6b41d4e20542a3658a4e285519fc1c60d53caf0596961ba433beed'}]}, 'timestamp': '2025-11-29 06:58:48.017650', '_unique_id': '31ceb8bc538a41f7b160f0bae875458d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.018 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.020 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.020 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.020 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.021 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.read.requests volume: 1045 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.021 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbb411d3-95f0-40c9-a603-1f87ec4693d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1045, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-vda', 'timestamp': '2025-11-29T06:58:48.020964', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da99c34a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': 'c6182b8d9d4eeecebaae46cb9a2d753259fd950680349179abfc1763103fc3e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 
'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-sda', 'timestamp': '2025-11-29T06:58:48.020964', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da99d3f8-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': 'ba4518915c4c05ba8d21b02364e4df3c6e3354bc17d79564905c0bd445560ab0'}]}, 'timestamp': '2025-11-29 06:58:48.021893', '_unique_id': '21d5b54b303d442bbfa34dc72a44d989'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.022 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.024 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.024 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ba16f46-cddf-485a-94f3-20e7396317d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-vda', 'timestamp': '2025-11-29T06:58:48.024346', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9a45a4-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.90029021, 'message_signature': '651c5709112f2d82f26f2b0d798bf84ad899edaa63ff0dbf1e61860e376f8ef0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 
'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-sda', 'timestamp': '2025-11-29T06:58:48.024346', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9a5648-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.90029021, 'message_signature': '66b35088ae636879851ebf0ba84a565ef947e2a38fc8b9a93dfc044f270c1660'}]}, 'timestamp': '2025-11-29 06:58:48.025204', '_unique_id': 'b863d64d324c4172903854635c08b7b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.026 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.028 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.write.latency volume: 4300684649 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.029 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96ed7bcc-f3f8-46fd-b3eb-6415f0ca22d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4300684649, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-vda', 'timestamp': '2025-11-29T06:58:48.028262', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9ae63a-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': '0e31513caa502c315e79be3f70f313a6c57d3b7b45c26934c38ae328f3be2c06'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-sda', 'timestamp': '2025-11-29T06:58:48.028262', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9af8be-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': 'c3c8f390c461d2961c304ee2f7069a0dbdc649f5015b2347d1ae10b542931081'}]}, 'timestamp': '2025-11-29 06:58:48.029368', '_unique_id': 'd3f509353caf4e8ca4feee05491c7424'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.032 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.052 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/cpu volume: 12260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '509e37a3-48dc-4f34-a5f3-f37a0f263f30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12260000000, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'timestamp': '2025-11-29T06:58:48.032990', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da9e9fb4-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.987319405, 'message_signature': '365937591dbda801245c36cff4f39352619c7f99804ea175c4abe4f92076e75f'}]}, 'timestamp': '2025-11-29 06:58:48.053328', '_unique_id': 'fa9b4c7894db4c11bc7639691225a336'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.055 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.055 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/memory.usage volume: 40.4140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d88d127-ae25-4766-bebf-630289c0c46b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4140625, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'timestamp': '2025-11-29T06:58:48.055183', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da9ef608-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.987319405, 'message_signature': '069d007a4c9af16a25e5fb6dbcf04beab6a70837b9d1e136d03d096a16f9c522'}]}, 'timestamp': '2025-11-29 06:58:48.055477', '_unique_id': '90a7f2be39ec4db881b8f9e8fc35f572'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-558365222>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-558365222>]
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.056 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.057 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.057 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd975b764-4c44-49cb-838c-aaab092e0ba3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-vda', 'timestamp': '2025-11-29T06:58:48.057034', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9f3d0c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.90029021, 'message_signature': '9adfb4002e3af89188133060719f44157f77de062f9fae79a59d285f48af3552'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-sda', 'timestamp': '2025-11-29T06:58:48.057034', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9f466c-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.90029021, 'message_signature': '4b37fa1ddf269ef29f75cd806b890341a2ec67783437ed55e691a6ead6d91bc4'}]}, 'timestamp': '2025-11-29 06:58:48.057505', '_unique_id': '7e509588f3834a36bf5cb3caef139a4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.058 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.read.latency volume: 229772390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 DEBUG ceilometer.compute.pollsters [-] 0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk.device.read.latency volume: 25069090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1156c17c-219e-409a-adc6-c2ea138b9669', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 229772390, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-vda', 'timestamp': '2025-11-29T06:58:48.058921', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9f8758-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': 'bb457b58d0194e127a49aa666c1c0b8d6a926b00c7bc3a96b0ecb96a20f37604'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25069090, 'user_id': '797e6873bac8436499ebeac55b50a12a', 'user_name': None, 'project_id': '7d8df8d3efa34120ae55e1082eb12e50', 'project_name': None, 'resource_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e-sda', 'timestamp': '2025-11-29T06:58:48.058921', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsV248Test-server-558365222', 'name': 'instance-0000002e', 'instance_id': '0fcd1c0b-29f4-4451-9bef-8dea4570678e', 'instance_type': 'm1.nano', 'host': '23c077422a6f90ef41f65c6a4d49ca4cc2ba1ed45571a11626c11072', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9f8f78-ccf0-11f0-8a11-fa163ea726b4', 'monotonic_time': 4990.917872256, 'message_signature': '29344969f9f58ca72770c3cd0ed57db5ed629866672e0ae3bd9236982315f05c'}]}, 'timestamp': '2025-11-29 06:58:48.059402', '_unique_id': '68b81474d507454bbd8a1d1383d79ebd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.060 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.060 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.060 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:58:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 06:58:48.060 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-558365222>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiagnosticsV248Test-server-558365222>]
Nov 29 01:58:49 np0005539504 nova_compute[187152]: 2025-11-29 06:58:49.350 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:50 np0005539504 nova_compute[187152]: 2025-11-29 06:58:50.345 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:50 np0005539504 nova_compute[187152]: 2025-11-29 06:58:50.346 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:50 np0005539504 nova_compute[187152]: 2025-11-29 06:58:50.346 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:50 np0005539504 nova_compute[187152]: 2025-11-29 06:58:50.346 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.004 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Releasing lock "refresh_cache-0fcd1c0b-29f4-4451-9bef-8dea4570678e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.005 187156 DEBUG nova.compute.manager [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:58:51 np0005539504 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Nov 29 01:58:51 np0005539504 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002e.scope: Consumed 13.848s CPU time.
Nov 29 01:58:51 np0005539504 systemd-machined[153423]: Machine qemu-23-instance-0000002e terminated.
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.260 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.284 187156 INFO nova.virt.libvirt.driver [-] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Instance destroyed successfully.#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.286 187156 DEBUG nova.objects.instance [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lazy-loading 'resources' on Instance uuid 0fcd1c0b-29f4-4451-9bef-8dea4570678e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.333 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.334 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.400 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.575 187156 INFO nova.virt.libvirt.driver [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Deleting instance files /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e_del#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.577 187156 INFO nova.virt.libvirt.driver [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Deletion of /var/lib/nova/instances/0fcd1c0b-29f4-4451-9bef-8dea4570678e_del complete#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.608 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.610 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5590MB free_disk=73.1723518371582GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.610 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:51 np0005539504 nova_compute[187152]: 2025-11-29 06:58:51.610 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:52 np0005539504 nova_compute[187152]: 2025-11-29 06:58:52.584 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399517.5823214, 6c18039e-ddd3-49b6-8323-00aca3672fd8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:52 np0005539504 nova_compute[187152]: 2025-11-29 06:58:52.585 187156 INFO nova.compute.manager [-] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:58:52 np0005539504 nova_compute[187152]: 2025-11-29 06:58:52.704 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:53 np0005539504 nova_compute[187152]: 2025-11-29 06:58:53.397 187156 DEBUG nova.compute.manager [None req-ff1f9eba-fc2e-459f-a6a3-33a8785cc17d - - - - - -] [instance: 6c18039e-ddd3-49b6-8323-00aca3672fd8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:53 np0005539504 nova_compute[187152]: 2025-11-29 06:58:53.450 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:53.451 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:58:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:53.452 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:58:53 np0005539504 nova_compute[187152]: 2025-11-29 06:58:53.573 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 0fcd1c0b-29f4-4451-9bef-8dea4570678e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:58:53 np0005539504 nova_compute[187152]: 2025-11-29 06:58:53.574 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:58:53 np0005539504 nova_compute[187152]: 2025-11-29 06:58:53.574 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:58:53 np0005539504 nova_compute[187152]: 2025-11-29 06:58:53.636 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:54 np0005539504 nova_compute[187152]: 2025-11-29 06:58:54.371 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:58:54.454 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:54 np0005539504 nova_compute[187152]: 2025-11-29 06:58:54.593 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:55 np0005539504 nova_compute[187152]: 2025-11-29 06:58:55.466 187156 INFO nova.compute.manager [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Took 4.46 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:58:55 np0005539504 nova_compute[187152]: 2025-11-29 06:58:55.467 187156 DEBUG oslo.service.loopingcall [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:58:55 np0005539504 nova_compute[187152]: 2025-11-29 06:58:55.467 187156 DEBUG nova.compute.manager [-] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:58:55 np0005539504 nova_compute[187152]: 2025-11-29 06:58:55.467 187156 DEBUG nova.network.neutron [-] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:58:55 np0005539504 nova_compute[187152]: 2025-11-29 06:58:55.626 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:58:55 np0005539504 nova_compute[187152]: 2025-11-29 06:58:55.627 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:56 np0005539504 nova_compute[187152]: 2025-11-29 06:58:56.501 187156 DEBUG nova.network.neutron [-] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:58:56 np0005539504 podman[220810]: 2025-11-29 06:58:56.7644166 +0000 UTC m=+0.095990449 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 29 01:58:57 np0005539504 nova_compute[187152]: 2025-11-29 06:58:57.627 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:57 np0005539504 nova_compute[187152]: 2025-11-29 06:58:57.628 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:58:57 np0005539504 nova_compute[187152]: 2025-11-29 06:58:57.628 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:58:57 np0005539504 nova_compute[187152]: 2025-11-29 06:58:57.708 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:59 np0005539504 nova_compute[187152]: 2025-11-29 06:58:59.375 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:00 np0005539504 podman[220831]: 2025-11-29 06:59:00.719121459 +0000 UTC m=+0.061398013 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, com.redhat.component=ubi9-minimal-container)
Nov 29 01:59:00 np0005539504 podman[220830]: 2025-11-29 06:59:00.723090467 +0000 UTC m=+0.060416217 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:59:01 np0005539504 nova_compute[187152]: 2025-11-29 06:59:01.105 187156 DEBUG nova.network.neutron [-] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:59:01 np0005539504 nova_compute[187152]: 2025-11-29 06:59:01.108 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 29 01:59:01 np0005539504 nova_compute[187152]: 2025-11-29 06:59:01.108 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:59:01 np0005539504 nova_compute[187152]: 2025-11-29 06:59:01.108 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:01 np0005539504 nova_compute[187152]: 2025-11-29 06:59:01.109 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:01 np0005539504 nova_compute[187152]: 2025-11-29 06:59:01.109 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:59:01 np0005539504 nova_compute[187152]: 2025-11-29 06:59:01.201 187156 INFO nova.compute.manager [-] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Took 5.73 seconds to deallocate network for instance.#033[00m
Nov 29 01:59:02 np0005539504 nova_compute[187152]: 2025-11-29 06:59:02.710 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:02 np0005539504 nova_compute[187152]: 2025-11-29 06:59:02.869 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:02 np0005539504 nova_compute[187152]: 2025-11-29 06:59:02.869 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:02 np0005539504 nova_compute[187152]: 2025-11-29 06:59:02.996 187156 DEBUG nova.compute.provider_tree [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:59:03 np0005539504 nova_compute[187152]: 2025-11-29 06:59:03.015 187156 DEBUG nova.scheduler.client.report [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:59:03 np0005539504 nova_compute[187152]: 2025-11-29 06:59:03.035 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:03 np0005539504 nova_compute[187152]: 2025-11-29 06:59:03.089 187156 INFO nova.scheduler.client.report [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Deleted allocations for instance 0fcd1c0b-29f4-4451-9bef-8dea4570678e#033[00m
Nov 29 01:59:03 np0005539504 nova_compute[187152]: 2025-11-29 06:59:03.270 187156 DEBUG oslo_concurrency.lockutils [None req-6b0a1f3e-bcb2-46f5-9400-c4ae0618e334 797e6873bac8436499ebeac55b50a12a 7d8df8d3efa34120ae55e1082eb12e50 - - default default] Lock "0fcd1c0b-29f4-4451-9bef-8dea4570678e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 18.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:03 np0005539504 nova_compute[187152]: 2025-11-29 06:59:03.600 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:03 np0005539504 nova_compute[187152]: 2025-11-29 06:59:03.741 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:04 np0005539504 nova_compute[187152]: 2025-11-29 06:59:04.376 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:06 np0005539504 nova_compute[187152]: 2025-11-29 06:59:06.258 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399531.2562861, 0fcd1c0b-29f4-4451-9bef-8dea4570678e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:59:06 np0005539504 nova_compute[187152]: 2025-11-29 06:59:06.259 187156 INFO nova.compute.manager [-] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:59:06 np0005539504 nova_compute[187152]: 2025-11-29 06:59:06.299 187156 DEBUG nova.compute.manager [None req-2c9f0c73-e893-4337-a385-752b50fe9e85 - - - - - -] [instance: 0fcd1c0b-29f4-4451-9bef-8dea4570678e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:59:07 np0005539504 nova_compute[187152]: 2025-11-29 06:59:07.714 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:07 np0005539504 podman[220875]: 2025-11-29 06:59:07.718327726 +0000 UTC m=+0.062144094 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:59:07 np0005539504 podman[220876]: 2025-11-29 06:59:07.765868573 +0000 UTC m=+0.104145691 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 01:59:09 np0005539504 nova_compute[187152]: 2025-11-29 06:59:09.378 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:12 np0005539504 nova_compute[187152]: 2025-11-29 06:59:12.718 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:13 np0005539504 podman[220922]: 2025-11-29 06:59:13.711856237 +0000 UTC m=+0.055541423 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:59:14 np0005539504 nova_compute[187152]: 2025-11-29 06:59:14.381 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:17 np0005539504 nova_compute[187152]: 2025-11-29 06:59:17.721 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:18 np0005539504 podman[220943]: 2025-11-29 06:59:18.706322883 +0000 UTC m=+0.051645438 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 01:59:19 np0005539504 nova_compute[187152]: 2025-11-29 06:59:19.382 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:22 np0005539504 nova_compute[187152]: 2025-11-29 06:59:22.724 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:59:22.914 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:59:22.914 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:59:22.914 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:24 np0005539504 nova_compute[187152]: 2025-11-29 06:59:24.384 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:27 np0005539504 podman[220964]: 2025-11-29 06:59:27.698793558 +0000 UTC m=+0.046708595 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:59:27 np0005539504 nova_compute[187152]: 2025-11-29 06:59:27.727 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:29 np0005539504 nova_compute[187152]: 2025-11-29 06:59:29.388 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:31 np0005539504 podman[220983]: 2025-11-29 06:59:31.709645856 +0000 UTC m=+0.056931992 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:59:31 np0005539504 podman[220984]: 2025-11-29 06:59:31.717279992 +0000 UTC m=+0.060698783 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 01:59:32 np0005539504 nova_compute[187152]: 2025-11-29 06:59:32.730 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.271 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquiring lock "b1d4680c-ccb8-4982-9d9a-a670404969d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.272 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "b1d4680c-ccb8-4982-9d9a-a670404969d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.320 187156 DEBUG nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.502 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.502 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.512 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.512 187156 INFO nova.compute.claims [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Claim successful on node compute-1.ctlplane.example.com
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.760 187156 DEBUG nova.compute.provider_tree [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.780 187156 DEBUG nova.scheduler.client.report [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.812 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.813 187156 DEBUG nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.929 187156 DEBUG nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.930 187156 DEBUG nova.network.neutron [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 01:59:33 np0005539504 nova_compute[187152]: 2025-11-29 06:59:33.965 187156 INFO nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.008 187156 DEBUG nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.202 187156 DEBUG nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.203 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.204 187156 INFO nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Creating image(s)
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.204 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquiring lock "/var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.205 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "/var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.205 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "/var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.219 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.273 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.274 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.275 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.286 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.354 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.355 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.389 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.394 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.395 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.395 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.458 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.459 187156 DEBUG nova.virt.disk.api [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Checking if we can resize image /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.459 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.530 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.531 187156 DEBUG nova.virt.disk.api [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Cannot resize image /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.532 187156 DEBUG nova.objects.instance [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lazy-loading 'migration_context' on Instance uuid b1d4680c-ccb8-4982-9d9a-a670404969d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.686 187156 DEBUG nova.network.neutron [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.686 187156 DEBUG nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.871 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.871 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Ensure instance console log exists: /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.872 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.872 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.873 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.874 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.880 187156 WARNING nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.886 187156 DEBUG nova.virt.libvirt.host [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.887 187156 DEBUG nova.virt.libvirt.host [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.890 187156 DEBUG nova.virt.libvirt.host [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.891 187156 DEBUG nova.virt.libvirt.host [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.892 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.893 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.893 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.894 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.894 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.894 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.894 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.895 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.895 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.895 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.896 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.896 187156 DEBUG nova.virt.hardware [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 01:59:34 np0005539504 nova_compute[187152]: 2025-11-29 06:59:34.902 187156 DEBUG nova.objects.instance [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lazy-loading 'pci_devices' on Instance uuid b1d4680c-ccb8-4982-9d9a-a670404969d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:59:35 np0005539504 nova_compute[187152]: 2025-11-29 06:59:35.089 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <uuid>b1d4680c-ccb8-4982-9d9a-a670404969d4</uuid>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <name>instance-00000030</name>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerDiagnosticsNegativeTest-server-1346661535</nova:name>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 06:59:34</nova:creationTime>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:        <nova:user uuid="7b0cd0dbfb994b8580449eac1a2a118f">tempest-ServerDiagnosticsNegativeTest-1255849127-project-member</nova:user>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:        <nova:project uuid="857193b1f539426fb02244afc230fa9c">tempest-ServerDiagnosticsNegativeTest-1255849127</nova:project>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <system>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <entry name="serial">b1d4680c-ccb8-4982-9d9a-a670404969d4</entry>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <entry name="uuid">b1d4680c-ccb8-4982-9d9a-a670404969d4</entry>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    </system>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <os>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  </os>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <features>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  </features>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  </clock>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  <devices>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk.config"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    </disk>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/console.log" append="off"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    </serial>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <video>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    </video>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    </rng>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 01:59:35 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 01:59:35 np0005539504 nova_compute[187152]:  </devices>
Nov 29 01:59:35 np0005539504 nova_compute[187152]: </domain>
Nov 29 01:59:35 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:59:36 np0005539504 nova_compute[187152]: 2025-11-29 06:59:36.591 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:59:36 np0005539504 nova_compute[187152]: 2025-11-29 06:59:36.592 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:59:36 np0005539504 nova_compute[187152]: 2025-11-29 06:59:36.592 187156 INFO nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Using config drive#033[00m
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.065 187156 INFO nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Creating config drive at /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk.config#033[00m
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.070 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmg2owjz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.199 187156 DEBUG oslo_concurrency.processutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmg2owjz" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:59:37 np0005539504 systemd-machined[153423]: New machine qemu-24-instance-00000030.
Nov 29 01:59:37 np0005539504 systemd[1]: Started Virtual Machine qemu-24-instance-00000030.
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.733 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:37 np0005539504 podman[221066]: 2025-11-29 06:59:37.851639974 +0000 UTC m=+0.066643965 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.878 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399577.878117, b1d4680c-ccb8-4982-9d9a-a670404969d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.879 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.884 187156 DEBUG nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.886 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.895 187156 INFO nova.virt.libvirt.driver [-] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Instance spawned successfully.#033[00m
Nov 29 01:59:37 np0005539504 nova_compute[187152]: 2025-11-29 06:59:37.896 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:59:37 np0005539504 podman[221068]: 2025-11-29 06:59:37.899182021 +0000 UTC m=+0.107443890 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:59:39 np0005539504 nova_compute[187152]: 2025-11-29 06:59:39.392 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:41 np0005539504 nova_compute[187152]: 2025-11-29 06:59:41.185 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:59:41 np0005539504 nova_compute[187152]: 2025-11-29 06:59:41.190 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.619 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.619 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.620 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.620 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.621 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.621 187156 DEBUG nova.virt.libvirt.driver [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.736 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.745 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.745 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399577.882611, b1d4680c-ccb8-4982-9d9a-a670404969d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.746 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] VM Started (Lifecycle Event)#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.772 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.777 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.791 187156 INFO nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Took 8.59 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.792 187156 DEBUG nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.822 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.903 187156 INFO nova.compute.manager [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Took 9.48 seconds to build instance.#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:42 np0005539504 nova_compute[187152]: 2025-11-29 06:59:42.939 187156 DEBUG oslo_concurrency.lockutils [None req-202409dd-d368-4a41-b139-177f01cb670b 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "b1d4680c-ccb8-4982-9d9a-a670404969d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:43 np0005539504 nova_compute[187152]: 2025-11-29 06:59:43.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:43 np0005539504 nova_compute[187152]: 2025-11-29 06:59:43.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:43 np0005539504 nova_compute[187152]: 2025-11-29 06:59:43.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:44 np0005539504 nova_compute[187152]: 2025-11-29 06:59:44.394 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:44 np0005539504 podman[221121]: 2025-11-29 06:59:44.75297079 +0000 UTC m=+0.094617702 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.024 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquiring lock "b1d4680c-ccb8-4982-9d9a-a670404969d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.025 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "b1d4680c-ccb8-4982-9d9a-a670404969d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.026 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquiring lock "b1d4680c-ccb8-4982-9d9a-a670404969d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.026 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "b1d4680c-ccb8-4982-9d9a-a670404969d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.027 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "b1d4680c-ccb8-4982-9d9a-a670404969d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.253 187156 INFO nova.compute.manager [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Terminating instance#033[00m
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.270 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquiring lock "refresh_cache-b1d4680c-ccb8-4982-9d9a-a670404969d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.272 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquired lock "refresh_cache-b1d4680c-ccb8-4982-9d9a-a670404969d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.272 187156 DEBUG nova.network.neutron [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:59:45 np0005539504 nova_compute[187152]: 2025-11-29 06:59:45.696 187156 DEBUG nova.network.neutron [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:59:46 np0005539504 nova_compute[187152]: 2025-11-29 06:59:46.554 187156 DEBUG nova.network.neutron [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:59:46 np0005539504 nova_compute[187152]: 2025-11-29 06:59:46.595 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Releasing lock "refresh_cache-b1d4680c-ccb8-4982-9d9a-a670404969d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:59:46 np0005539504 nova_compute[187152]: 2025-11-29 06:59:46.596 187156 DEBUG nova.compute.manager [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:59:46 np0005539504 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000030.scope: Deactivated successfully.
Nov 29 01:59:46 np0005539504 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000030.scope: Consumed 9.383s CPU time.
Nov 29 01:59:46 np0005539504 systemd-machined[153423]: Machine qemu-24-instance-00000030 terminated.
Nov 29 01:59:46 np0005539504 nova_compute[187152]: 2025-11-29 06:59:46.854 187156 INFO nova.virt.libvirt.driver [-] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Instance destroyed successfully.#033[00m
Nov 29 01:59:46 np0005539504 nova_compute[187152]: 2025-11-29 06:59:46.854 187156 DEBUG nova.objects.instance [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lazy-loading 'resources' on Instance uuid b1d4680c-ccb8-4982-9d9a-a670404969d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:59:46 np0005539504 nova_compute[187152]: 2025-11-29 06:59:46.960 187156 INFO nova.virt.libvirt.driver [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Deleting instance files /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4_del#033[00m
Nov 29 01:59:46 np0005539504 nova_compute[187152]: 2025-11-29 06:59:46.961 187156 INFO nova.virt.libvirt.driver [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Deletion of /var/lib/nova/instances/b1d4680c-ccb8-4982-9d9a-a670404969d4_del complete#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.121 187156 INFO nova.compute.manager [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Took 0.53 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.122 187156 DEBUG oslo.service.loopingcall [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.122 187156 DEBUG nova.compute.manager [-] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.123 187156 DEBUG nova.network.neutron [-] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.535 187156 DEBUG nova.network.neutron [-] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.548 187156 DEBUG nova.network.neutron [-] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.585 187156 INFO nova.compute.manager [-] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Took 0.46 seconds to deallocate network for instance.#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.695 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.695 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.739 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.778 187156 DEBUG nova.compute.provider_tree [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.809 187156 DEBUG nova.scheduler.client.report [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.838 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.871 187156 INFO nova.scheduler.client.report [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Deleted allocations for instance b1d4680c-ccb8-4982-9d9a-a670404969d4#033[00m
Nov 29 01:59:47 np0005539504 nova_compute[187152]: 2025-11-29 06:59:47.984 187156 DEBUG oslo_concurrency.lockutils [None req-dfea1040-2515-4c5c-9b39-faa482979f54 7b0cd0dbfb994b8580449eac1a2a118f 857193b1f539426fb02244afc230fa9c - - default default] Lock "b1d4680c-ccb8-4982-9d9a-a670404969d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:48 np0005539504 nova_compute[187152]: 2025-11-29 06:59:48.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:48 np0005539504 nova_compute[187152]: 2025-11-29 06:59:48.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:59:48 np0005539504 nova_compute[187152]: 2025-11-29 06:59:48.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:59:48 np0005539504 nova_compute[187152]: 2025-11-29 06:59:48.955 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:59:48 np0005539504 nova_compute[187152]: 2025-11-29 06:59:48.956 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:48 np0005539504 nova_compute[187152]: 2025-11-29 06:59:48.988 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:48 np0005539504 nova_compute[187152]: 2025-11-29 06:59:48.988 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:48 np0005539504 nova_compute[187152]: 2025-11-29 06:59:48.988 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:48 np0005539504 nova_compute[187152]: 2025-11-29 06:59:48.989 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:59:49 np0005539504 podman[221155]: 2025-11-29 06:59:49.110255686 +0000 UTC m=+0.069272906 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.191 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.192 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5733MB free_disk=73.20109558105469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.193 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.193 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.277 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.277 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.313 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.344 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.380 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.381 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:49 np0005539504 nova_compute[187152]: 2025-11-29 06:59:49.397 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:50 np0005539504 nova_compute[187152]: 2025-11-29 06:59:50.363 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:50 np0005539504 nova_compute[187152]: 2025-11-29 06:59:50.364 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:59:50 np0005539504 nova_compute[187152]: 2025-11-29 06:59:50.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.028 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "19ab76ac-c167-48b4-a7e6-c6777e78515d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.029 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.109 187156 DEBUG nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.306 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.306 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.312 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.313 187156 INFO nova.compute.claims [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.654 187156 DEBUG nova.compute.provider_tree [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.912 187156 DEBUG nova.scheduler.client.report [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:59:51 np0005539504 nova_compute[187152]: 2025-11-29 06:59:51.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.250 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.944s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.252 187156 DEBUG nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.474 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "0c19fe31-f0ac-478f-948e-ded3a8631c00" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.474 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.484 187156 DEBUG nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.485 187156 DEBUG nova.network.neutron [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.539 187156 INFO nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.542 187156 DEBUG nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.743 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.806 187156 DEBUG nova.policy [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:59:52 np0005539504 nova_compute[187152]: 2025-11-29 06:59:52.830 187156 DEBUG nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.093 187156 DEBUG nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.094 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.095 187156 INFO nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Creating image(s)#033[00m
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.095 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "/var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.096 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.097 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.113 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.173 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.174 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.174 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.184 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.202 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.203 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.213 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.214 187156 INFO nova.compute.claims [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Claim successful on node compute-1.ctlplane.example.com
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.239 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.240 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.319 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk 1073741824" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.320 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.321 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.377 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.378 187156 DEBUG nova.virt.disk.api [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Checking if we can resize image /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.378 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.399 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.443 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.444 187156 DEBUG nova.virt.disk.api [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Cannot resize image /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.445 187156 DEBUG nova.objects.instance [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'migration_context' on Instance uuid 19ab76ac-c167-48b4-a7e6-c6777e78515d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.569 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.569 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Ensure instance console log exists: /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.570 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.570 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.570 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:59:54.769 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.769 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:59:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:59:54.770 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.856 187156 DEBUG nova.compute.provider_tree [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.890 187156 DEBUG nova.scheduler.client.report [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.964 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:54 np0005539504 nova_compute[187152]: 2025-11-29 06:59:54.965 187156 DEBUG nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.159 187156 DEBUG nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.160 187156 DEBUG nova.network.neutron [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.204 187156 INFO nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.233 187156 DEBUG nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.530 187156 DEBUG nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.532 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.532 187156 INFO nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Creating image(s)
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.533 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "/var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.533 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "/var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.534 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "/var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.552 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.620 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.621 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.621 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.632 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.683 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.684 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.716 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.717 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.717 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.770 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.771 187156 DEBUG nova.virt.disk.api [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Checking if we can resize image /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.771 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:59:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 06:59:55.772 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.821 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.822 187156 DEBUG nova.virt.disk.api [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Cannot resize image /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.822 187156 DEBUG nova.objects.instance [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lazy-loading 'migration_context' on Instance uuid 0c19fe31-f0ac-478f-948e-ded3a8631c00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.851 187156 DEBUG nova.network.neutron [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Successfully created port: f539ab0e-0738-4ff6-9f12-28f7776d7cfc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.856 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.856 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Ensure instance console log exists: /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.857 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.857 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:59:55 np0005539504 nova_compute[187152]: 2025-11-29 06:59:55.857 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:59:56 np0005539504 nova_compute[187152]: 2025-11-29 06:59:56.686 187156 DEBUG nova.policy [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9cb56a6cc9ad4326a65ec3b3fe352836', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32c21e2c4b044d569a10a87f8282bd09', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 01:59:57 np0005539504 nova_compute[187152]: 2025-11-29 06:59:57.746 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:59:58 np0005539504 nova_compute[187152]: 2025-11-29 06:59:58.114 187156 DEBUG nova.network.neutron [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Successfully updated port: f539ab0e-0738-4ff6-9f12-28f7776d7cfc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 01:59:58 np0005539504 nova_compute[187152]: 2025-11-29 06:59:58.240 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-19ab76ac-c167-48b4-a7e6-c6777e78515d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:59:58 np0005539504 nova_compute[187152]: 2025-11-29 06:59:58.241 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-19ab76ac-c167-48b4-a7e6-c6777e78515d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:59:58 np0005539504 nova_compute[187152]: 2025-11-29 06:59:58.241 187156 DEBUG nova.network.neutron [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 01:59:58 np0005539504 nova_compute[187152]: 2025-11-29 06:59:58.349 187156 DEBUG nova.network.neutron [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Successfully created port: 693b9d3b-aebc-4e09-94e6-0ad650b19511 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 01:59:58 np0005539504 nova_compute[187152]: 2025-11-29 06:59:58.449 187156 DEBUG nova.compute.manager [req-7aba3266-8575-45c2-90e8-910c26783a3a req-765959b6-3135-4143-a253-e7a980441875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Received event network-changed-f539ab0e-0738-4ff6-9f12-28f7776d7cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:59:58 np0005539504 nova_compute[187152]: 2025-11-29 06:59:58.449 187156 DEBUG nova.compute.manager [req-7aba3266-8575-45c2-90e8-910c26783a3a req-765959b6-3135-4143-a253-e7a980441875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Refreshing instance network info cache due to event network-changed-f539ab0e-0738-4ff6-9f12-28f7776d7cfc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:59:58 np0005539504 nova_compute[187152]: 2025-11-29 06:59:58.450 187156 DEBUG oslo_concurrency.lockutils [req-7aba3266-8575-45c2-90e8-910c26783a3a req-765959b6-3135-4143-a253-e7a980441875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-19ab76ac-c167-48b4-a7e6-c6777e78515d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:59:58 np0005539504 nova_compute[187152]: 2025-11-29 06:59:58.601 187156 DEBUG nova.network.neutron [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:59:58 np0005539504 podman[221207]: 2025-11-29 06:59:58.718295717 +0000 UTC m=+0.059389218 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:59:59 np0005539504 nova_compute[187152]: 2025-11-29 06:59:59.401 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:00 np0005539504 nova_compute[187152]: 2025-11-29 07:00:00.358 187156 DEBUG nova.network.neutron [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Successfully updated port: 693b9d3b-aebc-4e09-94e6-0ad650b19511 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:00:00 np0005539504 nova_compute[187152]: 2025-11-29 07:00:00.427 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "refresh_cache-0c19fe31-f0ac-478f-948e-ded3a8631c00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:00 np0005539504 nova_compute[187152]: 2025-11-29 07:00:00.428 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquired lock "refresh_cache-0c19fe31-f0ac-478f-948e-ded3a8631c00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:00 np0005539504 nova_compute[187152]: 2025-11-29 07:00:00.428 187156 DEBUG nova.network.neutron [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:00:00 np0005539504 nova_compute[187152]: 2025-11-29 07:00:00.530 187156 DEBUG nova.compute.manager [req-fb2c2a78-458f-412b-bcd3-48a29dcd3be5 req-90b0183e-7c6a-4f1d-9177-d70b8322612b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Received event network-changed-693b9d3b-aebc-4e09-94e6-0ad650b19511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:00 np0005539504 nova_compute[187152]: 2025-11-29 07:00:00.531 187156 DEBUG nova.compute.manager [req-fb2c2a78-458f-412b-bcd3-48a29dcd3be5 req-90b0183e-7c6a-4f1d-9177-d70b8322612b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Refreshing instance network info cache due to event network-changed-693b9d3b-aebc-4e09-94e6-0ad650b19511. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:00:00 np0005539504 nova_compute[187152]: 2025-11-29 07:00:00.532 187156 DEBUG oslo_concurrency.lockutils [req-fb2c2a78-458f-412b-bcd3-48a29dcd3be5 req-90b0183e-7c6a-4f1d-9177-d70b8322612b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0c19fe31-f0ac-478f-948e-ded3a8631c00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.232 187156 DEBUG nova.network.neutron [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.386 187156 DEBUG nova.network.neutron [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Updating instance_info_cache with network_info: [{"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.543 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-19ab76ac-c167-48b4-a7e6-c6777e78515d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.543 187156 DEBUG nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Instance network_info: |[{"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.544 187156 DEBUG oslo_concurrency.lockutils [req-7aba3266-8575-45c2-90e8-910c26783a3a req-765959b6-3135-4143-a253-e7a980441875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-19ab76ac-c167-48b4-a7e6-c6777e78515d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.544 187156 DEBUG nova.network.neutron [req-7aba3266-8575-45c2-90e8-910c26783a3a req-765959b6-3135-4143-a253-e7a980441875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Refreshing network info cache for port f539ab0e-0738-4ff6-9f12-28f7776d7cfc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.546 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Start _get_guest_xml network_info=[{"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.551 187156 WARNING nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.572 187156 DEBUG nova.virt.libvirt.host [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.573 187156 DEBUG nova.virt.libvirt.host [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.582 187156 DEBUG nova.virt.libvirt.host [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.583 187156 DEBUG nova.virt.libvirt.host [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.584 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.585 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.585 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.585 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.586 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.586 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.586 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.587 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.587 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.587 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.587 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.588 187156 DEBUG nova.virt.hardware [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.592 187156 DEBUG nova.virt.libvirt.vif [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:59:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1267050979',display_name='tempest-DeleteServersTestJSON-server-1267050979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1267050979',id=49,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-xzzs9tuq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJS
ON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:59:52Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=19ab76ac-c167-48b4-a7e6-c6777e78515d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.593 187156 DEBUG nova.network.os_vif_util [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.594 187156 DEBUG nova.network.os_vif_util [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:5a:45,bridge_name='br-int',has_traffic_filtering=True,id=f539ab0e-0738-4ff6-9f12-28f7776d7cfc,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf539ab0e-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.595 187156 DEBUG nova.objects.instance [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'pci_devices' on Instance uuid 19ab76ac-c167-48b4-a7e6-c6777e78515d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.627 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <uuid>19ab76ac-c167-48b4-a7e6-c6777e78515d</uuid>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <name>instance-00000031</name>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <nova:name>tempest-DeleteServersTestJSON-server-1267050979</nova:name>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:00:01</nova:creationTime>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:        <nova:user uuid="4ecd161098b5422084003b39f0504a8f">tempest-DeleteServersTestJSON-1973671383-project-member</nova:user>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:        <nova:project uuid="98df116965b74e4a9985049062e65162">tempest-DeleteServersTestJSON-1973671383</nova:project>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:        <nova:port uuid="f539ab0e-0738-4ff6-9f12-28f7776d7cfc">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <entry name="serial">19ab76ac-c167-48b4-a7e6-c6777e78515d</entry>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <entry name="uuid">19ab76ac-c167-48b4-a7e6-c6777e78515d</entry>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk.config"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:eb:5a:45"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <target dev="tapf539ab0e-07"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/console.log" append="off"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:00:01 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:00:01 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:00:01 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:00:01 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.629 187156 DEBUG nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Preparing to wait for external event network-vif-plugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.629 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.629 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.630 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.630 187156 DEBUG nova.virt.libvirt.vif [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:59:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1267050979',display_name='tempest-DeleteServersTestJSON-server-1267050979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1267050979',id=49,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-xzzs9tuq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:59:52Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=19ab76ac-c167-48b4-a7e6-c6777e78515d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.631 187156 DEBUG nova.network.os_vif_util [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.631 187156 DEBUG nova.network.os_vif_util [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:5a:45,bridge_name='br-int',has_traffic_filtering=True,id=f539ab0e-0738-4ff6-9f12-28f7776d7cfc,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf539ab0e-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.632 187156 DEBUG os_vif [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:5a:45,bridge_name='br-int',has_traffic_filtering=True,id=f539ab0e-0738-4ff6-9f12-28f7776d7cfc,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf539ab0e-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.632 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.633 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.633 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.638 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.638 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf539ab0e-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.638 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf539ab0e-07, col_values=(('external_ids', {'iface-id': 'f539ab0e-0738-4ff6-9f12-28f7776d7cfc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:5a:45', 'vm-uuid': '19ab76ac-c167-48b4-a7e6-c6777e78515d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.640 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:01 np0005539504 NetworkManager[55210]: <info>  [1764399601.6413] manager: (tapf539ab0e-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.643 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.648 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.649 187156 INFO os_vif [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:5a:45,bridge_name='br-int',has_traffic_filtering=True,id=f539ab0e-0738-4ff6-9f12-28f7776d7cfc,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf539ab0e-07')#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.778 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.779 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.779 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No VIF found with MAC fa:16:3e:eb:5a:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.780 187156 INFO nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Using config drive#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.852 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399586.850498, b1d4680c-ccb8-4982-9d9a-a670404969d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.852 187156 INFO nova.compute.manager [-] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:00:01 np0005539504 nova_compute[187152]: 2025-11-29 07:00:01.882 187156 DEBUG nova.compute.manager [None req-ea92adcc-b2da-41b3-a30a-d0930060b423 - - - - - -] [instance: b1d4680c-ccb8-4982-9d9a-a670404969d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:02 np0005539504 podman[221230]: 2025-11-29 07:00:02.720480279 +0000 UTC m=+0.054728682 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:00:02 np0005539504 podman[221231]: 2025-11-29 07:00:02.725915136 +0000 UTC m=+0.058658148 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:00:03 np0005539504 nova_compute[187152]: 2025-11-29 07:00:03.617 187156 INFO nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Creating config drive at /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk.config#033[00m
Nov 29 02:00:03 np0005539504 nova_compute[187152]: 2025-11-29 07:00:03.623 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2x_lxp9m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:03 np0005539504 nova_compute[187152]: 2025-11-29 07:00:03.750 187156 DEBUG oslo_concurrency.processutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2x_lxp9m" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:03 np0005539504 kernel: tapf539ab0e-07: entered promiscuous mode
Nov 29 02:00:03 np0005539504 NetworkManager[55210]: <info>  [1764399603.8200] manager: (tapf539ab0e-07): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Nov 29 02:00:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:03Z|00134|binding|INFO|Claiming lport f539ab0e-0738-4ff6-9f12-28f7776d7cfc for this chassis.
Nov 29 02:00:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:03Z|00135|binding|INFO|f539ab0e-0738-4ff6-9f12-28f7776d7cfc: Claiming fa:16:3e:eb:5a:45 10.100.0.9
Nov 29 02:00:03 np0005539504 nova_compute[187152]: 2025-11-29 07:00:03.820 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:03 np0005539504 nova_compute[187152]: 2025-11-29 07:00:03.822 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:03 np0005539504 nova_compute[187152]: 2025-11-29 07:00:03.827 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:03 np0005539504 systemd-udevd[221292]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:00:03 np0005539504 systemd-machined[153423]: New machine qemu-25-instance-00000031.
Nov 29 02:00:03 np0005539504 NetworkManager[55210]: <info>  [1764399603.8659] device (tapf539ab0e-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:00:03 np0005539504 NetworkManager[55210]: <info>  [1764399603.8670] device (tapf539ab0e-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:00:03 np0005539504 nova_compute[187152]: 2025-11-29 07:00:03.879 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:03Z|00136|binding|INFO|Setting lport f539ab0e-0738-4ff6-9f12-28f7776d7cfc ovn-installed in OVS
Nov 29 02:00:03 np0005539504 nova_compute[187152]: 2025-11-29 07:00:03.884 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:03 np0005539504 systemd[1]: Started Virtual Machine qemu-25-instance-00000031.
Nov 29 02:00:04 np0005539504 nova_compute[187152]: 2025-11-29 07:00:04.402 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:04 np0005539504 nova_compute[187152]: 2025-11-29 07:00:04.459 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399604.459487, 19ab76ac-c167-48b4-a7e6-c6777e78515d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:04 np0005539504 nova_compute[187152]: 2025-11-29 07:00:04.460 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] VM Started (Lifecycle Event)#033[00m
Nov 29 02:00:04 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:04Z|00137|binding|INFO|Setting lport f539ab0e-0738-4ff6-9f12-28f7776d7cfc up in Southbound
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.501 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:5a:45 10.100.0.9'], port_security=['fa:16:3e:eb:5a:45 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '19ab76ac-c167-48b4-a7e6-c6777e78515d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '2', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=f539ab0e-0738-4ff6-9f12-28f7776d7cfc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.504 104164 INFO neutron.agent.ovn.metadata.agent [-] Port f539ab0e-0738-4ff6-9f12-28f7776d7cfc in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd bound to our chassis#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.505 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.518 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e78adae1-41f5-4dd9-b72e-491984fe4b2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.520 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd9eb57e-b1 in ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.522 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd9eb57e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.523 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dd92d61d-13bf-4f68-8c19-dea0e04982b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.524 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[389fdbba-7cab-4b60-98fb-d5cf1934b6a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.537 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[0b9f5d79-6a9f-4a51-ac56-7c170e42ffd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.552 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[afbc10c5-35d3-4c92-808c-901c75835e3b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.582 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[9da57e6a-d5b6-4e55-8236-9acd9cbeb17c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.587 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fa19938b-b251-4b6f-ab7d-e5f4f6800c41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 systemd-udevd[221295]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:00:04 np0005539504 NetworkManager[55210]: <info>  [1764399604.5890] manager: (tapfd9eb57e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.615 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[34d7a0a0-c34f-4da4-a915-49502f337ad9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.619 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e6706d19-a1ac-4997-b4b4-a51a979ae98b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 NetworkManager[55210]: <info>  [1764399604.6488] device (tapfd9eb57e-b0): carrier: link connected
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.653 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[16e23980-5a8b-4f0d-843b-8a42acc723b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.677 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4695240d-81ef-4f32-b577-38226dda33bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506752, 'reachable_time': 23146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221333, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.701 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6e536837-7b3a-4d51-b8e5-d920d78cba1e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:80ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506752, 'tstamp': 506752}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221334, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.726 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[84940e54-e30b-455a-868b-b6e9dc2ab2d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506752, 'reachable_time': 23146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221335, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.772 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f9fa103d-5499-4fc6-8606-d694d0c863b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.842 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2b18e8b3-663c-4f30-b4fa-391f1b05b7a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.843 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.844 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.844 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd9eb57e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:04 np0005539504 NetworkManager[55210]: <info>  [1764399604.8472] manager: (tapfd9eb57e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Nov 29 02:00:04 np0005539504 kernel: tapfd9eb57e-b0: entered promiscuous mode
Nov 29 02:00:04 np0005539504 nova_compute[187152]: 2025-11-29 07:00:04.846 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:04 np0005539504 nova_compute[187152]: 2025-11-29 07:00:04.849 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.850 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd9eb57e-b0, col_values=(('external_ids', {'iface-id': 'e7b4cb4f-cb6d-4f0e-8c8d-34c743671595'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:04 np0005539504 nova_compute[187152]: 2025-11-29 07:00:04.851 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:04 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:04Z|00138|binding|INFO|Releasing lport e7b4cb4f-cb6d-4f0e-8c8d-34c743671595 from this chassis (sb_readonly=1)
Nov 29 02:00:04 np0005539504 nova_compute[187152]: 2025-11-29 07:00:04.862 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.863 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.864 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8630e3-9c24-4e0e-bc46-f53e812f7ed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.865 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:00:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:04.866 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'env', 'PROCESS_TAG=haproxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:00:05 np0005539504 nova_compute[187152]: 2025-11-29 07:00:05.067 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:05 np0005539504 nova_compute[187152]: 2025-11-29 07:00:05.075 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399604.4596603, 19ab76ac-c167-48b4-a7e6-c6777e78515d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:05 np0005539504 nova_compute[187152]: 2025-11-29 07:00:05.075 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:00:05 np0005539504 podman[221368]: 2025-11-29 07:00:05.274731275 +0000 UTC m=+0.032376158 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:00:05 np0005539504 nova_compute[187152]: 2025-11-29 07:00:05.687 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:05 np0005539504 nova_compute[187152]: 2025-11-29 07:00:05.691 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:00:05 np0005539504 podman[221368]: 2025-11-29 07:00:05.694719654 +0000 UTC m=+0.452364437 container create ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:00:06 np0005539504 nova_compute[187152]: 2025-11-29 07:00:06.042 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:00:06 np0005539504 systemd[1]: Started libpod-conmon-ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50.scope.
Nov 29 02:00:06 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:00:06 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f98a74f072bb4de2d00f5e86cc9b3a520262aa4d94cfb94b10ab5bb03333c382/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:00:06 np0005539504 podman[221368]: 2025-11-29 07:00:06.175965312 +0000 UTC m=+0.933610125 container init ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:00:06 np0005539504 podman[221368]: 2025-11-29 07:00:06.182001306 +0000 UTC m=+0.939646099 container start ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:00:06 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221384]: [NOTICE]   (221388) : New worker (221390) forked
Nov 29 02:00:06 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221384]: [NOTICE]   (221388) : Loading success.
Nov 29 02:00:06 np0005539504 nova_compute[187152]: 2025-11-29 07:00:06.641 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.069 187156 DEBUG nova.compute.manager [req-9491d196-12af-4395-9fc6-7a223ce36d9f req-fc93ad58-9171-4b0a-9fb2-c7730479c95a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Received event network-vif-plugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.070 187156 DEBUG oslo_concurrency.lockutils [req-9491d196-12af-4395-9fc6-7a223ce36d9f req-fc93ad58-9171-4b0a-9fb2-c7730479c95a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.070 187156 DEBUG oslo_concurrency.lockutils [req-9491d196-12af-4395-9fc6-7a223ce36d9f req-fc93ad58-9171-4b0a-9fb2-c7730479c95a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.070 187156 DEBUG oslo_concurrency.lockutils [req-9491d196-12af-4395-9fc6-7a223ce36d9f req-fc93ad58-9171-4b0a-9fb2-c7730479c95a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.071 187156 DEBUG nova.compute.manager [req-9491d196-12af-4395-9fc6-7a223ce36d9f req-fc93ad58-9171-4b0a-9fb2-c7730479c95a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Processing event network-vif-plugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.072 187156 DEBUG nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.075 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399607.075744, 19ab76ac-c167-48b4-a7e6-c6777e78515d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.076 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] VM Resumed (Lifecycle Event)
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.078 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.082 187156 INFO nova.virt.libvirt.driver [-] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Instance spawned successfully.
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.082 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.418 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.418 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.419 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.419 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.419 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.420 187156 DEBUG nova.virt.libvirt.driver [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.474 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.477 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.629 187156 DEBUG nova.network.neutron [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Updating instance_info_cache with network_info: [{"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.813 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.860 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Releasing lock "refresh_cache-0c19fe31-f0ac-478f-948e-ded3a8631c00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.861 187156 DEBUG nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Instance network_info: |[{"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.862 187156 DEBUG oslo_concurrency.lockutils [req-fb2c2a78-458f-412b-bcd3-48a29dcd3be5 req-90b0183e-7c6a-4f1d-9177-d70b8322612b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0c19fe31-f0ac-478f-948e-ded3a8631c00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.863 187156 DEBUG nova.network.neutron [req-fb2c2a78-458f-412b-bcd3-48a29dcd3be5 req-90b0183e-7c6a-4f1d-9177-d70b8322612b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Refreshing network info cache for port 693b9d3b-aebc-4e09-94e6-0ad650b19511 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.868 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Start _get_guest_xml network_info=[{"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.873 187156 WARNING nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.879 187156 DEBUG nova.virt.libvirt.host [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.880 187156 DEBUG nova.virt.libvirt.host [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.884 187156 DEBUG nova.virt.libvirt.host [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.885 187156 DEBUG nova.virt.libvirt.host [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.886 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.887 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.887 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.887 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.888 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.888 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.889 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.889 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.890 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.890 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.891 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.891 187156 DEBUG nova.virt.hardware [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.897 187156 DEBUG nova.virt.libvirt.vif [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:59:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1438657295',display_name='tempest-ServerAddressesNegativeTestJSON-server-1438657295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1438657295',id=50,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32c21e2c4b044d569a10a87f8282bd09',ramdisk_id='',reservation_id='r-v76kvxqs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-1595840823',owne
r_user_name='tempest-ServerAddressesNegativeTestJSON-1595840823-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:59:55Z,user_data=None,user_id='9cb56a6cc9ad4326a65ec3b3fe352836',uuid=0c19fe31-f0ac-478f-948e-ded3a8631c00,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.898 187156 DEBUG nova.network.os_vif_util [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Converting VIF {"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.899 187156 DEBUG nova.network.os_vif_util [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:58:cc,bridge_name='br-int',has_traffic_filtering=True,id=693b9d3b-aebc-4e09-94e6-0ad650b19511,network=Network(f679e8fd-0521-46e9-b8e3-afe8d09e2d88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap693b9d3b-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.901 187156 DEBUG nova.objects.instance [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0c19fe31-f0ac-478f-948e-ded3a8631c00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.928 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <uuid>0c19fe31-f0ac-478f-948e-ded3a8631c00</uuid>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <name>instance-00000032</name>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerAddressesNegativeTestJSON-server-1438657295</nova:name>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:00:07</nova:creationTime>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:        <nova:user uuid="9cb56a6cc9ad4326a65ec3b3fe352836">tempest-ServerAddressesNegativeTestJSON-1595840823-project-member</nova:user>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:        <nova:project uuid="32c21e2c4b044d569a10a87f8282bd09">tempest-ServerAddressesNegativeTestJSON-1595840823</nova:project>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:        <nova:port uuid="693b9d3b-aebc-4e09-94e6-0ad650b19511">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <entry name="serial">0c19fe31-f0ac-478f-948e-ded3a8631c00</entry>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <entry name="uuid">0c19fe31-f0ac-478f-948e-ded3a8631c00</entry>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk.config"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:cb:58:cc"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <target dev="tap693b9d3b-ae"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/console.log" append="off"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:00:07 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:00:07 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:00:07 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:00:07 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.929 187156 DEBUG nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Preparing to wait for external event network-vif-plugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.930 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.930 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.930 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.931 187156 DEBUG nova.virt.libvirt.vif [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:59:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1438657295',display_name='tempest-ServerAddressesNegativeTestJSON-server-1438657295',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1438657295',id=50,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32c21e2c4b044d569a10a87f8282bd09',ramdisk_id='',reservation_id='r-v76kvxqs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-1595840823',owner_user_name='tempest-ServerAddressesNegativeTestJSON-1595840823-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:59:55Z,user_data=None,user_id='9cb56a6cc9ad4326a65ec3b3fe352836',uuid=0c19fe31-f0ac-478f-948e-ded3a8631c00,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.931 187156 DEBUG nova.network.os_vif_util [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Converting VIF {"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.931 187156 DEBUG nova.network.os_vif_util [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:58:cc,bridge_name='br-int',has_traffic_filtering=True,id=693b9d3b-aebc-4e09-94e6-0ad650b19511,network=Network(f679e8fd-0521-46e9-b8e3-afe8d09e2d88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap693b9d3b-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.932 187156 DEBUG os_vif [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:58:cc,bridge_name='br-int',has_traffic_filtering=True,id=693b9d3b-aebc-4e09-94e6-0ad650b19511,network=Network(f679e8fd-0521-46e9-b8e3-afe8d09e2d88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap693b9d3b-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.932 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.932 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.933 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.935 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.935 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap693b9d3b-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.936 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap693b9d3b-ae, col_values=(('external_ids', {'iface-id': '693b9d3b-aebc-4e09-94e6-0ad650b19511', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:58:cc', 'vm-uuid': '0c19fe31-f0ac-478f-948e-ded3a8631c00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.937 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:07 np0005539504 NetworkManager[55210]: <info>  [1764399607.9386] manager: (tap693b9d3b-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.940 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.945 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.945 187156 INFO os_vif [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:58:cc,bridge_name='br-int',has_traffic_filtering=True,id=693b9d3b-aebc-4e09-94e6-0ad650b19511,network=Network(f679e8fd-0521-46e9-b8e3-afe8d09e2d88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap693b9d3b-ae')#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.966 187156 INFO nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Took 13.87 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:00:07 np0005539504 nova_compute[187152]: 2025-11-29 07:00:07.967 187156 DEBUG nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:08 np0005539504 podman[221402]: 2025-11-29 07:00:08.064729753 +0000 UTC m=+0.082404672 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:00:08 np0005539504 nova_compute[187152]: 2025-11-29 07:00:08.071 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:08 np0005539504 nova_compute[187152]: 2025-11-29 07:00:08.072 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:08 np0005539504 nova_compute[187152]: 2025-11-29 07:00:08.072 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] No VIF found with MAC fa:16:3e:cb:58:cc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:00:08 np0005539504 nova_compute[187152]: 2025-11-29 07:00:08.073 187156 INFO nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Using config drive#033[00m
Nov 29 02:00:08 np0005539504 podman[221403]: 2025-11-29 07:00:08.091337933 +0000 UTC m=+0.105934999 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 02:00:08 np0005539504 nova_compute[187152]: 2025-11-29 07:00:08.133 187156 INFO nova.compute.manager [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Took 16.89 seconds to build instance.#033[00m
Nov 29 02:00:08 np0005539504 nova_compute[187152]: 2025-11-29 07:00:08.179 187156 DEBUG oslo_concurrency.lockutils [None req-565e258d-3e1b-4bcb-884f-a3040999679b 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:08 np0005539504 nova_compute[187152]: 2025-11-29 07:00:08.656 187156 DEBUG nova.network.neutron [req-7aba3266-8575-45c2-90e8-910c26783a3a req-765959b6-3135-4143-a253-e7a980441875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Updated VIF entry in instance network info cache for port f539ab0e-0738-4ff6-9f12-28f7776d7cfc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:00:08 np0005539504 nova_compute[187152]: 2025-11-29 07:00:08.657 187156 DEBUG nova.network.neutron [req-7aba3266-8575-45c2-90e8-910c26783a3a req-765959b6-3135-4143-a253-e7a980441875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Updating instance_info_cache with network_info: [{"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:08 np0005539504 nova_compute[187152]: 2025-11-29 07:00:08.698 187156 DEBUG oslo_concurrency.lockutils [req-7aba3266-8575-45c2-90e8-910c26783a3a req-765959b6-3135-4143-a253-e7a980441875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-19ab76ac-c167-48b4-a7e6-c6777e78515d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.114 187156 INFO nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Creating config drive at /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk.config#033[00m
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.120 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr8fuzh0p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.245 187156 DEBUG oslo_concurrency.processutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr8fuzh0p" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.259 187156 DEBUG nova.compute.manager [req-9f0056c6-b4b0-4721-b57d-6bc6a683b5b1 req-b5442a36-1f38-4ad1-9227-d870c0294ca5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Received event network-vif-plugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.259 187156 DEBUG oslo_concurrency.lockutils [req-9f0056c6-b4b0-4721-b57d-6bc6a683b5b1 req-b5442a36-1f38-4ad1-9227-d870c0294ca5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.260 187156 DEBUG oslo_concurrency.lockutils [req-9f0056c6-b4b0-4721-b57d-6bc6a683b5b1 req-b5442a36-1f38-4ad1-9227-d870c0294ca5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.261 187156 DEBUG oslo_concurrency.lockutils [req-9f0056c6-b4b0-4721-b57d-6bc6a683b5b1 req-b5442a36-1f38-4ad1-9227-d870c0294ca5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.261 187156 DEBUG nova.compute.manager [req-9f0056c6-b4b0-4721-b57d-6bc6a683b5b1 req-b5442a36-1f38-4ad1-9227-d870c0294ca5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] No waiting events found dispatching network-vif-plugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.261 187156 WARNING nova.compute.manager [req-9f0056c6-b4b0-4721-b57d-6bc6a683b5b1 req-b5442a36-1f38-4ad1-9227-d870c0294ca5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Received unexpected event network-vif-plugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc for instance with vm_state active and task_state None.#033[00m
Nov 29 02:00:09 np0005539504 kernel: tap693b9d3b-ae: entered promiscuous mode
Nov 29 02:00:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:09Z|00139|binding|INFO|Claiming lport 693b9d3b-aebc-4e09-94e6-0ad650b19511 for this chassis.
Nov 29 02:00:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:09Z|00140|binding|INFO|693b9d3b-aebc-4e09-94e6-0ad650b19511: Claiming fa:16:3e:cb:58:cc 10.100.0.13
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.318 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:09 np0005539504 NetworkManager[55210]: <info>  [1764399609.3212] manager: (tap693b9d3b-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.336 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:58:cc 10.100.0.13'], port_security=['fa:16:3e:cb:58:cc 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0c19fe31-f0ac-478f-948e-ded3a8631c00', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f679e8fd-0521-46e9-b8e3-afe8d09e2d88', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32c21e2c4b044d569a10a87f8282bd09', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7032c59c-6e9a-42ef-b162-33d9fe2507d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=66cf9662-1380-4c7f-a316-cff963614013, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=693b9d3b-aebc-4e09-94e6-0ad650b19511) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.338 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 693b9d3b-aebc-4e09-94e6-0ad650b19511 in datapath f679e8fd-0521-46e9-b8e3-afe8d09e2d88 bound to our chassis
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.340 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f679e8fd-0521-46e9-b8e3-afe8d09e2d88
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.353 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[acd62de9-3b8e-44a9-93c3-30fdfed5dbd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.354 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf679e8fd-01 in ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.356 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf679e8fd-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.357 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3287ca78-6db0-4511-8f4c-fe937a1e7ecf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.357 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[417851fc-427f-4319-a004-f59eed28364c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 systemd-udevd[221468]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:00:09 np0005539504 NetworkManager[55210]: <info>  [1764399609.3743] device (tap693b9d3b-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:00:09 np0005539504 NetworkManager[55210]: <info>  [1764399609.3754] device (tap693b9d3b-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:00:09 np0005539504 systemd-machined[153423]: New machine qemu-26-instance-00000032.
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.379 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:00:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:09Z|00141|binding|INFO|Setting lport 693b9d3b-aebc-4e09-94e6-0ad650b19511 ovn-installed in OVS
Nov 29 02:00:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:09Z|00142|binding|INFO|Setting lport 693b9d3b-aebc-4e09-94e6-0ad650b19511 up in Southbound
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.384 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.383 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[39aae329-5312-491b-92bc-8cbe757904a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 systemd[1]: Started Virtual Machine qemu-26-instance-00000032.
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.404 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.408 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9ce571e9-1756-4e74-81e0-1c86d9bcdb1b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.436 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2e9356be-db70-4c86-8066-6a1f1a73244d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 systemd-udevd[221472]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:00:09 np0005539504 NetworkManager[55210]: <info>  [1764399609.4455] manager: (tapf679e8fd-00): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.444 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc1fda0-251b-4177-a423-abdd44fc0adf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.487 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb7af59-d1ba-499c-b668-2333e48de9cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.492 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[05f3ddfa-8114-4e3b-97e0-a79ee6c081c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 NetworkManager[55210]: <info>  [1764399609.5126] device (tapf679e8fd-00): carrier: link connected
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.516 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0801033a-0e64-4d73-b42d-cedcb38864ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.529 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d7662354-95b5-4c55-be20-da75d90cae6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf679e8fd-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:32:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507239, 'reachable_time': 40569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221501, 'error': None, 'target': 'ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.547 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e386bd72-5fed-45d8-960a-48fd4b0c4712]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:32e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 507239, 'tstamp': 507239}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221502, 'error': None, 'target': 'ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.572 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[de89ed58-a8d2-4bc5-a26e-64c4b5b83483]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf679e8fd-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:32:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507239, 'reachable_time': 40569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221503, 'error': None, 'target': 'ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.610 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8c90efc8-c3ff-4eb2-babe-7d64cb29ec92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.678 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1b25e6-8d46-46ee-b966-fbc7eeb16abe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.680 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf679e8fd-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.680 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.680 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf679e8fd-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:00:09 np0005539504 kernel: tapf679e8fd-00: entered promiscuous mode
Nov 29 02:00:09 np0005539504 NetworkManager[55210]: <info>  [1764399609.6838] manager: (tapf679e8fd-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.683 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.687 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.688 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf679e8fd-00, col_values=(('external_ids', {'iface-id': 'fea955cb-3680-42d7-9bba-38273cae05b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.690 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:00:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:09Z|00143|binding|INFO|Releasing lport fea955cb-3680-42d7-9bba-38273cae05b9 from this chassis (sb_readonly=0)
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.691 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.696 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f679e8fd-0521-46e9-b8e3-afe8d09e2d88.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f679e8fd-0521-46e9-b8e3-afe8d09e2d88.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.698 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2264ae7f-0fdb-4444-81a8-c1ecca6af710]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.699 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-f679e8fd-0521-46e9-b8e3-afe8d09e2d88
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/f679e8fd-0521-46e9-b8e3-afe8d09e2d88.pid.haproxy
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID f679e8fd-0521-46e9-b8e3-afe8d09e2d88
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 02:00:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:09.700 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88', 'env', 'PROCESS_TAG=haproxy-f679e8fd-0521-46e9-b8e3-afe8d09e2d88', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f679e8fd-0521-46e9-b8e3-afe8d09e2d88.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 02:00:09 np0005539504 nova_compute[187152]: 2025-11-29 07:00:09.702 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.013 187156 INFO nova.compute.manager [None req-9c2504c9-5226-4691-8ea0-d5d6feb608b6 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Pausing
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.014 187156 DEBUG nova.objects.instance [None req-9c2504c9-5226-4691-8ea0-d5d6feb608b6 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'flavor' on Instance uuid 19ab76ac-c167-48b4-a7e6-c6777e78515d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.027 187156 DEBUG nova.compute.manager [req-e386e72e-2d70-43f3-95b6-51f6158ba6ee req-e84fefc8-e64e-49b7-b16c-6aa9a2a087e8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Received event network-vif-plugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.028 187156 DEBUG oslo_concurrency.lockutils [req-e386e72e-2d70-43f3-95b6-51f6158ba6ee req-e84fefc8-e64e-49b7-b16c-6aa9a2a087e8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.028 187156 DEBUG oslo_concurrency.lockutils [req-e386e72e-2d70-43f3-95b6-51f6158ba6ee req-e84fefc8-e64e-49b7-b16c-6aa9a2a087e8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.028 187156 DEBUG oslo_concurrency.lockutils [req-e386e72e-2d70-43f3-95b6-51f6158ba6ee req-e84fefc8-e64e-49b7-b16c-6aa9a2a087e8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.029 187156 DEBUG nova.compute.manager [req-e386e72e-2d70-43f3-95b6-51f6158ba6ee req-e84fefc8-e64e-49b7-b16c-6aa9a2a087e8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Processing event network-vif-plugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.101 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399610.1001618, 19ab76ac-c167-48b4-a7e6-c6777e78515d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.101 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] VM Paused (Lifecycle Event)
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.103 187156 DEBUG nova.compute.manager [None req-9c2504c9-5226-4691-8ea0-d5d6feb608b6 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.106 187156 DEBUG nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.112 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.118 187156 INFO nova.virt.libvirt.driver [-] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Instance spawned successfully.
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.119 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 02:00:10 np0005539504 podman[221541]: 2025-11-29 07:00:10.122825968 +0000 UTC m=+0.076894373 container create 728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.136 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.140 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:00:10 np0005539504 podman[221541]: 2025-11-29 07:00:10.082292331 +0000 UTC m=+0.036360766 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:00:10 np0005539504 systemd[1]: Started libpod-conmon-728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff.scope.
Nov 29 02:00:10 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:00:10 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c88acd2095d80044cdc46976ea0564cc127ccd4d01daa5ac1b62c849ba2650d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:00:10 np0005539504 podman[221541]: 2025-11-29 07:00:10.222720042 +0000 UTC m=+0.176788477 container init 728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.222 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.223 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.224 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.224 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.225 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.226 187156 DEBUG nova.virt.libvirt.driver [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:10 np0005539504 podman[221541]: 2025-11-29 07:00:10.229983408 +0000 UTC m=+0.184051813 container start 728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:00:10 np0005539504 neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88[221556]: [NOTICE]   (221560) : New worker (221562) forked
Nov 29 02:00:10 np0005539504 neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88[221556]: [NOTICE]   (221560) : Loading success.
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.268 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.269 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399610.1059387, 0c19fe31-f0ac-478f-948e-ded3a8631c00 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.269 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] VM Started (Lifecycle Event)#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.314 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.321 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.349 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.349 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399610.106037, 0c19fe31-f0ac-478f-948e-ded3a8631c00 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.350 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.387 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.390 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399610.1093311, 0c19fe31-f0ac-478f-948e-ded3a8631c00 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.391 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.480 187156 INFO nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Took 14.95 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.481 187156 DEBUG nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.496 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.500 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.546 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.887 187156 INFO nova.compute.manager [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Took 17.91 seconds to build instance.#033[00m
Nov 29 02:00:10 np0005539504 nova_compute[187152]: 2025-11-29 07:00:10.995 187156 DEBUG oslo_concurrency.lockutils [None req-213c2f5c-519d-4c07-b488-9bc91303f599 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:11 np0005539504 nova_compute[187152]: 2025-11-29 07:00:11.068 187156 DEBUG nova.network.neutron [req-fb2c2a78-458f-412b-bcd3-48a29dcd3be5 req-90b0183e-7c6a-4f1d-9177-d70b8322612b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Updated VIF entry in instance network info cache for port 693b9d3b-aebc-4e09-94e6-0ad650b19511. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:00:11 np0005539504 nova_compute[187152]: 2025-11-29 07:00:11.069 187156 DEBUG nova.network.neutron [req-fb2c2a78-458f-412b-bcd3-48a29dcd3be5 req-90b0183e-7c6a-4f1d-9177-d70b8322612b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Updating instance_info_cache with network_info: [{"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:11 np0005539504 nova_compute[187152]: 2025-11-29 07:00:11.135 187156 DEBUG oslo_concurrency.lockutils [req-fb2c2a78-458f-412b-bcd3-48a29dcd3be5 req-90b0183e-7c6a-4f1d-9177-d70b8322612b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0c19fe31-f0ac-478f-948e-ded3a8631c00" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:12 np0005539504 nova_compute[187152]: 2025-11-29 07:00:12.196 187156 DEBUG nova.compute.manager [req-c9cb1264-6f68-45dc-b1fb-217c93ba3d59 req-b9997cb2-7d1a-4d8f-902a-b7c428de6f3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Received event network-vif-plugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:12 np0005539504 nova_compute[187152]: 2025-11-29 07:00:12.197 187156 DEBUG oslo_concurrency.lockutils [req-c9cb1264-6f68-45dc-b1fb-217c93ba3d59 req-b9997cb2-7d1a-4d8f-902a-b7c428de6f3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:12 np0005539504 nova_compute[187152]: 2025-11-29 07:00:12.197 187156 DEBUG oslo_concurrency.lockutils [req-c9cb1264-6f68-45dc-b1fb-217c93ba3d59 req-b9997cb2-7d1a-4d8f-902a-b7c428de6f3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:12 np0005539504 nova_compute[187152]: 2025-11-29 07:00:12.197 187156 DEBUG oslo_concurrency.lockutils [req-c9cb1264-6f68-45dc-b1fb-217c93ba3d59 req-b9997cb2-7d1a-4d8f-902a-b7c428de6f3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:12 np0005539504 nova_compute[187152]: 2025-11-29 07:00:12.197 187156 DEBUG nova.compute.manager [req-c9cb1264-6f68-45dc-b1fb-217c93ba3d59 req-b9997cb2-7d1a-4d8f-902a-b7c428de6f3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] No waiting events found dispatching network-vif-plugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:12 np0005539504 nova_compute[187152]: 2025-11-29 07:00:12.198 187156 WARNING nova.compute.manager [req-c9cb1264-6f68-45dc-b1fb-217c93ba3d59 req-b9997cb2-7d1a-4d8f-902a-b7c428de6f3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Received unexpected event network-vif-plugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:00:12 np0005539504 nova_compute[187152]: 2025-11-29 07:00:12.939 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.041 187156 DEBUG oslo_concurrency.lockutils [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "19ab76ac-c167-48b4-a7e6-c6777e78515d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.042 187156 DEBUG oslo_concurrency.lockutils [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.042 187156 DEBUG oslo_concurrency.lockutils [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.042 187156 DEBUG oslo_concurrency.lockutils [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.042 187156 DEBUG oslo_concurrency.lockutils [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.059 187156 INFO nova.compute.manager [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Terminating instance#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.071 187156 DEBUG nova.compute.manager [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:00:14 np0005539504 kernel: tapf539ab0e-07 (unregistering): left promiscuous mode
Nov 29 02:00:14 np0005539504 NetworkManager[55210]: <info>  [1764399614.0960] device (tapf539ab0e-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:00:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:14Z|00144|binding|INFO|Releasing lport f539ab0e-0738-4ff6-9f12-28f7776d7cfc from this chassis (sb_readonly=0)
Nov 29 02:00:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:14Z|00145|binding|INFO|Setting lport f539ab0e-0738-4ff6-9f12-28f7776d7cfc down in Southbound
Nov 29 02:00:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:14Z|00146|binding|INFO|Removing iface tapf539ab0e-07 ovn-installed in OVS
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.146 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.164 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.163 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:5a:45 10.100.0.9'], port_security=['fa:16:3e:eb:5a:45 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '19ab76ac-c167-48b4-a7e6-c6777e78515d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '4', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=f539ab0e-0738-4ff6-9f12-28f7776d7cfc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.165 104164 INFO neutron.agent.ovn.metadata.agent [-] Port f539ab0e-0738-4ff6-9f12-28f7776d7cfc in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd unbound from our chassis#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.167 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.168 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0327b7f9-2204-4374-86d2-9b81ced58cbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.168 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace which is not needed anymore#033[00m
Nov 29 02:00:14 np0005539504 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000031.scope: Deactivated successfully.
Nov 29 02:00:14 np0005539504 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000031.scope: Consumed 3.598s CPU time.
Nov 29 02:00:14 np0005539504 systemd-machined[153423]: Machine qemu-25-instance-00000031 terminated.
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221384]: [NOTICE]   (221388) : haproxy version is 2.8.14-c23fe91
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221384]: [NOTICE]   (221388) : path to executable is /usr/sbin/haproxy
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221384]: [WARNING]  (221388) : Exiting Master process...
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221384]: [ALERT]    (221388) : Current worker (221390) exited with code 143 (Terminated)
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.293 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221384]: [WARNING]  (221388) : All workers exited. Exiting... (0)
Nov 29 02:00:14 np0005539504 systemd[1]: libpod-ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50.scope: Deactivated successfully.
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.299 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 podman[221596]: 2025-11-29 07:00:14.301498818 +0000 UTC m=+0.048013150 container died ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:00:14 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50-userdata-shm.mount: Deactivated successfully.
Nov 29 02:00:14 np0005539504 systemd[1]: var-lib-containers-storage-overlay-f98a74f072bb4de2d00f5e86cc9b3a520262aa4d94cfb94b10ab5bb03333c382-merged.mount: Deactivated successfully.
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.339 187156 INFO nova.virt.libvirt.driver [-] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Instance destroyed successfully.#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.342 187156 DEBUG nova.objects.instance [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'resources' on Instance uuid 19ab76ac-c167-48b4-a7e6-c6777e78515d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.357 187156 DEBUG nova.virt.libvirt.vif [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:59:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1267050979',display_name='tempest-DeleteServersTestJSON-server-1267050979',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1267050979',id=49,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:00:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-xzzs9tuq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:00:10Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=19ab76ac-c167-48b4-a7e6-c6777e78515d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:00:14 np0005539504 podman[221596]: 2025-11-29 07:00:14.358347547 +0000 UTC m=+0.104861869 container cleanup ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.358 187156 DEBUG nova.network.os_vif_util [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "address": "fa:16:3e:eb:5a:45", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf539ab0e-07", "ovs_interfaceid": "f539ab0e-0738-4ff6-9f12-28f7776d7cfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.360 187156 DEBUG nova.network.os_vif_util [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:5a:45,bridge_name='br-int',has_traffic_filtering=True,id=f539ab0e-0738-4ff6-9f12-28f7776d7cfc,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf539ab0e-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.361 187156 DEBUG os_vif [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:5a:45,bridge_name='br-int',has_traffic_filtering=True,id=f539ab0e-0738-4ff6-9f12-28f7776d7cfc,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf539ab0e-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.363 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.364 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf539ab0e-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.366 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 systemd[1]: libpod-conmon-ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50.scope: Deactivated successfully.
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.368 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.371 187156 INFO os_vif [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:5a:45,bridge_name='br-int',has_traffic_filtering=True,id=f539ab0e-0738-4ff6-9f12-28f7776d7cfc,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf539ab0e-07')#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.372 187156 INFO nova.virt.libvirt.driver [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Deleting instance files /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d_del#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.373 187156 INFO nova.virt.libvirt.driver [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Deletion of /var/lib/nova/instances/19ab76ac-c167-48b4-a7e6-c6777e78515d_del complete#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.405 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 podman[221644]: 2025-11-29 07:00:14.444745216 +0000 UTC m=+0.054222888 container remove ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.455 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a08939c2-2535-4a04-b9bf-ca1f5867f8f2]: (4, ('Sat Nov 29 07:00:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50)\nce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50\nSat Nov 29 07:00:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (ce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50)\nce47d745ec9e0ffe56d61377b3b694ac9f2c89af8c89b7ecf34c0d4bd9590e50\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.457 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[952a42a4-f359-463c-b275-2dffdabf9796]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.458 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.460 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 kernel: tapfd9eb57e-b0: left promiscuous mode
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.473 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.476 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf56bac-e060-4553-b247-eadb86817195]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.495 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[36a322ee-6e53-4ce2-8ff0-3fb64d741716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.497 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[631e71fa-c39a-48c6-ab77-42086d527d62]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.514 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[22ab6253-88e3-4fff-945c-1ca4f3e75c44]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506745, 'reachable_time': 18702, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221660, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:14 np0005539504 systemd[1]: run-netns-ovnmeta\x2dfd9eb57e\x2db1f8\x2d4bae\x2da60f\x2d8e40613556cd.mount: Deactivated successfully.
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.521 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.521 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[e73b34fa-beee-4cdd-85fb-95a0885d5d6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.536 187156 INFO nova.compute.manager [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.537 187156 DEBUG oslo.service.loopingcall [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.537 187156 DEBUG nova.compute.manager [-] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.537 187156 DEBUG nova.network.neutron [-] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.708 187156 DEBUG oslo_concurrency.lockutils [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "0c19fe31-f0ac-478f-948e-ded3a8631c00" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.708 187156 DEBUG oslo_concurrency.lockutils [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.709 187156 DEBUG oslo_concurrency.lockutils [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.709 187156 DEBUG oslo_concurrency.lockutils [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.709 187156 DEBUG oslo_concurrency.lockutils [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.720 187156 INFO nova.compute.manager [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Terminating instance#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.733 187156 DEBUG nova.compute.manager [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:00:14 np0005539504 kernel: tap693b9d3b-ae (unregistering): left promiscuous mode
Nov 29 02:00:14 np0005539504 NetworkManager[55210]: <info>  [1764399614.7538] device (tap693b9d3b-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:00:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:14Z|00147|binding|INFO|Releasing lport 693b9d3b-aebc-4e09-94e6-0ad650b19511 from this chassis (sb_readonly=0)
Nov 29 02:00:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:14Z|00148|binding|INFO|Setting lport 693b9d3b-aebc-4e09-94e6-0ad650b19511 down in Southbound
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.756 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:14Z|00149|binding|INFO|Removing iface tap693b9d3b-ae ovn-installed in OVS
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.758 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.771 187156 DEBUG nova.compute.manager [req-41b76cba-8b31-4a18-9917-14b5c1da468d req-3b32077a-0cc6-4d68-bafc-7f40c43d5ab3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Received event network-vif-unplugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.771 187156 DEBUG oslo_concurrency.lockutils [req-41b76cba-8b31-4a18-9917-14b5c1da468d req-3b32077a-0cc6-4d68-bafc-7f40c43d5ab3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.772 187156 DEBUG oslo_concurrency.lockutils [req-41b76cba-8b31-4a18-9917-14b5c1da468d req-3b32077a-0cc6-4d68-bafc-7f40c43d5ab3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.772 187156 DEBUG oslo_concurrency.lockutils [req-41b76cba-8b31-4a18-9917-14b5c1da468d req-3b32077a-0cc6-4d68-bafc-7f40c43d5ab3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.772 187156 DEBUG nova.compute.manager [req-41b76cba-8b31-4a18-9917-14b5c1da468d req-3b32077a-0cc6-4d68-bafc-7f40c43d5ab3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] No waiting events found dispatching network-vif-unplugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.772 187156 DEBUG nova.compute.manager [req-41b76cba-8b31-4a18-9917-14b5c1da468d req-3b32077a-0cc6-4d68-bafc-7f40c43d5ab3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Received event network-vif-unplugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.773 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.771 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:58:cc 10.100.0.13'], port_security=['fa:16:3e:cb:58:cc 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0c19fe31-f0ac-478f-948e-ded3a8631c00', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f679e8fd-0521-46e9-b8e3-afe8d09e2d88', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32c21e2c4b044d569a10a87f8282bd09', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7032c59c-6e9a-42ef-b162-33d9fe2507d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=66cf9662-1380-4c7f-a316-cff963614013, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=693b9d3b-aebc-4e09-94e6-0ad650b19511) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.773 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 693b9d3b-aebc-4e09-94e6-0ad650b19511 in datapath f679e8fd-0521-46e9-b8e3-afe8d09e2d88 unbound from our chassis#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.774 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f679e8fd-0521-46e9-b8e3-afe8d09e2d88, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.775 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[844901a1-1fa4-413f-97f9-82285be064a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:14.775 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88 namespace which is not needed anymore#033[00m
Nov 29 02:00:14 np0005539504 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000032.scope: Deactivated successfully.
Nov 29 02:00:14 np0005539504 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000032.scope: Consumed 5.361s CPU time.
Nov 29 02:00:14 np0005539504 systemd-machined[153423]: Machine qemu-26-instance-00000032 terminated.
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88[221556]: [NOTICE]   (221560) : haproxy version is 2.8.14-c23fe91
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88[221556]: [NOTICE]   (221560) : path to executable is /usr/sbin/haproxy
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88[221556]: [WARNING]  (221560) : Exiting Master process...
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88[221556]: [ALERT]    (221560) : Current worker (221562) exited with code 143 (Terminated)
Nov 29 02:00:14 np0005539504 neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88[221556]: [WARNING]  (221560) : All workers exited. Exiting... (0)
Nov 29 02:00:14 np0005539504 systemd[1]: libpod-728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff.scope: Deactivated successfully.
Nov 29 02:00:14 np0005539504 podman[221677]: 2025-11-29 07:00:14.905614763 +0000 UTC m=+0.062566205 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:00:14 np0005539504 podman[221690]: 2025-11-29 07:00:14.908825799 +0000 UTC m=+0.051106485 container died 728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:00:14 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff-userdata-shm.mount: Deactivated successfully.
Nov 29 02:00:14 np0005539504 systemd[1]: var-lib-containers-storage-overlay-7c88acd2095d80044cdc46976ea0564cc127ccd4d01daa5ac1b62c849ba2650d-merged.mount: Deactivated successfully.
Nov 29 02:00:14 np0005539504 podman[221690]: 2025-11-29 07:00:14.949080849 +0000 UTC m=+0.091361475 container cleanup 728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:00:14 np0005539504 NetworkManager[55210]: <info>  [1764399614.9504] manager: (tap693b9d3b-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Nov 29 02:00:14 np0005539504 systemd[1]: libpod-conmon-728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff.scope: Deactivated successfully.
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.986 187156 INFO nova.virt.libvirt.driver [-] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Instance destroyed successfully.#033[00m
Nov 29 02:00:14 np0005539504 nova_compute[187152]: 2025-11-29 07:00:14.986 187156 DEBUG nova.objects.instance [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lazy-loading 'resources' on Instance uuid 0c19fe31-f0ac-478f-948e-ded3a8631c00 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:15 np0005539504 podman[221741]: 2025-11-29 07:00:15.01044605 +0000 UTC m=+0.041882414 container remove 728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:00:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:15.015 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c0dc3230-c49b-47d5-9113-cc93ae00e9be]: (4, ('Sat Nov 29 07:00:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88 (728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff)\n728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff\nSat Nov 29 07:00:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88 (728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff)\n728d0393d6fc892890e1eb075da7f7b9d5627969eb71c3f0ede0950131e28dff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:15.017 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[19db76df-7521-42fb-9cc2-a86ceb7d6033]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:15.018 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf679e8fd-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.019 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:15 np0005539504 kernel: tapf679e8fd-00: left promiscuous mode
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.022 187156 DEBUG nova.virt.libvirt.vif [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:59:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1438657295',display_name='tempest-ServerAddressesNegativeTestJSON-server-1438657295',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1438657295',id=50,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:00:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32c21e2c4b044d569a10a87f8282bd09',ramdisk_id='',reservation_id='r-v76kvxqs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',imag
e_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-1595840823',owner_user_name='tempest-ServerAddressesNegativeTestJSON-1595840823-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:00:10Z,user_data=None,user_id='9cb56a6cc9ad4326a65ec3b3fe352836',uuid=0c19fe31-f0ac-478f-948e-ded3a8631c00,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.022 187156 DEBUG nova.network.os_vif_util [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Converting VIF {"id": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "address": "fa:16:3e:cb:58:cc", "network": {"id": "f679e8fd-0521-46e9-b8e3-afe8d09e2d88", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-42589422-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32c21e2c4b044d569a10a87f8282bd09", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap693b9d3b-ae", "ovs_interfaceid": "693b9d3b-aebc-4e09-94e6-0ad650b19511", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.023 187156 DEBUG nova.network.os_vif_util [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:58:cc,bridge_name='br-int',has_traffic_filtering=True,id=693b9d3b-aebc-4e09-94e6-0ad650b19511,network=Network(f679e8fd-0521-46e9-b8e3-afe8d09e2d88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap693b9d3b-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.023 187156 DEBUG os_vif [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:58:cc,bridge_name='br-int',has_traffic_filtering=True,id=693b9d3b-aebc-4e09-94e6-0ad650b19511,network=Network(f679e8fd-0521-46e9-b8e3-afe8d09e2d88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap693b9d3b-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.025 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.025 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap693b9d3b-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.028 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.039 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:15.042 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[79ca543a-4114-4207-b28d-baf42739a0ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.044 187156 INFO os_vif [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:58:cc,bridge_name='br-int',has_traffic_filtering=True,id=693b9d3b-aebc-4e09-94e6-0ad650b19511,network=Network(f679e8fd-0521-46e9-b8e3-afe8d09e2d88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap693b9d3b-ae')#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.045 187156 INFO nova.virt.libvirt.driver [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Deleting instance files /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00_del#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.046 187156 INFO nova.virt.libvirt.driver [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Deletion of /var/lib/nova/instances/0c19fe31-f0ac-478f-948e-ded3a8631c00_del complete#033[00m
Nov 29 02:00:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:15.054 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae43590-f593-448d-8c36-7aa4db7b3d04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:15.056 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0718100b-bf3c-4439-9ba1-d5499cb0a619]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:15.069 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ed3c29a5-b620-4923-a4d8-0578a24afc9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507231, 'reachable_time': 43742, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221764, 'error': None, 'target': 'ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:15.071 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f679e8fd-0521-46e9-b8e3-afe8d09e2d88 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:00:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:15.071 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[3d471ca4-f40e-4290-a49f-b90c18673ef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:15 np0005539504 systemd[1]: run-netns-ovnmeta\x2df679e8fd\x2d0521\x2d46e9\x2db8e3\x2dafe8d09e2d88.mount: Deactivated successfully.
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.647 187156 INFO nova.compute.manager [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Took 0.91 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.649 187156 DEBUG oslo.service.loopingcall [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.649 187156 DEBUG nova.compute.manager [-] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:00:15 np0005539504 nova_compute[187152]: 2025-11-29 07:00:15.649 187156 DEBUG nova.network.neutron [-] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:00:16 np0005539504 nova_compute[187152]: 2025-11-29 07:00:16.252 187156 DEBUG nova.compute.manager [req-222821bd-01df-4805-a69e-10bba0d7306d req-9c7b6840-b6f3-45ff-bcec-0fa1ac978e9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Received event network-vif-unplugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:16 np0005539504 nova_compute[187152]: 2025-11-29 07:00:16.253 187156 DEBUG oslo_concurrency.lockutils [req-222821bd-01df-4805-a69e-10bba0d7306d req-9c7b6840-b6f3-45ff-bcec-0fa1ac978e9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:16 np0005539504 nova_compute[187152]: 2025-11-29 07:00:16.254 187156 DEBUG oslo_concurrency.lockutils [req-222821bd-01df-4805-a69e-10bba0d7306d req-9c7b6840-b6f3-45ff-bcec-0fa1ac978e9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:16 np0005539504 nova_compute[187152]: 2025-11-29 07:00:16.254 187156 DEBUG oslo_concurrency.lockutils [req-222821bd-01df-4805-a69e-10bba0d7306d req-9c7b6840-b6f3-45ff-bcec-0fa1ac978e9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:16 np0005539504 nova_compute[187152]: 2025-11-29 07:00:16.254 187156 DEBUG nova.compute.manager [req-222821bd-01df-4805-a69e-10bba0d7306d req-9c7b6840-b6f3-45ff-bcec-0fa1ac978e9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] No waiting events found dispatching network-vif-unplugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:16 np0005539504 nova_compute[187152]: 2025-11-29 07:00:16.255 187156 DEBUG nova.compute.manager [req-222821bd-01df-4805-a69e-10bba0d7306d req-9c7b6840-b6f3-45ff-bcec-0fa1ac978e9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Received event network-vif-unplugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:00:16 np0005539504 nova_compute[187152]: 2025-11-29 07:00:16.841 187156 DEBUG nova.network.neutron [-] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:16 np0005539504 nova_compute[187152]: 2025-11-29 07:00:16.878 187156 INFO nova.compute.manager [-] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Took 2.34 seconds to deallocate network for instance.#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.073 187156 DEBUG nova.compute.manager [req-15941639-5883-4a56-a686-4dd3f7473236 req-57a2a990-ae91-452f-a60b-d24abc990c8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Received event network-vif-plugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.073 187156 DEBUG oslo_concurrency.lockutils [req-15941639-5883-4a56-a686-4dd3f7473236 req-57a2a990-ae91-452f-a60b-d24abc990c8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.074 187156 DEBUG oslo_concurrency.lockutils [req-15941639-5883-4a56-a686-4dd3f7473236 req-57a2a990-ae91-452f-a60b-d24abc990c8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.074 187156 DEBUG oslo_concurrency.lockutils [req-15941639-5883-4a56-a686-4dd3f7473236 req-57a2a990-ae91-452f-a60b-d24abc990c8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.074 187156 DEBUG nova.compute.manager [req-15941639-5883-4a56-a686-4dd3f7473236 req-57a2a990-ae91-452f-a60b-d24abc990c8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] No waiting events found dispatching network-vif-plugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.074 187156 WARNING nova.compute.manager [req-15941639-5883-4a56-a686-4dd3f7473236 req-57a2a990-ae91-452f-a60b-d24abc990c8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Received unexpected event network-vif-plugged-f539ab0e-0738-4ff6-9f12-28f7776d7cfc for instance with vm_state paused and task_state deleting.#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.076 187156 DEBUG nova.network.neutron [-] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.098 187156 DEBUG oslo_concurrency.lockutils [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.099 187156 DEBUG oslo_concurrency.lockutils [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.100 187156 INFO nova.compute.manager [-] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Took 1.45 seconds to deallocate network for instance.#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.200 187156 DEBUG oslo_concurrency.lockutils [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.296 187156 DEBUG nova.compute.manager [req-ea003472-c095-4f4d-8532-2035c6de2d5b req-c8880147-284b-4df6-9ef2-b2fdf34d83c9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Received event network-vif-deleted-693b9d3b-aebc-4e09-94e6-0ad650b19511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.349 187156 DEBUG nova.compute.provider_tree [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.372 187156 DEBUG nova.scheduler.client.report [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.419 187156 DEBUG oslo_concurrency.lockutils [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.422 187156 DEBUG oslo_concurrency.lockutils [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.467 187156 INFO nova.scheduler.client.report [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Deleted allocations for instance 19ab76ac-c167-48b4-a7e6-c6777e78515d#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.522 187156 DEBUG nova.compute.provider_tree [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.550 187156 DEBUG nova.scheduler.client.report [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.601 187156 DEBUG oslo_concurrency.lockutils [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.604 187156 DEBUG oslo_concurrency.lockutils [None req-d66a331a-3f10-422d-b1d4-7299498f9b94 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "19ab76ac-c167-48b4-a7e6-c6777e78515d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.663 187156 INFO nova.scheduler.client.report [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Deleted allocations for instance 0c19fe31-f0ac-478f-948e-ded3a8631c00#033[00m
Nov 29 02:00:17 np0005539504 nova_compute[187152]: 2025-11-29 07:00:17.834 187156 DEBUG oslo_concurrency.lockutils [None req-304b0428-953b-4930-9355-279be848cb30 9cb56a6cc9ad4326a65ec3b3fe352836 32c21e2c4b044d569a10a87f8282bd09 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:18 np0005539504 nova_compute[187152]: 2025-11-29 07:00:18.657 187156 DEBUG nova.compute.manager [req-de2e3f3e-38aa-446a-b988-fcf50838b3e7 req-a6d44ee0-ee8f-4c29-8134-0f124e03c619 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Received event network-vif-deleted-f539ab0e-0738-4ff6-9f12-28f7776d7cfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:18 np0005539504 nova_compute[187152]: 2025-11-29 07:00:18.658 187156 DEBUG nova.compute.manager [req-de2e3f3e-38aa-446a-b988-fcf50838b3e7 req-a6d44ee0-ee8f-4c29-8134-0f124e03c619 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Received event network-vif-plugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:18 np0005539504 nova_compute[187152]: 2025-11-29 07:00:18.658 187156 DEBUG oslo_concurrency.lockutils [req-de2e3f3e-38aa-446a-b988-fcf50838b3e7 req-a6d44ee0-ee8f-4c29-8134-0f124e03c619 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:18 np0005539504 nova_compute[187152]: 2025-11-29 07:00:18.658 187156 DEBUG oslo_concurrency.lockutils [req-de2e3f3e-38aa-446a-b988-fcf50838b3e7 req-a6d44ee0-ee8f-4c29-8134-0f124e03c619 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:18 np0005539504 nova_compute[187152]: 2025-11-29 07:00:18.659 187156 DEBUG oslo_concurrency.lockutils [req-de2e3f3e-38aa-446a-b988-fcf50838b3e7 req-a6d44ee0-ee8f-4c29-8134-0f124e03c619 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c19fe31-f0ac-478f-948e-ded3a8631c00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:18 np0005539504 nova_compute[187152]: 2025-11-29 07:00:18.659 187156 DEBUG nova.compute.manager [req-de2e3f3e-38aa-446a-b988-fcf50838b3e7 req-a6d44ee0-ee8f-4c29-8134-0f124e03c619 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] No waiting events found dispatching network-vif-plugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:18 np0005539504 nova_compute[187152]: 2025-11-29 07:00:18.659 187156 WARNING nova.compute.manager [req-de2e3f3e-38aa-446a-b988-fcf50838b3e7 req-a6d44ee0-ee8f-4c29-8134-0f124e03c619 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Received unexpected event network-vif-plugged-693b9d3b-aebc-4e09-94e6-0ad650b19511 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:00:19 np0005539504 nova_compute[187152]: 2025-11-29 07:00:19.407 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:19 np0005539504 podman[221765]: 2025-11-29 07:00:19.711400471 +0000 UTC m=+0.057215580 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:00:20 np0005539504 nova_compute[187152]: 2025-11-29 07:00:20.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:22.914 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:22.916 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:22.916 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:24 np0005539504 nova_compute[187152]: 2025-11-29 07:00:24.410 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:24 np0005539504 nova_compute[187152]: 2025-11-29 07:00:24.692 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:24 np0005539504 nova_compute[187152]: 2025-11-29 07:00:24.692 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:24 np0005539504 nova_compute[187152]: 2025-11-29 07:00:24.726 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.027 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.105 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.106 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.112 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.112 187156 INFO nova.compute.claims [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.363 187156 DEBUG nova.compute.provider_tree [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.386 187156 DEBUG nova.scheduler.client.report [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.442 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.443 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.536 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.537 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.569 187156 INFO nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.603 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.723 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.724 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.725 187156 INFO nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Creating image(s)#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.725 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "/var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.726 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "/var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.726 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "/var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.743 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.819 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.820 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.821 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.832 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.896 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.898 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.939 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.940 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:25 np0005539504 nova_compute[187152]: 2025-11-29 07:00:25.940 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.023 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.024 187156 DEBUG nova.virt.disk.api [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Checking if we can resize image /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.024 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.082 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.083 187156 DEBUG nova.virt.disk.api [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Cannot resize image /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.084 187156 DEBUG nova.objects.instance [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lazy-loading 'migration_context' on Instance uuid 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.091 187156 DEBUG nova.policy [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11e9982557a44d40b2ebaf04bf99c371', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.115 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.116 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Ensure instance console log exists: /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.117 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.117 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:26 np0005539504 nova_compute[187152]: 2025-11-29 07:00:26.118 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:27 np0005539504 nova_compute[187152]: 2025-11-29 07:00:27.439 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:28 np0005539504 nova_compute[187152]: 2025-11-29 07:00:28.045 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Successfully created port: cd85ddcc-46bd-4622-955e-12395bfecf41 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:00:29 np0005539504 nova_compute[187152]: 2025-11-29 07:00:29.285 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Successfully created port: 6c0110c6-3514-41dc-a70b-4131d855aeb3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:00:29 np0005539504 nova_compute[187152]: 2025-11-29 07:00:29.338 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399614.3360739, 19ab76ac-c167-48b4-a7e6-c6777e78515d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:29 np0005539504 nova_compute[187152]: 2025-11-29 07:00:29.339 187156 INFO nova.compute.manager [-] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:00:29 np0005539504 nova_compute[187152]: 2025-11-29 07:00:29.375 187156 DEBUG nova.compute.manager [None req-58cde878-6eb2-473f-9c86-a406fe5a60d3 - - - - - -] [instance: 19ab76ac-c167-48b4-a7e6-c6777e78515d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:29 np0005539504 nova_compute[187152]: 2025-11-29 07:00:29.412 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:29 np0005539504 podman[221804]: 2025-11-29 07:00:29.739493611 +0000 UTC m=+0.064092406 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:00:29 np0005539504 nova_compute[187152]: 2025-11-29 07:00:29.985 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399614.9838219, 0c19fe31-f0ac-478f-948e-ded3a8631c00 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:29 np0005539504 nova_compute[187152]: 2025-11-29 07:00:29.985 187156 INFO nova.compute.manager [-] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:00:30 np0005539504 nova_compute[187152]: 2025-11-29 07:00:30.029 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:30 np0005539504 nova_compute[187152]: 2025-11-29 07:00:30.156 187156 DEBUG nova.compute.manager [None req-c611b144-6879-4d04-94b2-a266592b6dc7 - - - - - -] [instance: 0c19fe31-f0ac-478f-948e-ded3a8631c00] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:31 np0005539504 nova_compute[187152]: 2025-11-29 07:00:31.066 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Successfully created port: efebde6a-916e-4248-aa3d-459b872a6adb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:00:32 np0005539504 nova_compute[187152]: 2025-11-29 07:00:32.648 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Successfully updated port: cd85ddcc-46bd-4622-955e-12395bfecf41 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:00:32 np0005539504 nova_compute[187152]: 2025-11-29 07:00:32.835 187156 DEBUG nova.compute.manager [req-69c092c3-33c9-4019-a5e7-0cb4f97346b1 req-8b0f7e35-5543-4568-9b23-6cdd715256e1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-changed-cd85ddcc-46bd-4622-955e-12395bfecf41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:32 np0005539504 nova_compute[187152]: 2025-11-29 07:00:32.835 187156 DEBUG nova.compute.manager [req-69c092c3-33c9-4019-a5e7-0cb4f97346b1 req-8b0f7e35-5543-4568-9b23-6cdd715256e1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Refreshing instance network info cache due to event network-changed-cd85ddcc-46bd-4622-955e-12395bfecf41. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:00:32 np0005539504 nova_compute[187152]: 2025-11-29 07:00:32.836 187156 DEBUG oslo_concurrency.lockutils [req-69c092c3-33c9-4019-a5e7-0cb4f97346b1 req-8b0f7e35-5543-4568-9b23-6cdd715256e1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:32 np0005539504 nova_compute[187152]: 2025-11-29 07:00:32.836 187156 DEBUG oslo_concurrency.lockutils [req-69c092c3-33c9-4019-a5e7-0cb4f97346b1 req-8b0f7e35-5543-4568-9b23-6cdd715256e1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:32 np0005539504 nova_compute[187152]: 2025-11-29 07:00:32.836 187156 DEBUG nova.network.neutron [req-69c092c3-33c9-4019-a5e7-0cb4f97346b1 req-8b0f7e35-5543-4568-9b23-6cdd715256e1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Refreshing network info cache for port cd85ddcc-46bd-4622-955e-12395bfecf41 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:00:33 np0005539504 podman[221823]: 2025-11-29 07:00:33.717357966 +0000 UTC m=+0.056354236 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:00:33 np0005539504 podman[221824]: 2025-11-29 07:00:33.72750574 +0000 UTC m=+0.066370237 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, name=ubi9-minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Nov 29 02:00:33 np0005539504 nova_compute[187152]: 2025-11-29 07:00:33.935 187156 DEBUG nova.network.neutron [req-69c092c3-33c9-4019-a5e7-0cb4f97346b1 req-8b0f7e35-5543-4568-9b23-6cdd715256e1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:00:34 np0005539504 nova_compute[187152]: 2025-11-29 07:00:34.262 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Successfully updated port: 6c0110c6-3514-41dc-a70b-4131d855aeb3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:00:34 np0005539504 nova_compute[187152]: 2025-11-29 07:00:34.415 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:34 np0005539504 nova_compute[187152]: 2025-11-29 07:00:34.543 187156 DEBUG nova.network.neutron [req-69c092c3-33c9-4019-a5e7-0cb4f97346b1 req-8b0f7e35-5543-4568-9b23-6cdd715256e1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:34 np0005539504 nova_compute[187152]: 2025-11-29 07:00:34.581 187156 DEBUG oslo_concurrency.lockutils [req-69c092c3-33c9-4019-a5e7-0cb4f97346b1 req-8b0f7e35-5543-4568-9b23-6cdd715256e1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:35 np0005539504 nova_compute[187152]: 2025-11-29 07:00:35.031 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:35 np0005539504 nova_compute[187152]: 2025-11-29 07:00:35.163 187156 DEBUG nova.compute.manager [req-ac82d204-169c-4899-ae04-677ff5be3dd4 req-99806133-0fb9-4f20-9ce0-921913d307cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-changed-6c0110c6-3514-41dc-a70b-4131d855aeb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:35 np0005539504 nova_compute[187152]: 2025-11-29 07:00:35.163 187156 DEBUG nova.compute.manager [req-ac82d204-169c-4899-ae04-677ff5be3dd4 req-99806133-0fb9-4f20-9ce0-921913d307cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Refreshing instance network info cache due to event network-changed-6c0110c6-3514-41dc-a70b-4131d855aeb3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:00:35 np0005539504 nova_compute[187152]: 2025-11-29 07:00:35.164 187156 DEBUG oslo_concurrency.lockutils [req-ac82d204-169c-4899-ae04-677ff5be3dd4 req-99806133-0fb9-4f20-9ce0-921913d307cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:35 np0005539504 nova_compute[187152]: 2025-11-29 07:00:35.164 187156 DEBUG oslo_concurrency.lockutils [req-ac82d204-169c-4899-ae04-677ff5be3dd4 req-99806133-0fb9-4f20-9ce0-921913d307cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:35 np0005539504 nova_compute[187152]: 2025-11-29 07:00:35.164 187156 DEBUG nova.network.neutron [req-ac82d204-169c-4899-ae04-677ff5be3dd4 req-99806133-0fb9-4f20-9ce0-921913d307cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Refreshing network info cache for port 6c0110c6-3514-41dc-a70b-4131d855aeb3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:00:35 np0005539504 nova_compute[187152]: 2025-11-29 07:00:35.729 187156 DEBUG nova.network.neutron [req-ac82d204-169c-4899-ae04-677ff5be3dd4 req-99806133-0fb9-4f20-9ce0-921913d307cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:00:36 np0005539504 nova_compute[187152]: 2025-11-29 07:00:36.277 187156 DEBUG nova.network.neutron [req-ac82d204-169c-4899-ae04-677ff5be3dd4 req-99806133-0fb9-4f20-9ce0-921913d307cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:36 np0005539504 nova_compute[187152]: 2025-11-29 07:00:36.316 187156 DEBUG oslo_concurrency.lockutils [req-ac82d204-169c-4899-ae04-677ff5be3dd4 req-99806133-0fb9-4f20-9ce0-921913d307cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:36 np0005539504 nova_compute[187152]: 2025-11-29 07:00:36.627 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Successfully updated port: efebde6a-916e-4248-aa3d-459b872a6adb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:00:37 np0005539504 nova_compute[187152]: 2025-11-29 07:00:37.152 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:37 np0005539504 nova_compute[187152]: 2025-11-29 07:00:37.153 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquired lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:37 np0005539504 nova_compute[187152]: 2025-11-29 07:00:37.153 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:00:37 np0005539504 nova_compute[187152]: 2025-11-29 07:00:37.325 187156 DEBUG nova.compute.manager [req-64df1421-c250-475d-ac06-9363976afe6d req-09750684-acf8-461c-a29c-da6acf088480 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-changed-efebde6a-916e-4248-aa3d-459b872a6adb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:37 np0005539504 nova_compute[187152]: 2025-11-29 07:00:37.326 187156 DEBUG nova.compute.manager [req-64df1421-c250-475d-ac06-9363976afe6d req-09750684-acf8-461c-a29c-da6acf088480 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Refreshing instance network info cache due to event network-changed-efebde6a-916e-4248-aa3d-459b872a6adb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:00:37 np0005539504 nova_compute[187152]: 2025-11-29 07:00:37.326 187156 DEBUG oslo_concurrency.lockutils [req-64df1421-c250-475d-ac06-9363976afe6d req-09750684-acf8-461c-a29c-da6acf088480 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:37 np0005539504 nova_compute[187152]: 2025-11-29 07:00:37.424 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:00:38 np0005539504 podman[221868]: 2025-11-29 07:00:38.749984485 +0000 UTC m=+0.093247816 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:00:38 np0005539504 podman[221869]: 2025-11-29 07:00:38.790359507 +0000 UTC m=+0.130189766 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:00:39 np0005539504 nova_compute[187152]: 2025-11-29 07:00:39.458 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:40 np0005539504 nova_compute[187152]: 2025-11-29 07:00:40.033 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.456 187156 DEBUG nova.network.neutron [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Updating instance_info_cache with network_info: [{"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.491 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Releasing lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.492 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Instance network_info: |[{"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.492 187156 DEBUG oslo_concurrency.lockutils [req-64df1421-c250-475d-ac06-9363976afe6d req-09750684-acf8-461c-a29c-da6acf088480 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.492 187156 DEBUG nova.network.neutron [req-64df1421-c250-475d-ac06-9363976afe6d req-09750684-acf8-461c-a29c-da6acf088480 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Refreshing network info cache for port efebde6a-916e-4248-aa3d-459b872a6adb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.497 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Start _get_guest_xml network_info=[{"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.502 187156 WARNING nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.508 187156 DEBUG nova.virt.libvirt.host [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.509 187156 DEBUG nova.virt.libvirt.host [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.517 187156 DEBUG nova.virt.libvirt.host [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.518 187156 DEBUG nova.virt.libvirt.host [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.520 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.520 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.521 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.521 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.521 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.522 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.522 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.522 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.523 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.523 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.523 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.524 187156 DEBUG nova.virt.hardware [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.529 187156 DEBUG nova.virt.libvirt.vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-214136367',display_name='tempest-ServersTestMultiNic-server-214136367',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-214136367',id=51,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-117s3dq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:25Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.529 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.530 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:0f:31,bridge_name='br-int',has_traffic_filtering=True,id=cd85ddcc-46bd-4622-955e-12395bfecf41,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd85ddcc-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.531 187156 DEBUG nova.virt.libvirt.vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-214136367',display_name='tempest-ServersTestMultiNic-server-214136367',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-214136367',id=51,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-117s3dq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:25Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.532 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.532 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:6c:2b,bridge_name='br-int',has_traffic_filtering=True,id=6c0110c6-3514-41dc-a70b-4131d855aeb3,network=Network(12fba169-770d-497e-8585-0202b9a8b8d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c0110c6-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.533 187156 DEBUG nova.virt.libvirt.vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-214136367',display_name='tempest-ServersTestMultiNic-server-214136367',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-214136367',id=51,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-117s3dq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:25Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.533 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.534 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:3b:b8,bridge_name='br-int',has_traffic_filtering=True,id=efebde6a-916e-4248-aa3d-459b872a6adb,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefebde6a-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.535 187156 DEBUG nova.objects.instance [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.566 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <uuid>7f54d428-a3ac-4f2f-b5f6-fbaa715502e4</uuid>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <name>instance-00000033</name>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersTestMultiNic-server-214136367</nova:name>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:00:41</nova:creationTime>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:user uuid="11e9982557a44d40b2ebaf04bf99c371">tempest-ServersTestMultiNic-1778452684-project-member</nova:user>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:project uuid="de73e0af4d994da4a30deaebd1a7e86b">tempest-ServersTestMultiNic-1778452684</nova:project>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:port uuid="cd85ddcc-46bd-4622-955e-12395bfecf41">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.154" ipVersion="4"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:port uuid="6c0110c6-3514-41dc-a70b-4131d855aeb3">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.1.70" ipVersion="4"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        <nova:port uuid="efebde6a-916e-4248-aa3d-459b872a6adb">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.85" ipVersion="4"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <entry name="serial">7f54d428-a3ac-4f2f-b5f6-fbaa715502e4</entry>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <entry name="uuid">7f54d428-a3ac-4f2f-b5f6-fbaa715502e4</entry>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk.config"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:a2:0f:31"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <target dev="tapcd85ddcc-46"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:f9:6c:2b"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <target dev="tap6c0110c6-35"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:48:3b:b8"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <target dev="tapefebde6a-91"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/console.log" append="off"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:00:41 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:00:41 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:00:41 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:00:41 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.569 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Preparing to wait for external event network-vif-plugged-cd85ddcc-46bd-4622-955e-12395bfecf41 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.569 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.570 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.570 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.570 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Preparing to wait for external event network-vif-plugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.570 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.570 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.571 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.571 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Preparing to wait for external event network-vif-plugged-efebde6a-916e-4248-aa3d-459b872a6adb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.571 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.571 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.571 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.572 187156 DEBUG nova.virt.libvirt.vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-214136367',display_name='tempest-ServersTestMultiNic-server-214136367',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-214136367',id=51,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-117s3dq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:25Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.572 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.573 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:0f:31,bridge_name='br-int',has_traffic_filtering=True,id=cd85ddcc-46bd-4622-955e-12395bfecf41,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd85ddcc-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.573 187156 DEBUG os_vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:0f:31,bridge_name='br-int',has_traffic_filtering=True,id=cd85ddcc-46bd-4622-955e-12395bfecf41,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd85ddcc-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.574 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.574 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.575 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.579 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.580 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd85ddcc-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.580 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd85ddcc-46, col_values=(('external_ids', {'iface-id': 'cd85ddcc-46bd-4622-955e-12395bfecf41', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:0f:31', 'vm-uuid': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.583 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 NetworkManager[55210]: <info>  [1764399641.5844] manager: (tapcd85ddcc-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.585 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.590 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.591 187156 INFO os_vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:0f:31,bridge_name='br-int',has_traffic_filtering=True,id=cd85ddcc-46bd-4622-955e-12395bfecf41,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd85ddcc-46')#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.592 187156 DEBUG nova.virt.libvirt.vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-214136367',display_name='tempest-ServersTestMultiNic-server-214136367',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-214136367',id=51,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-117s3dq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:25Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.592 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.593 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:6c:2b,bridge_name='br-int',has_traffic_filtering=True,id=6c0110c6-3514-41dc-a70b-4131d855aeb3,network=Network(12fba169-770d-497e-8585-0202b9a8b8d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c0110c6-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.593 187156 DEBUG os_vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:6c:2b,bridge_name='br-int',has_traffic_filtering=True,id=6c0110c6-3514-41dc-a70b-4131d855aeb3,network=Network(12fba169-770d-497e-8585-0202b9a8b8d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c0110c6-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.593 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.594 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.594 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.597 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.597 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c0110c6-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.598 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6c0110c6-35, col_values=(('external_ids', {'iface-id': '6c0110c6-3514-41dc-a70b-4131d855aeb3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:6c:2b', 'vm-uuid': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:41 np0005539504 NetworkManager[55210]: <info>  [1764399641.6001] manager: (tap6c0110c6-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.600 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.602 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.606 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.607 187156 INFO os_vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:6c:2b,bridge_name='br-int',has_traffic_filtering=True,id=6c0110c6-3514-41dc-a70b-4131d855aeb3,network=Network(12fba169-770d-497e-8585-0202b9a8b8d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c0110c6-35')#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.608 187156 DEBUG nova.virt.libvirt.vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-214136367',display_name='tempest-ServersTestMultiNic-server-214136367',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-214136367',id=51,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-117s3dq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-
1778452684-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:25Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.609 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.610 187156 DEBUG nova.network.os_vif_util [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:3b:b8,bridge_name='br-int',has_traffic_filtering=True,id=efebde6a-916e-4248-aa3d-459b872a6adb,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefebde6a-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.610 187156 DEBUG os_vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:3b:b8,bridge_name='br-int',has_traffic_filtering=True,id=efebde6a-916e-4248-aa3d-459b872a6adb,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefebde6a-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.610 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.611 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.611 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.614 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.614 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefebde6a-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.615 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapefebde6a-91, col_values=(('external_ids', {'iface-id': 'efebde6a-916e-4248-aa3d-459b872a6adb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:3b:b8', 'vm-uuid': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:41 np0005539504 NetworkManager[55210]: <info>  [1764399641.6173] manager: (tapefebde6a-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.619 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.630 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.632 187156 INFO os_vif [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:3b:b8,bridge_name='br-int',has_traffic_filtering=True,id=efebde6a-916e-4248-aa3d-459b872a6adb,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefebde6a-91')#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.734 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.734 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.734 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] No VIF found with MAC fa:16:3e:a2:0f:31, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.734 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] No VIF found with MAC fa:16:3e:f9:6c:2b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.735 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] No VIF found with MAC fa:16:3e:48:3b:b8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:00:41 np0005539504 nova_compute[187152]: 2025-11-29 07:00:41.735 187156 INFO nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Using config drive#033[00m
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.343 187156 INFO nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Creating config drive at /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk.config#033[00m
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.349 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpht2maa31 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.477 187156 DEBUG oslo_concurrency.processutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpht2maa31" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.5614] manager: (tapcd85ddcc-46): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Nov 29 02:00:42 np0005539504 kernel: tapcd85ddcc-46: entered promiscuous mode
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.5842] manager: (tap6c0110c6-35): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Nov 29 02:00:42 np0005539504 kernel: tap6c0110c6-35: entered promiscuous mode
Nov 29 02:00:42 np0005539504 systemd-udevd[221951]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:00:42 np0005539504 systemd-udevd[221950]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00150|binding|INFO|Claiming lport cd85ddcc-46bd-4622-955e-12395bfecf41 for this chassis.
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.636 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00151|binding|INFO|cd85ddcc-46bd-4622-955e-12395bfecf41: Claiming fa:16:3e:a2:0f:31 10.100.0.154
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00152|if_status|INFO|Not updating pb chassis for 6c0110c6-3514-41dc-a70b-4131d855aeb3 now as sb is readonly
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.6502] manager: (tapefebde6a-91): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Nov 29 02:00:42 np0005539504 systemd-udevd[221957]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.6522] device (tap6c0110c6-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.6533] device (tapcd85ddcc-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.6551] device (tap6c0110c6-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.6558] device (tapcd85ddcc-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.670 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:0f:31 10.100.0.154'], port_security=['fa:16:3e:a2:0f:31 10.100.0.154'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.154/24', 'neutron:device_id': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c9e87c2-94a8-4e4d-bf2c-1e90654ac541, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=cd85ddcc-46bd-4622-955e-12395bfecf41) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.672 104164 INFO neutron.agent.ovn.metadata.agent [-] Port cd85ddcc-46bd-4622-955e-12395bfecf41 in datapath da6c5b7e-45ef-4b7f-8181-81e63563aadf bound to our chassis#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.673 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network da6c5b7e-45ef-4b7f-8181-81e63563aadf#033[00m
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00153|binding|INFO|Claiming lport 6c0110c6-3514-41dc-a70b-4131d855aeb3 for this chassis.
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00154|binding|INFO|6c0110c6-3514-41dc-a70b-4131d855aeb3: Claiming fa:16:3e:f9:6c:2b 10.100.1.70
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.688 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c921dcdb-8d02-4716-a39f-4020d6d07b77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.689 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapda6c5b7e-41 in ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:00:42 np0005539504 kernel: tapefebde6a-91: entered promiscuous mode
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.6941] device (tapefebde6a-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.692 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapda6c5b7e-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.693 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cb25cf6c-9ae2-45f8-8515-91ae8f156ef9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.694 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.694 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dbbd5132-8011-407c-a23b-00fa1c581d84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.6961] device (tapefebde6a-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00155|binding|INFO|Setting lport cd85ddcc-46bd-4622-955e-12395bfecf41 ovn-installed in OVS
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.701 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00156|binding|INFO|Claiming lport efebde6a-916e-4248-aa3d-459b872a6adb for this chassis.
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00157|binding|INFO|efebde6a-916e-4248-aa3d-459b872a6adb: Claiming fa:16:3e:48:3b:b8 10.100.0.85
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00158|binding|INFO|Setting lport cd85ddcc-46bd-4622-955e-12395bfecf41 up in Southbound
Nov 29 02:00:42 np0005539504 systemd-machined[153423]: New machine qemu-27-instance-00000033.
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.706 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:6c:2b 10.100.1.70'], port_security=['fa:16:3e:f9:6c:2b 10.100.1.70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.70/24', 'neutron:device_id': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-12fba169-770d-497e-8585-0202b9a8b8d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227534c5-6410-4193-a2f5-6deb44b63d57, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6c0110c6-3514-41dc-a70b-4131d855aeb3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.710 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[7d2ccc56-a1fc-407f-9ee1-0dbd6357d699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.723 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:3b:b8 10.100.0.85'], port_security=['fa:16:3e:48:3b:b8 10.100.0.85'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.85/24', 'neutron:device_id': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c9e87c2-94a8-4e4d-bf2c-1e90654ac541, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=efebde6a-916e-4248-aa3d-459b872a6adb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:42 np0005539504 systemd[1]: Started Virtual Machine qemu-27-instance-00000033.
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.728 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6375dde1-1776-433d-99c6-0a3dc30a4383]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00159|binding|INFO|Setting lport 6c0110c6-3514-41dc-a70b-4131d855aeb3 ovn-installed in OVS
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00160|binding|INFO|Setting lport 6c0110c6-3514-41dc-a70b-4131d855aeb3 up in Southbound
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.731 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00161|binding|INFO|Setting lport efebde6a-916e-4248-aa3d-459b872a6adb ovn-installed in OVS
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.740 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00162|binding|INFO|Setting lport efebde6a-916e-4248-aa3d-459b872a6adb up in Southbound
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.772 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f8b5c2-b555-4140-8764-e1ef28582ab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.778 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2c837602-4b78-4144-8c0f-f03e7af167bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.7792] manager: (tapda6c5b7e-40): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.819 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[fe286011-2fee-467d-8210-da8fe234cb69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.823 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[58afe8e1-dede-4cc1-8e9f-3bb26430528a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.8455] device (tapda6c5b7e-40): carrier: link connected
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.850 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e237b6cb-1f19-4013-8e33-936a9c9af8d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.869 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e3943d-ff06-411f-afc8-75a622283b3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda6c5b7e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:f3:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510572, 'reachable_time': 36070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221992, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.884 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1a03d32f-561a-406d-8e8d-bbe00bbc2a58]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3f:f3c8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510572, 'tstamp': 510572}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221993, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.899 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd56e9f-977a-4599-84b0-c67129adfbda]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda6c5b7e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:f3:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510572, 'reachable_time': 36070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221994, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.931 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e8affb36-3372-4e62-a878-c321c937ce6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.989 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[42a2e95f-85d7-4120-a8be-9d84b70caff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.991 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda6c5b7e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.992 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.992 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda6c5b7e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:42 np0005539504 NetworkManager[55210]: <info>  [1764399642.9945] manager: (tapda6c5b7e-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Nov 29 02:00:42 np0005539504 kernel: tapda6c5b7e-40: entered promiscuous mode
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.994 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:42.997 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapda6c5b7e-40, col_values=(('external_ids', {'iface-id': 'c1ff3c68-35a1-4f6c-ae08-4f06a1ce6818'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:42 np0005539504 nova_compute[187152]: 2025-11-29 07:00:42.998 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:42Z|00163|binding|INFO|Releasing lport c1ff3c68-35a1-4f6c-ae08-4f06a1ce6818 from this chassis (sb_readonly=0)
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.012 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.013 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da6c5b7e-45ef-4b7f-8181-81e63563aadf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da6c5b7e-45ef-4b7f-8181-81e63563aadf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.014 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dd6d7309-af73-49b9-ab23-68824c8d15fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.015 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-da6c5b7e-45ef-4b7f-8181-81e63563aadf
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/da6c5b7e-45ef-4b7f-8181-81e63563aadf.pid.haproxy
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID da6c5b7e-45ef-4b7f-8181-81e63563aadf
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.017 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'env', 'PROCESS_TAG=haproxy-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/da6c5b7e-45ef-4b7f-8181-81e63563aadf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.074 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399643.074065, 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.075 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] VM Started (Lifecycle Event)#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.103 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.107 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399643.0743625, 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.107 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.141 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.145 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.186 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.320 187156 DEBUG nova.compute.manager [req-56d4ffed-64bf-4f0a-9073-8a46cb918a2c req-699a417b-266a-439c-8f44-090898629680 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-plugged-cd85ddcc-46bd-4622-955e-12395bfecf41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.321 187156 DEBUG oslo_concurrency.lockutils [req-56d4ffed-64bf-4f0a-9073-8a46cb918a2c req-699a417b-266a-439c-8f44-090898629680 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.321 187156 DEBUG oslo_concurrency.lockutils [req-56d4ffed-64bf-4f0a-9073-8a46cb918a2c req-699a417b-266a-439c-8f44-090898629680 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.322 187156 DEBUG oslo_concurrency.lockutils [req-56d4ffed-64bf-4f0a-9073-8a46cb918a2c req-699a417b-266a-439c-8f44-090898629680 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.322 187156 DEBUG nova.compute.manager [req-56d4ffed-64bf-4f0a-9073-8a46cb918a2c req-699a417b-266a-439c-8f44-090898629680 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Processing event network-vif-plugged-cd85ddcc-46bd-4622-955e-12395bfecf41 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.402 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.401 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:43 np0005539504 podman[222033]: 2025-11-29 07:00:43.455822796 +0000 UTC m=+0.050845487 container create 1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:00:43 np0005539504 systemd[1]: Started libpod-conmon-1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8.scope.
Nov 29 02:00:43 np0005539504 podman[222033]: 2025-11-29 07:00:43.429815132 +0000 UTC m=+0.024837853 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:00:43 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:00:43 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37df919613365b87bab489d7b0efbb324c788945e343b7ccbc59de834bc455d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:00:43 np0005539504 podman[222033]: 2025-11-29 07:00:43.545015771 +0000 UTC m=+0.140038492 container init 1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:00:43 np0005539504 podman[222033]: 2025-11-29 07:00:43.550401867 +0000 UTC m=+0.145424558 container start 1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 29 02:00:43 np0005539504 neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf[222048]: [NOTICE]   (222052) : New worker (222054) forked
Nov 29 02:00:43 np0005539504 neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf[222048]: [NOTICE]   (222052) : Loading success.
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.617 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6c0110c6-3514-41dc-a70b-4131d855aeb3 in datapath 12fba169-770d-497e-8585-0202b9a8b8d2 unbound from our chassis#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.618 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 12fba169-770d-497e-8585-0202b9a8b8d2#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.628 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0d604bca-0818-4450-b9c8-99019a228e4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.629 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap12fba169-71 in ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.630 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap12fba169-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.630 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1727835a-ace7-4cef-9096-7f27f5d02976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.631 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[44b16142-d227-4366-aade-29eaebd9096f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.641 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[b396973f-22e7-4fce-8fee-8715e26d9537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.653 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ecef3e-5f23-46fa-b0bb-c93b3340111a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.682 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[84e82949-93b7-4ac9-8bea-768aacd9c9a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 NetworkManager[55210]: <info>  [1764399643.6894] manager: (tap12fba169-70): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.689 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9dc18a-d8e6-4086-9b37-6ba75381f05e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.720 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5bbaebe3-e758-4bf6-ad7e-3a34e98ec8f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.725 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[15d2b862-f9e7-4397-8839-5ec3b8914790]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 NetworkManager[55210]: <info>  [1764399643.7558] device (tap12fba169-70): carrier: link connected
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.760 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[aa98f457-c383-4265-8148-f25ca7d5a351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.778 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad94e31-810c-4b01-8b74-b8bd7a8a43c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap12fba169-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:44:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510663, 'reachable_time': 24517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222073, 'error': None, 'target': 'ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.791 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3833c55f-d69b-4a0f-a30c-719a3909cf67]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef9:445b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510663, 'tstamp': 510663}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222074, 'error': None, 'target': 'ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.806 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c62d6413-7701-4854-b27c-63360f11504c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap12fba169-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:44:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510663, 'reachable_time': 24517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222075, 'error': None, 'target': 'ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.831 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cf881e0d-215f-4d92-9261-9680cfe85e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.890 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[314e07c5-e952-44b5-a586-5f11588e7008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.892 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12fba169-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.892 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.892 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12fba169-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.927 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:43 np0005539504 kernel: tap12fba169-70: entered promiscuous mode
Nov 29 02:00:43 np0005539504 NetworkManager[55210]: <info>  [1764399643.9285] manager: (tap12fba169-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.930 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.931 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap12fba169-70, col_values=(('external_ids', {'iface-id': '22d08e60-f74d-4752-95a0-59ec773b7d18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.932 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:43Z|00164|binding|INFO|Releasing lport 22d08e60-f74d-4752-95a0-59ec773b7d18 from this chassis (sb_readonly=0)
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.933 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12fba169-770d-497e-8585-0202b9a8b8d2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12fba169-770d-497e-8585-0202b9a8b8d2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.934 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[29e34a23-af04-4c16-84f8-5954abbb3bda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.935 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-12fba169-770d-497e-8585-0202b9a8b8d2
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/12fba169-770d-497e-8585-0202b9a8b8d2.pid.haproxy
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 12fba169-770d-497e-8585-0202b9a8b8d2
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:00:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:43.935 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2', 'env', 'PROCESS_TAG=haproxy-12fba169-770d-497e-8585-0202b9a8b8d2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/12fba169-770d-497e-8585-0202b9a8b8d2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.943 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.955 187156 DEBUG nova.network.neutron [req-64df1421-c250-475d-ac06-9363976afe6d req-09750684-acf8-461c-a29c-da6acf088480 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Updated VIF entry in instance network info cache for port efebde6a-916e-4248-aa3d-459b872a6adb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.955 187156 DEBUG nova.network.neutron [req-64df1421-c250-475d-ac06-9363976afe6d req-09750684-acf8-461c-a29c-da6acf088480 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Updating instance_info_cache with network_info: [{"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:43 np0005539504 nova_compute[187152]: 2025-11-29 07:00:43.997 187156 DEBUG oslo_concurrency.lockutils [req-64df1421-c250-475d-ac06-9363976afe6d req-09750684-acf8-461c-a29c-da6acf088480 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.130 187156 DEBUG nova.compute.manager [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-plugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.130 187156 DEBUG oslo_concurrency.lockutils [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.131 187156 DEBUG oslo_concurrency.lockutils [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.131 187156 DEBUG oslo_concurrency.lockutils [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.131 187156 DEBUG nova.compute.manager [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Processing event network-vif-plugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.324 187156 DEBUG nova.compute.manager [req-725593ce-8475-491c-8d5a-dfff23104483 req-d035098e-8d6f-473c-97e2-8eeeff3a35cd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-plugged-efebde6a-916e-4248-aa3d-459b872a6adb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.325 187156 DEBUG oslo_concurrency.lockutils [req-725593ce-8475-491c-8d5a-dfff23104483 req-d035098e-8d6f-473c-97e2-8eeeff3a35cd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.326 187156 DEBUG oslo_concurrency.lockutils [req-725593ce-8475-491c-8d5a-dfff23104483 req-d035098e-8d6f-473c-97e2-8eeeff3a35cd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.326 187156 DEBUG oslo_concurrency.lockutils [req-725593ce-8475-491c-8d5a-dfff23104483 req-d035098e-8d6f-473c-97e2-8eeeff3a35cd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.326 187156 DEBUG nova.compute.manager [req-725593ce-8475-491c-8d5a-dfff23104483 req-d035098e-8d6f-473c-97e2-8eeeff3a35cd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Processing event network-vif-plugged-efebde6a-916e-4248-aa3d-459b872a6adb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.327 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.333 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399644.3326113, 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.334 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.335 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.345 187156 INFO nova.virt.libvirt.driver [-] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Instance spawned successfully.#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.346 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:00:44 np0005539504 podman[222107]: 2025-11-29 07:00:44.347559696 +0000 UTC m=+0.067282512 container create 3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.369 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.376 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.379 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.379 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.380 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.380 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.381 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.381 187156 DEBUG nova.virt.libvirt.driver [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:44 np0005539504 systemd[1]: Started libpod-conmon-3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4.scope.
Nov 29 02:00:44 np0005539504 podman[222107]: 2025-11-29 07:00:44.308995803 +0000 UTC m=+0.028718619 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:00:44 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:00:44 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62f92a57801a11078cd5368af46ce7d1ff57f39366f112de16e1e7f893797ee5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.423 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:00:44 np0005539504 podman[222107]: 2025-11-29 07:00:44.43597065 +0000 UTC m=+0.155693486 container init 3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:00:44 np0005539504 podman[222107]: 2025-11-29 07:00:44.444680516 +0000 UTC m=+0.164403332 container start 3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.462 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:44 np0005539504 neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2[222122]: [NOTICE]   (222126) : New worker (222128) forked
Nov 29 02:00:44 np0005539504 neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2[222122]: [NOTICE]   (222126) : Loading success.
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.505 187156 INFO nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Took 18.78 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.506 187156 DEBUG nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.543 104164 INFO neutron.agent.ovn.metadata.agent [-] Port efebde6a-916e-4248-aa3d-459b872a6adb in datapath da6c5b7e-45ef-4b7f-8181-81e63563aadf unbound from our chassis#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.544 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network da6c5b7e-45ef-4b7f-8181-81e63563aadf#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.561 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[64f015e7-3dba-434a-b3a6-836b23aa9d22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.601 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ed854f9f-72ba-48fd-9025-682e29330c8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.605 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[aac41af5-1cb0-4415-b6f6-1d3a15f0d287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.638 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c245fa57-108a-4d0f-bbff-a5d2f2684601]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.653 187156 INFO nova.compute.manager [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Took 19.81 seconds to build instance.#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.657 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a974e2d6-4371-4264-8a08-bc3dacfd597c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda6c5b7e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:f3:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 5, 'rx_bytes': 266, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510572, 'reachable_time': 36070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222142, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.676 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2a55f1-bc01-4e0c-a79b-efaaefd9539a]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tapda6c5b7e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510583, 'tstamp': 510583}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222143, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapda6c5b7e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510586, 'tstamp': 510586}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222143, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.678 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda6c5b7e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.677 187156 DEBUG oslo_concurrency.lockutils [None req-25a3130a-c9df-4516-911a-c5133aef3fbe 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.679 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.681 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.682 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda6c5b7e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.682 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.683 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapda6c5b7e-40, col_values=(('external_ids', {'iface-id': 'c1ff3c68-35a1-4f6c-ae08-4f06a1ce6818'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.683 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:44.684 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:44 np0005539504 nova_compute[187152]: 2025-11-29 07:00:44.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:45 np0005539504 nova_compute[187152]: 2025-11-29 07:00:45.483 187156 DEBUG nova.compute.manager [req-8ce4d1b7-e055-4091-9cce-14fe366a7af2 req-25feea19-d380-42bd-ae73-e2f3c830c21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-plugged-cd85ddcc-46bd-4622-955e-12395bfecf41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:45 np0005539504 nova_compute[187152]: 2025-11-29 07:00:45.484 187156 DEBUG oslo_concurrency.lockutils [req-8ce4d1b7-e055-4091-9cce-14fe366a7af2 req-25feea19-d380-42bd-ae73-e2f3c830c21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:45 np0005539504 nova_compute[187152]: 2025-11-29 07:00:45.484 187156 DEBUG oslo_concurrency.lockutils [req-8ce4d1b7-e055-4091-9cce-14fe366a7af2 req-25feea19-d380-42bd-ae73-e2f3c830c21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:45 np0005539504 nova_compute[187152]: 2025-11-29 07:00:45.484 187156 DEBUG oslo_concurrency.lockutils [req-8ce4d1b7-e055-4091-9cce-14fe366a7af2 req-25feea19-d380-42bd-ae73-e2f3c830c21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:45 np0005539504 nova_compute[187152]: 2025-11-29 07:00:45.485 187156 DEBUG nova.compute.manager [req-8ce4d1b7-e055-4091-9cce-14fe366a7af2 req-25feea19-d380-42bd-ae73-e2f3c830c21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] No waiting events found dispatching network-vif-plugged-cd85ddcc-46bd-4622-955e-12395bfecf41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:45 np0005539504 nova_compute[187152]: 2025-11-29 07:00:45.485 187156 WARNING nova.compute.manager [req-8ce4d1b7-e055-4091-9cce-14fe366a7af2 req-25feea19-d380-42bd-ae73-e2f3c830c21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received unexpected event network-vif-plugged-cd85ddcc-46bd-4622-955e-12395bfecf41 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:00:45 np0005539504 podman[222144]: 2025-11-29 07:00:45.724354728 +0000 UTC m=+0.071561448 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.320 187156 DEBUG nova.compute.manager [req-c9a96c2e-d682-44b2-8a37-48e10d46c27b req-e5e3541f-ed1b-4b76-b959-02d833922307 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-plugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.320 187156 DEBUG oslo_concurrency.lockutils [req-c9a96c2e-d682-44b2-8a37-48e10d46c27b req-e5e3541f-ed1b-4b76-b959-02d833922307 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.320 187156 DEBUG oslo_concurrency.lockutils [req-c9a96c2e-d682-44b2-8a37-48e10d46c27b req-e5e3541f-ed1b-4b76-b959-02d833922307 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.320 187156 DEBUG oslo_concurrency.lockutils [req-c9a96c2e-d682-44b2-8a37-48e10d46c27b req-e5e3541f-ed1b-4b76-b959-02d833922307 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.321 187156 DEBUG nova.compute.manager [req-c9a96c2e-d682-44b2-8a37-48e10d46c27b req-e5e3541f-ed1b-4b76-b959-02d833922307 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] No waiting events found dispatching network-vif-plugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.321 187156 WARNING nova.compute.manager [req-c9a96c2e-d682-44b2-8a37-48e10d46c27b req-e5e3541f-ed1b-4b76-b959-02d833922307 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received unexpected event network-vif-plugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.503 187156 DEBUG nova.compute.manager [req-1e44d709-0cd2-440a-b413-e4722dfc41e4 req-6662d926-5507-4b3b-9ba6-37f001ad6f87 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-plugged-efebde6a-916e-4248-aa3d-459b872a6adb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.504 187156 DEBUG oslo_concurrency.lockutils [req-1e44d709-0cd2-440a-b413-e4722dfc41e4 req-6662d926-5507-4b3b-9ba6-37f001ad6f87 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.505 187156 DEBUG oslo_concurrency.lockutils [req-1e44d709-0cd2-440a-b413-e4722dfc41e4 req-6662d926-5507-4b3b-9ba6-37f001ad6f87 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.505 187156 DEBUG oslo_concurrency.lockutils [req-1e44d709-0cd2-440a-b413-e4722dfc41e4 req-6662d926-5507-4b3b-9ba6-37f001ad6f87 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.506 187156 DEBUG nova.compute.manager [req-1e44d709-0cd2-440a-b413-e4722dfc41e4 req-6662d926-5507-4b3b-9ba6-37f001ad6f87 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] No waiting events found dispatching network-vif-plugged-efebde6a-916e-4248-aa3d-459b872a6adb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.506 187156 WARNING nova.compute.manager [req-1e44d709-0cd2-440a-b413-e4722dfc41e4 req-6662d926-5507-4b3b-9ba6-37f001ad6f87 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received unexpected event network-vif-plugged-efebde6a-916e-4248-aa3d-459b872a6adb for instance with vm_state active and task_state None.#033[00m
Nov 29 02:00:46 np0005539504 nova_compute[187152]: 2025-11-29 07:00:46.617 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.539 187156 DEBUG oslo_concurrency.lockutils [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.541 187156 DEBUG oslo_concurrency.lockutils [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.541 187156 DEBUG oslo_concurrency.lockutils [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.541 187156 DEBUG oslo_concurrency.lockutils [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.542 187156 DEBUG oslo_concurrency.lockutils [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.561 187156 INFO nova.compute.manager [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Terminating instance#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.577 187156 DEBUG nova.compute.manager [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:00:47 np0005539504 kernel: tapcd85ddcc-46 (unregistering): left promiscuous mode
Nov 29 02:00:47 np0005539504 NetworkManager[55210]: <info>  [1764399647.6097] device (tapcd85ddcc-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.620 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:47Z|00165|binding|INFO|Releasing lport cd85ddcc-46bd-4622-955e-12395bfecf41 from this chassis (sb_readonly=0)
Nov 29 02:00:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:47Z|00166|binding|INFO|Setting lport cd85ddcc-46bd-4622-955e-12395bfecf41 down in Southbound
Nov 29 02:00:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:47Z|00167|binding|INFO|Removing iface tapcd85ddcc-46 ovn-installed in OVS
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.623 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.634 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 kernel: tap6c0110c6-35 (unregistering): left promiscuous mode
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.636 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:0f:31 10.100.0.154'], port_security=['fa:16:3e:a2:0f:31 10.100.0.154'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.154/24', 'neutron:device_id': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c9e87c2-94a8-4e4d-bf2c-1e90654ac541, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=cd85ddcc-46bd-4622-955e-12395bfecf41) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.638 104164 INFO neutron.agent.ovn.metadata.agent [-] Port cd85ddcc-46bd-4622-955e-12395bfecf41 in datapath da6c5b7e-45ef-4b7f-8181-81e63563aadf unbound from our chassis#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.640 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network da6c5b7e-45ef-4b7f-8181-81e63563aadf#033[00m
Nov 29 02:00:47 np0005539504 NetworkManager[55210]: <info>  [1764399647.6418] device (tap6c0110c6-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.652 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:47Z|00168|binding|INFO|Releasing lport 6c0110c6-3514-41dc-a70b-4131d855aeb3 from this chassis (sb_readonly=0)
Nov 29 02:00:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:47Z|00169|binding|INFO|Setting lport 6c0110c6-3514-41dc-a70b-4131d855aeb3 down in Southbound
Nov 29 02:00:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:47Z|00170|binding|INFO|Removing iface tap6c0110c6-35 ovn-installed in OVS
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.654 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "63f8497a-eaf6-45ec-a251-92e7903aa297" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.655 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cd58c062-ba8c-4670-bc60-b98c7e26b8dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.655 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.662 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:6c:2b 10.100.1.70'], port_security=['fa:16:3e:f9:6c:2b 10.100.1.70'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.70/24', 'neutron:device_id': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-12fba169-770d-497e-8585-0202b9a8b8d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=227534c5-6410-4193-a2f5-6deb44b63d57, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6c0110c6-3514-41dc-a70b-4131d855aeb3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:47 np0005539504 kernel: tapefebde6a-91 (unregistering): left promiscuous mode
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.666 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 NetworkManager[55210]: <info>  [1764399647.6699] device (tapefebde6a-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:00:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:47Z|00171|binding|INFO|Releasing lport efebde6a-916e-4248-aa3d-459b872a6adb from this chassis (sb_readonly=0)
Nov 29 02:00:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:47Z|00172|binding|INFO|Setting lport efebde6a-916e-4248-aa3d-459b872a6adb down in Southbound
Nov 29 02:00:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:00:47Z|00173|binding|INFO|Removing iface tapefebde6a-91 ovn-installed in OVS
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.685 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.688 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.699 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:3b:b8 10.100.0.85'], port_security=['fa:16:3e:48:3b:b8 10.100.0.85'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.85/24', 'neutron:device_id': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c9e87c2-94a8-4e4d-bf2c-1e90654ac541, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=efebde6a-916e-4248-aa3d-459b872a6adb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.700 187156 DEBUG nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.702 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[d9759768-ceec-4ebf-bb6b-592b2ff6a34b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.705 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ef09c760-ba2f-484e-a1eb-f7e3892a804b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.711 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000033.scope: Deactivated successfully.
Nov 29 02:00:47 np0005539504 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000033.scope: Consumed 3.490s CPU time.
Nov 29 02:00:47 np0005539504 systemd-machined[153423]: Machine qemu-27-instance-00000033 terminated.
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.744 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[46e12d30-6903-4334-9199-0413074dce56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.775 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a2c3c9e2-bed4-403e-9f73-3717ee7050e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda6c5b7e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:f3:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510572, 'reachable_time': 36070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222188, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.798 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa80bc7-47cc-4208-b050-e1e8171290fe]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tapda6c5b7e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510583, 'tstamp': 510583}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222189, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapda6c5b7e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510586, 'tstamp': 510586}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222189, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.800 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda6c5b7e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.803 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 NetworkManager[55210]: <info>  [1764399647.8047] manager: (tapcd85ddcc-46): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 29 02:00:47 np0005539504 NetworkManager[55210]: <info>  [1764399647.8165] manager: (tap6c0110c6-35): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Nov 29 02:00:47 np0005539504 NetworkManager[55210]: <info>  [1764399647.8284] manager: (tapefebde6a-91): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.832 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda6c5b7e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.831 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.834 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.834 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapda6c5b7e-40, col_values=(('external_ids', {'iface-id': 'c1ff3c68-35a1-4f6c-ae08-4f06a1ce6818'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.835 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.837 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6c0110c6-3514-41dc-a70b-4131d855aeb3 in datapath 12fba169-770d-497e-8585-0202b9a8b8d2 unbound from our chassis#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.839 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 12fba169-770d-497e-8585-0202b9a8b8d2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.840 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1a770996-5b5e-44b4-be28-e392f95f77e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:47.842 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2 namespace which is not needed anymore#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.864 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.865 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.876 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.876 187156 INFO nova.compute.claims [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.883 187156 INFO nova.virt.libvirt.driver [-] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Instance destroyed successfully.#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.884 187156 DEBUG nova.objects.instance [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lazy-loading 'resources' on Instance uuid 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.950 187156 DEBUG nova.virt.libvirt.vif [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-214136367',display_name='tempest-ServersTestMultiNic-server-214136367',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-214136367',id=51,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:00:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-117s3dq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:00:44Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.951 187156 DEBUG nova.network.os_vif_util [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "cd85ddcc-46bd-4622-955e-12395bfecf41", "address": "fa:16:3e:a2:0f:31", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.154", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd85ddcc-46", "ovs_interfaceid": "cd85ddcc-46bd-4622-955e-12395bfecf41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.952 187156 DEBUG nova.network.os_vif_util [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:0f:31,bridge_name='br-int',has_traffic_filtering=True,id=cd85ddcc-46bd-4622-955e-12395bfecf41,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd85ddcc-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.952 187156 DEBUG os_vif [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:0f:31,bridge_name='br-int',has_traffic_filtering=True,id=cd85ddcc-46bd-4622-955e-12395bfecf41,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd85ddcc-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.954 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.954 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd85ddcc-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.956 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.958 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.962 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7f54d428-a3ac-4f2f-b5f6-fbaa715502e4', 'name': 'tempest-ServersTestMultiNic-server-214136367', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000033', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'user_id': '11e9982557a44d40b2ebaf04bf99c371', 'hostId': '66003949182de8a7f4207792c92c415985802320c983eee42e957d88', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.965 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.966 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.967 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.968 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.969 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.969 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.969 187156 INFO os_vif [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:0f:31,bridge_name='br-int',has_traffic_filtering=True,id=cd85ddcc-46bd-4622-955e-12395bfecf41,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd85ddcc-46')#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.970 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.971 187156 DEBUG nova.virt.libvirt.vif [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-214136367',display_name='tempest-ServersTestMultiNic-server-214136367',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-214136367',id=51,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:00:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-117s3dq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:00:44Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.971 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.971 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.971 187156 DEBUG nova.network.os_vif_util [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.972 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.972 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.972 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.972 187156 DEBUG nova.network.os_vif_util [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:6c:2b,bridge_name='br-int',has_traffic_filtering=True,id=6c0110c6-3514-41dc-a70b-4131d855aeb3,network=Network(12fba169-770d-497e-8585-0202b9a8b8d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c0110c6-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.972 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestMultiNic-server-214136367>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestMultiNic-server-214136367>]
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.973 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.973 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.973 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersTestMultiNic-server-214136367>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestMultiNic-server-214136367>]
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.973 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.973 187156 DEBUG os_vif [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:6c:2b,bridge_name='br-int',has_traffic_filtering=True,id=6c0110c6-3514-41dc-a70b-4131d855aeb3,network=Network(12fba169-770d-497e-8585-0202b9a8b8d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c0110c6-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.974 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.974 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.974 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.975 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.975 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.975 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.976 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c0110c6-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.976 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.976 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.977 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.981 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.981 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.982 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.982 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersTestMultiNic-server-214136367>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestMultiNic-server-214136367>]
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.984 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.985 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.986 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.986 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.986 187156 INFO os_vif [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:6c:2b,bridge_name='br-int',has_traffic_filtering=True,id=6c0110c6-3514-41dc-a70b-4131d855aeb3,network=Network(12fba169-770d-497e-8585-0202b9a8b8d2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c0110c6-35')#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.987 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.987 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.987 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestMultiNic-server-214136367>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestMultiNic-server-214136367>]
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.987 187156 DEBUG nova.virt.libvirt.vif [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-214136367',display_name='tempest-ServersTestMultiNic-server-214136367',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-214136367',id=51,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:00:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-117s3dq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:00:44Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.987 187156 DEBUG nova.network.os_vif_util [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.988 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.988 187156 DEBUG nova.network.os_vif_util [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:3b:b8,bridge_name='br-int',has_traffic_filtering=True,id=efebde6a-916e-4248-aa3d-459b872a6adb,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefebde6a-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.989 187156 DEBUG os_vif [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:3b:b8,bridge_name='br-int',has_traffic_filtering=True,id=efebde6a-916e-4248-aa3d-459b872a6adb,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefebde6a-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.989 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.990 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.990 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefebde6a-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.991 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.991 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.992 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:00:47 np0005539504 neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2[222122]: [NOTICE]   (222126) : haproxy version is 2.8.14-c23fe91
Nov 29 02:00:47 np0005539504 neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2[222122]: [NOTICE]   (222126) : path to executable is /usr/sbin/haproxy
Nov 29 02:00:47 np0005539504 neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2[222122]: [WARNING]  (222126) : Exiting Master process...
Nov 29 02:00:47 np0005539504 neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2[222122]: [WARNING]  (222126) : Exiting Master process...
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.994 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.994 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.994 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:00:47.995 12 DEBUG ceilometer.compute.pollsters [-] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000033, id=7f54d428-a3ac-4f2f-b5f6-fbaa715502e4>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:00:47 np0005539504 neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2[222122]: [ALERT]    (222126) : Current worker (222128) exited with code 143 (Terminated)
Nov 29 02:00:47 np0005539504 neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2[222122]: [WARNING]  (222126) : All workers exited. Exiting... (0)
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.996 187156 INFO os_vif [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:3b:b8,bridge_name='br-int',has_traffic_filtering=True,id=efebde6a-916e-4248-aa3d-459b872a6adb,network=Network(da6c5b7e-45ef-4b7f-8181-81e63563aadf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefebde6a-91')#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.997 187156 INFO nova.virt.libvirt.driver [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Deleting instance files /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4_del#033[00m
Nov 29 02:00:47 np0005539504 nova_compute[187152]: 2025-11-29 07:00:47.997 187156 INFO nova.virt.libvirt.driver [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Deletion of /var/lib/nova/instances/7f54d428-a3ac-4f2f-b5f6-fbaa715502e4_del complete#033[00m
Nov 29 02:00:48 np0005539504 systemd[1]: libpod-3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4.scope: Deactivated successfully.
Nov 29 02:00:48 np0005539504 podman[222242]: 2025-11-29 07:00:48.0047429 +0000 UTC m=+0.050875308 container died 3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:00:48 np0005539504 systemd[1]: var-lib-containers-storage-overlay-62f92a57801a11078cd5368af46ce7d1ff57f39366f112de16e1e7f893797ee5-merged.mount: Deactivated successfully.
Nov 29 02:00:48 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4-userdata-shm.mount: Deactivated successfully.
Nov 29 02:00:48 np0005539504 podman[222242]: 2025-11-29 07:00:48.043250273 +0000 UTC m=+0.089382671 container cleanup 3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:00:48 np0005539504 systemd[1]: libpod-conmon-3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4.scope: Deactivated successfully.
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.108 187156 INFO nova.compute.manager [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Took 0.53 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.109 187156 DEBUG oslo.service.loopingcall [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.109 187156 DEBUG nova.compute.manager [-] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.109 187156 DEBUG nova.network.neutron [-] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:00:48 np0005539504 podman[222274]: 2025-11-29 07:00:48.115891469 +0000 UTC m=+0.049499671 container remove 3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.120 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[36609249-9916-4c18-931b-1c90ab5c9dfb]: (4, ('Sat Nov 29 07:00:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2 (3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4)\n3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4\nSat Nov 29 07:00:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2 (3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4)\n3a999ac11ad62fdc0357239706870df8061a394db70989fafc09d9a0204d6ec4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.122 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f3615c9e-625b-4798-9808-d2ad086b6d91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.122 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12fba169-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:48 np0005539504 kernel: tap12fba169-70: left promiscuous mode
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.125 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.136 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.137 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.140 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[07acb84e-4214-4104-899a-b8c9e744eaf3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.164 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f515a9a8-df9e-46be-9dea-8de0ab4274d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.165 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[978fb51c-0a64-4b5d-be2b-f9f0caa6f4a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.185 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[42e7efb3-309e-4ba4-b5b9-4bba853a7c09]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510655, 'reachable_time': 42601, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222289, 'error': None, 'target': 'ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.188 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-12fba169-770d-497e-8585-0202b9a8b8d2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.188 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[481a2ddf-4193-4d33-a9ca-50c1579a220d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.189 104164 INFO neutron.agent.ovn.metadata.agent [-] Port efebde6a-916e-4248-aa3d-459b872a6adb in datapath da6c5b7e-45ef-4b7f-8181-81e63563aadf unbound from our chassis#033[00m
Nov 29 02:00:48 np0005539504 systemd[1]: run-netns-ovnmeta\x2d12fba169\x2d770d\x2d497e\x2d8585\x2d0202b9a8b8d2.mount: Deactivated successfully.
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.191 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network da6c5b7e-45ef-4b7f-8181-81e63563aadf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.191 187156 DEBUG nova.compute.provider_tree [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.192 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[435583ec-7eb7-4995-a648-b3d990695978]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.192 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf namespace which is not needed anymore#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.234 187156 DEBUG nova.scheduler.client.report [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.269 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.404s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.269 187156 DEBUG nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:00:48 np0005539504 neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf[222048]: [NOTICE]   (222052) : haproxy version is 2.8.14-c23fe91
Nov 29 02:00:48 np0005539504 neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf[222048]: [NOTICE]   (222052) : path to executable is /usr/sbin/haproxy
Nov 29 02:00:48 np0005539504 neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf[222048]: [WARNING]  (222052) : Exiting Master process...
Nov 29 02:00:48 np0005539504 neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf[222048]: [ALERT]    (222052) : Current worker (222054) exited with code 143 (Terminated)
Nov 29 02:00:48 np0005539504 neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf[222048]: [WARNING]  (222052) : All workers exited. Exiting... (0)
Nov 29 02:00:48 np0005539504 systemd[1]: libpod-1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8.scope: Deactivated successfully.
Nov 29 02:00:48 np0005539504 podman[222307]: 2025-11-29 07:00:48.339811921 +0000 UTC m=+0.049028858 container died 1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:00:48 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8-userdata-shm.mount: Deactivated successfully.
Nov 29 02:00:48 np0005539504 systemd[1]: var-lib-containers-storage-overlay-37df919613365b87bab489d7b0efbb324c788945e343b7ccbc59de834bc455d2-merged.mount: Deactivated successfully.
Nov 29 02:00:48 np0005539504 podman[222307]: 2025-11-29 07:00:48.376190716 +0000 UTC m=+0.085407643 container cleanup 1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.378 187156 DEBUG nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.378 187156 DEBUG nova.network.neutron [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:00:48 np0005539504 systemd[1]: libpod-conmon-1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8.scope: Deactivated successfully.
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.410 187156 INFO nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.443 187156 DEBUG nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:00:48 np0005539504 podman[222333]: 2025-11-29 07:00:48.44615762 +0000 UTC m=+0.046902591 container remove 1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.453 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9170f9aa-be4d-4a94-b95b-b291621b4c16]: (4, ('Sat Nov 29 07:00:48 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf (1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8)\n1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8\nSat Nov 29 07:00:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf (1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8)\n1d3c23da2080765e44971e2bd0c690e04b08e98edb53dadce03dbc6190755cb8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.456 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8e396137-644e-4a17-baf6-aeb8e125bd65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.457 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda6c5b7e-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:48 np0005539504 kernel: tapda6c5b7e-40: left promiscuous mode
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.459 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.470 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.473 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e15d24c2-5497-499f-bf43-ee504c134f5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.487 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4d1048-b776-49e2-b4d6-865906c9821e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.488 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bb23e8c0-c9ec-4414-ab28-59dd6cb5152d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.505 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[01dee72f-aa90-4f24-b04d-c0a4009eeda6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510564, 'reachable_time': 19244, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222348, 'error': None, 'target': 'ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.508 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-da6c5b7e-45ef-4b7f-8181-81e63563aadf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:00:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:48.508 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[39d91370-8c24-4ac9-b82c-5c2924261178]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.648 187156 DEBUG nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.650 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.650 187156 INFO nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Creating image(s)#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.651 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "/var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.651 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.652 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.666 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.733 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.734 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.734 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.746 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.817 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.818 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.869 187156 DEBUG nova.compute.manager [req-99987c95-bab5-4269-8c77-249d9724d891 req-28049479-085d-4655-bccf-c7cc256c1d22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-unplugged-cd85ddcc-46bd-4622-955e-12395bfecf41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.870 187156 DEBUG oslo_concurrency.lockutils [req-99987c95-bab5-4269-8c77-249d9724d891 req-28049479-085d-4655-bccf-c7cc256c1d22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.870 187156 DEBUG oslo_concurrency.lockutils [req-99987c95-bab5-4269-8c77-249d9724d891 req-28049479-085d-4655-bccf-c7cc256c1d22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.871 187156 DEBUG oslo_concurrency.lockutils [req-99987c95-bab5-4269-8c77-249d9724d891 req-28049479-085d-4655-bccf-c7cc256c1d22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.871 187156 DEBUG nova.compute.manager [req-99987c95-bab5-4269-8c77-249d9724d891 req-28049479-085d-4655-bccf-c7cc256c1d22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] No waiting events found dispatching network-vif-unplugged-cd85ddcc-46bd-4622-955e-12395bfecf41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.871 187156 DEBUG nova.compute.manager [req-99987c95-bab5-4269-8c77-249d9724d891 req-28049479-085d-4655-bccf-c7cc256c1d22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-unplugged-cd85ddcc-46bd-4622-955e-12395bfecf41 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.873 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.873 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.874 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.941 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.942 187156 DEBUG nova.virt.disk.api [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Checking if we can resize image /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.942 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.960 187156 DEBUG nova.policy [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.976 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.976 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 29 02:00:48 np0005539504 nova_compute[187152]: 2025-11-29 07:00:48.976 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.002 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.003 187156 DEBUG nova.virt.disk.api [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Cannot resize image /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.003 187156 DEBUG nova.objects.instance [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'migration_context' on Instance uuid 63f8497a-eaf6-45ec-a251-92e7903aa297 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.026 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.026 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Ensure instance console log exists: /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.027 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.027 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.027 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:49 np0005539504 systemd[1]: run-netns-ovnmeta\x2dda6c5b7e\x2d45ef\x2d4b7f\x2d8181\x2d81e63563aadf.mount: Deactivated successfully.
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.082 187156 DEBUG nova.compute.manager [req-295ac35c-cd76-4c30-bb74-89345b49e4ea req-dd0116bf-bc92-49a3-9e93-f876c7846b9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-unplugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.083 187156 DEBUG oslo_concurrency.lockutils [req-295ac35c-cd76-4c30-bb74-89345b49e4ea req-dd0116bf-bc92-49a3-9e93-f876c7846b9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.083 187156 DEBUG oslo_concurrency.lockutils [req-295ac35c-cd76-4c30-bb74-89345b49e4ea req-dd0116bf-bc92-49a3-9e93-f876c7846b9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.084 187156 DEBUG oslo_concurrency.lockutils [req-295ac35c-cd76-4c30-bb74-89345b49e4ea req-dd0116bf-bc92-49a3-9e93-f876c7846b9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.084 187156 DEBUG nova.compute.manager [req-295ac35c-cd76-4c30-bb74-89345b49e4ea req-dd0116bf-bc92-49a3-9e93-f876c7846b9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] No waiting events found dispatching network-vif-unplugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.085 187156 DEBUG nova.compute.manager [req-295ac35c-cd76-4c30-bb74-89345b49e4ea req-dd0116bf-bc92-49a3-9e93-f876c7846b9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-unplugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.464 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:49 np0005539504 nova_compute[187152]: 2025-11-29 07:00:49.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:00:50 np0005539504 podman[222364]: 2025-11-29 07:00:50.732308538 +0000 UTC m=+0.068696570 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:00:50 np0005539504 nova_compute[187152]: 2025-11-29 07:00:50.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.013 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.015 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.015 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.015 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.216 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.218 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5724MB free_disk=73.20082473754883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.218 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.219 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.496 187156 DEBUG nova.compute.manager [req-c642f816-78ed-450d-9c0a-7d1e14b81232 req-5b8f7a07-988c-4509-b160-9a059f011358 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-plugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.497 187156 DEBUG oslo_concurrency.lockutils [req-c642f816-78ed-450d-9c0a-7d1e14b81232 req-5b8f7a07-988c-4509-b160-9a059f011358 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.497 187156 DEBUG oslo_concurrency.lockutils [req-c642f816-78ed-450d-9c0a-7d1e14b81232 req-5b8f7a07-988c-4509-b160-9a059f011358 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.497 187156 DEBUG oslo_concurrency.lockutils [req-c642f816-78ed-450d-9c0a-7d1e14b81232 req-5b8f7a07-988c-4509-b160-9a059f011358 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.498 187156 DEBUG nova.compute.manager [req-c642f816-78ed-450d-9c0a-7d1e14b81232 req-5b8f7a07-988c-4509-b160-9a059f011358 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] No waiting events found dispatching network-vif-plugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.498 187156 WARNING nova.compute.manager [req-c642f816-78ed-450d-9c0a-7d1e14b81232 req-5b8f7a07-988c-4509-b160-9a059f011358 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received unexpected event network-vif-plugged-6c0110c6-3514-41dc-a70b-4131d855aeb3 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.505 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.506 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 63f8497a-eaf6-45ec-a251-92e7903aa297 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.506 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.506 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.556 187156 DEBUG nova.compute.manager [req-438274d9-4165-4d8a-aba6-f93dfa598dd7 req-621ad94d-2d1b-410a-8e94-4723928ed84b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-plugged-cd85ddcc-46bd-4622-955e-12395bfecf41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.557 187156 DEBUG oslo_concurrency.lockutils [req-438274d9-4165-4d8a-aba6-f93dfa598dd7 req-621ad94d-2d1b-410a-8e94-4723928ed84b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.557 187156 DEBUG oslo_concurrency.lockutils [req-438274d9-4165-4d8a-aba6-f93dfa598dd7 req-621ad94d-2d1b-410a-8e94-4723928ed84b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.558 187156 DEBUG oslo_concurrency.lockutils [req-438274d9-4165-4d8a-aba6-f93dfa598dd7 req-621ad94d-2d1b-410a-8e94-4723928ed84b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.558 187156 DEBUG nova.compute.manager [req-438274d9-4165-4d8a-aba6-f93dfa598dd7 req-621ad94d-2d1b-410a-8e94-4723928ed84b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] No waiting events found dispatching network-vif-plugged-cd85ddcc-46bd-4622-955e-12395bfecf41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.558 187156 WARNING nova.compute.manager [req-438274d9-4165-4d8a-aba6-f93dfa598dd7 req-621ad94d-2d1b-410a-8e94-4723928ed84b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received unexpected event network-vif-plugged-cd85ddcc-46bd-4622-955e-12395bfecf41 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.685 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:00:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:00:51.688 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.724 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.787 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:00:51 np0005539504 nova_compute[187152]: 2025-11-29 07:00:51.788 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:52 np0005539504 nova_compute[187152]: 2025-11-29 07:00:52.862 187156 DEBUG nova.compute.manager [req-fd5ef518-ef2d-436d-9354-19e7e431abce req-c87048f7-f35c-4d8f-8476-9a553cf8c8e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-unplugged-efebde6a-916e-4248-aa3d-459b872a6adb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:52 np0005539504 nova_compute[187152]: 2025-11-29 07:00:52.863 187156 DEBUG oslo_concurrency.lockutils [req-fd5ef518-ef2d-436d-9354-19e7e431abce req-c87048f7-f35c-4d8f-8476-9a553cf8c8e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:52 np0005539504 nova_compute[187152]: 2025-11-29 07:00:52.863 187156 DEBUG oslo_concurrency.lockutils [req-fd5ef518-ef2d-436d-9354-19e7e431abce req-c87048f7-f35c-4d8f-8476-9a553cf8c8e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:52 np0005539504 nova_compute[187152]: 2025-11-29 07:00:52.863 187156 DEBUG oslo_concurrency.lockutils [req-fd5ef518-ef2d-436d-9354-19e7e431abce req-c87048f7-f35c-4d8f-8476-9a553cf8c8e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:52 np0005539504 nova_compute[187152]: 2025-11-29 07:00:52.863 187156 DEBUG nova.compute.manager [req-fd5ef518-ef2d-436d-9354-19e7e431abce req-c87048f7-f35c-4d8f-8476-9a553cf8c8e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] No waiting events found dispatching network-vif-unplugged-efebde6a-916e-4248-aa3d-459b872a6adb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:52 np0005539504 nova_compute[187152]: 2025-11-29 07:00:52.864 187156 DEBUG nova.compute.manager [req-fd5ef518-ef2d-436d-9354-19e7e431abce req-c87048f7-f35c-4d8f-8476-9a553cf8c8e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-unplugged-efebde6a-916e-4248-aa3d-459b872a6adb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:00:52 np0005539504 nova_compute[187152]: 2025-11-29 07:00:52.994 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:53 np0005539504 nova_compute[187152]: 2025-11-29 07:00:53.778 187156 DEBUG nova.network.neutron [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Successfully created port: 6f4282c7-128e-4a36-ac72-16c3431d2be7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:00:54 np0005539504 nova_compute[187152]: 2025-11-29 07:00:54.508 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:54 np0005539504 nova_compute[187152]: 2025-11-29 07:00:54.789 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.333 187156 DEBUG nova.compute.manager [req-f87d8e97-2efd-44f3-b953-1938b822f902 req-66d29a5e-81eb-4b82-891a-f5474d20acc0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-plugged-efebde6a-916e-4248-aa3d-459b872a6adb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.333 187156 DEBUG oslo_concurrency.lockutils [req-f87d8e97-2efd-44f3-b953-1938b822f902 req-66d29a5e-81eb-4b82-891a-f5474d20acc0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.334 187156 DEBUG oslo_concurrency.lockutils [req-f87d8e97-2efd-44f3-b953-1938b822f902 req-66d29a5e-81eb-4b82-891a-f5474d20acc0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.334 187156 DEBUG oslo_concurrency.lockutils [req-f87d8e97-2efd-44f3-b953-1938b822f902 req-66d29a5e-81eb-4b82-891a-f5474d20acc0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.334 187156 DEBUG nova.compute.manager [req-f87d8e97-2efd-44f3-b953-1938b822f902 req-66d29a5e-81eb-4b82-891a-f5474d20acc0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] No waiting events found dispatching network-vif-plugged-efebde6a-916e-4248-aa3d-459b872a6adb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.334 187156 WARNING nova.compute.manager [req-f87d8e97-2efd-44f3-b953-1938b822f902 req-66d29a5e-81eb-4b82-891a-f5474d20acc0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received unexpected event network-vif-plugged-efebde6a-916e-4248-aa3d-459b872a6adb for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.399 187156 DEBUG nova.compute.manager [req-89c504c6-113e-4ee3-a407-faa9b8a63db6 req-0284e3de-7dbc-4308-9ef3-9b0757c02fc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-deleted-cd85ddcc-46bd-4622-955e-12395bfecf41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.400 187156 INFO nova.compute.manager [req-89c504c6-113e-4ee3-a407-faa9b8a63db6 req-0284e3de-7dbc-4308-9ef3-9b0757c02fc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Neutron deleted interface cd85ddcc-46bd-4622-955e-12395bfecf41; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.400 187156 DEBUG nova.network.neutron [req-89c504c6-113e-4ee3-a407-faa9b8a63db6 req-0284e3de-7dbc-4308-9ef3-9b0757c02fc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Updating instance_info_cache with network_info: [{"id": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "address": "fa:16:3e:f9:6c:2b", "network": {"id": "12fba169-770d-497e-8585-0202b9a8b8d2", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1741447266", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c0110c6-35", "ovs_interfaceid": "6c0110c6-3514-41dc-a70b-4131d855aeb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "efebde6a-916e-4248-aa3d-459b872a6adb", "address": "fa:16:3e:48:3b:b8", "network": {"id": "da6c5b7e-45ef-4b7f-8181-81e63563aadf", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1648696827", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefebde6a-91", "ovs_interfaceid": "efebde6a-916e-4248-aa3d-459b872a6adb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:55 np0005539504 nova_compute[187152]: 2025-11-29 07:00:55.430 187156 DEBUG nova.compute.manager [req-89c504c6-113e-4ee3-a407-faa9b8a63db6 req-0284e3de-7dbc-4308-9ef3-9b0757c02fc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Detach interface failed, port_id=cd85ddcc-46bd-4622-955e-12395bfecf41, reason: Instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:00:56 np0005539504 nova_compute[187152]: 2025-11-29 07:00:56.919 187156 DEBUG nova.network.neutron [-] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:56 np0005539504 nova_compute[187152]: 2025-11-29 07:00:56.977 187156 INFO nova.compute.manager [-] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Took 8.87 seconds to deallocate network for instance.#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.098 187156 DEBUG oslo_concurrency.lockutils [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.099 187156 DEBUG oslo_concurrency.lockutils [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.226 187156 DEBUG nova.compute.provider_tree [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.255 187156 DEBUG nova.scheduler.client.report [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.284 187156 DEBUG oslo_concurrency.lockutils [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.327 187156 INFO nova.scheduler.client.report [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Deleted allocations for instance 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.520 187156 DEBUG oslo_concurrency.lockutils [None req-3fde9bf2-3ff9-4585-a935-cf91a42b684d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "7f54d428-a3ac-4f2f-b5f6-fbaa715502e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.689 187156 DEBUG nova.compute.manager [req-6a27d3bd-7344-4c72-b3eb-9964861350b5 req-44ad2eb1-9faa-4eee-8cbb-55f7c92eb176 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-deleted-6c0110c6-3514-41dc-a70b-4131d855aeb3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.689 187156 DEBUG nova.compute.manager [req-6a27d3bd-7344-4c72-b3eb-9964861350b5 req-44ad2eb1-9faa-4eee-8cbb-55f7c92eb176 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Received event network-vif-deleted-efebde6a-916e-4248-aa3d-459b872a6adb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.710 187156 DEBUG nova.network.neutron [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Successfully updated port: 6f4282c7-128e-4a36-ac72-16c3431d2be7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.724 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "refresh_cache-63f8497a-eaf6-45ec-a251-92e7903aa297" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.724 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquired lock "refresh_cache-63f8497a-eaf6-45ec-a251-92e7903aa297" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.725 187156 DEBUG nova.network.neutron [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:00:57 np0005539504 nova_compute[187152]: 2025-11-29 07:00:57.996 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:58 np0005539504 nova_compute[187152]: 2025-11-29 07:00:58.088 187156 DEBUG nova.network.neutron [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.511 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.815 187156 DEBUG nova.network.neutron [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Updating instance_info_cache with network_info: [{"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.844 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Releasing lock "refresh_cache-63f8497a-eaf6-45ec-a251-92e7903aa297" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.845 187156 DEBUG nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Instance network_info: |[{"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.849 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Start _get_guest_xml network_info=[{"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.854 187156 DEBUG nova.compute.manager [req-ed5b4fdf-9bd6-4e01-ae54-58b71a615881 req-0c0a8f0c-83cd-48c3-bcab-ca9eb0725bde 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Received event network-changed-6f4282c7-128e-4a36-ac72-16c3431d2be7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.855 187156 DEBUG nova.compute.manager [req-ed5b4fdf-9bd6-4e01-ae54-58b71a615881 req-0c0a8f0c-83cd-48c3-bcab-ca9eb0725bde 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Refreshing instance network info cache due to event network-changed-6f4282c7-128e-4a36-ac72-16c3431d2be7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.855 187156 DEBUG oslo_concurrency.lockutils [req-ed5b4fdf-9bd6-4e01-ae54-58b71a615881 req-0c0a8f0c-83cd-48c3-bcab-ca9eb0725bde 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-63f8497a-eaf6-45ec-a251-92e7903aa297" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.855 187156 DEBUG oslo_concurrency.lockutils [req-ed5b4fdf-9bd6-4e01-ae54-58b71a615881 req-0c0a8f0c-83cd-48c3-bcab-ca9eb0725bde 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-63f8497a-eaf6-45ec-a251-92e7903aa297" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.856 187156 DEBUG nova.network.neutron [req-ed5b4fdf-9bd6-4e01-ae54-58b71a615881 req-0c0a8f0c-83cd-48c3-bcab-ca9eb0725bde 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Refreshing network info cache for port 6f4282c7-128e-4a36-ac72-16c3431d2be7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.862 187156 WARNING nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.867 187156 DEBUG nova.virt.libvirt.host [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.868 187156 DEBUG nova.virt.libvirt.host [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.871 187156 DEBUG nova.virt.libvirt.host [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.872 187156 DEBUG nova.virt.libvirt.host [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.873 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.874 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.874 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.874 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.875 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.875 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.875 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.875 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.876 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.876 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.876 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.876 187156 DEBUG nova.virt.hardware [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
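The topology lines above show nova.virt.hardware enumerating CPU topologies for a 1-vCPU flavor with no flavor or image preferences (the 0:0:0 limits), ending with the single candidate 1:1:1. A minimal sketch of that enumeration, assuming the simplification that a topology is any (sockets, cores, threads) triple whose product equals the vCPU count within the 65536 default limits (this is an illustration, not Nova's actual `_get_possible_cpu_topologies` code):

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals vcpus.

    Simplified sketch of the selection seen in the log: with no flavor/image
    preferences set, every exact factorization within the limits qualifies.
    """
    topologies = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topologies.append((s, c, t))
    return topologies
```

For `vcpus=1` this yields only `(1, 1, 1)`, matching the "Got 1 possible topologies" line in the log.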
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.881 187156 DEBUG nova.virt.libvirt.vif [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-684406623',display_name='tempest-ServersAdminTestJSON-server-684406623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-684406623',id=54,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-8t85hl1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-10877
44064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:48Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=63f8497a-eaf6-45ec-a251-92e7903aa297,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.882 187156 DEBUG nova.network.os_vif_util [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.883 187156 DEBUG nova.network.os_vif_util [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:5f:f9,bridge_name='br-int',has_traffic_filtering=True,id=6f4282c7-128e-4a36-ac72-16c3431d2be7,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f4282c7-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.884 187156 DEBUG nova.objects.instance [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63f8497a-eaf6-45ec-a251-92e7903aa297 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.918 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <uuid>63f8497a-eaf6-45ec-a251-92e7903aa297</uuid>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <name>instance-00000036</name>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersAdminTestJSON-server-684406623</nova:name>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:00:59</nova:creationTime>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:        <nova:user uuid="cd616d4c2eb44fe0a0da2df1690c0e21">tempest-ServersAdminTestJSON-1087744064-project-member</nova:user>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:        <nova:project uuid="80b4126e17a14d73b40158a57f19d091">tempest-ServersAdminTestJSON-1087744064</nova:project>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:        <nova:port uuid="6f4282c7-128e-4a36-ac72-16c3431d2be7">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <entry name="serial">63f8497a-eaf6-45ec-a251-92e7903aa297</entry>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <entry name="uuid">63f8497a-eaf6-45ec-a251-92e7903aa297</entry>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk.config"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:6e:5f:f9"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <target dev="tap6f4282c7-12"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/console.log" append="off"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:00:59 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:00:59 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:00:59 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:00:59 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
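The domain XML logged by `_get_guest_xml` above can be inspected with nothing more than the Python standard library. A small sketch, using a trimmed copy of the XML from the log (the `nova:` metadata namespace is omitted here for brevity); note that libvirt's `<memory>` element is in KiB, which is why 128 MiB appears as 131072:

```python
import xml.etree.ElementTree as ET

# Trimmed version of the domain XML emitted in the log above.
domain_xml = """
<domain type="kvm">
  <uuid>63f8497a-eaf6-45ec-a251-92e7903aa297</uuid>
  <name>instance-00000036</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <cpu mode="custom" match="exact">
    <model>Nehalem</model>
    <topology sockets="1" cores="1" threads="1"/>
  </cpu>
</domain>
"""

root = ET.fromstring(domain_xml)
memory_kib = int(root.findtext("memory"))   # libvirt memory unit: KiB
vcpus = int(root.findtext("vcpu"))
cpu_model = root.find("cpu/model").text
print(f"{memory_kib // 1024} MiB, {vcpus} vCPU, CPU model {cpu_model}")
```

The same parsing works against `virsh dumpxml instance-00000036` output on the compute host.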
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.919 187156 DEBUG nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Preparing to wait for external event network-vif-plugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.919 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.919 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.919 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
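The three oslo_concurrency lines above show a named lock (`<instance-uuid>-events`) taken and released around event registration, so that concurrent waiters and the incoming `network-vif-plugged` notification agree on one event object. A minimal stand-in for that pattern using stdlib threading (the helper names here are hypothetical, not oslo.concurrency's API):

```python
import threading

_locks = {}
_registry_guard = threading.Lock()
_events = {}

def _named_lock(name):
    """One lock per name, created on first use (sketch of lockutils' named locks)."""
    with _registry_guard:
        return _locks.setdefault(name, threading.Lock())

def prepare_for_instance_event(instance_uuid, event_name):
    """Mirror the '<uuid>-events' acquire/release seen in the log: create or
    fetch the threading.Event a waiter will block on until the hypervisor
    or Neutron reports the external event."""
    with _named_lock(f"{instance_uuid}-events"):
        return _events.setdefault(instance_uuid, {}).setdefault(
            event_name, threading.Event())
```

Two callers asking for the same (instance, event) pair get the identical `Event`, so whichever side arrives second simply sets or waits on it.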
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.921 187156 DEBUG nova.virt.libvirt.vif [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-684406623',display_name='tempest-ServersAdminTestJSON-server-684406623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-684406623',id=54,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-8t85hl1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTest
JSON-1087744064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:48Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=63f8497a-eaf6-45ec-a251-92e7903aa297,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.921 187156 DEBUG nova.network.os_vif_util [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.922 187156 DEBUG nova.network.os_vif_util [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:5f:f9,bridge_name='br-int',has_traffic_filtering=True,id=6f4282c7-128e-4a36-ac72-16c3431d2be7,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f4282c7-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.923 187156 DEBUG os_vif [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:5f:f9,bridge_name='br-int',has_traffic_filtering=True,id=6f4282c7-128e-4a36-ac72-16c3431d2be7,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f4282c7-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.923 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.924 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.924 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.927 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.927 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6f4282c7-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.928 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6f4282c7-12, col_values=(('external_ids', {'iface-id': '6f4282c7-128e-4a36-ac72-16c3431d2be7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6e:5f:f9', 'vm-uuid': '63f8497a-eaf6-45ec-a251-92e7903aa297'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.930 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:59 np0005539504 NetworkManager[55210]: <info>  [1764399659.9314] manager: (tap6f4282c7-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.932 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.938 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.939 187156 INFO os_vif [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:5f:f9,bridge_name='br-int',has_traffic_filtering=True,id=6f4282c7-128e-4a36-ac72-16c3431d2be7,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f4282c7-12')#033[00m
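The ovsdbapp transaction above (AddBridgeCommand, AddPortCommand, DbSetCommand) is what os-vif issues directly against ovsdb-server. For manual inspection or reproduction, roughly equivalent `ovs-vsctl` invocations would look like the following; this is an illustration of the same OVSDB writes, not a command os-vif actually runs:

```shell
# Idempotent bridge creation, matching AddBridgeCommand(may_exist=True).
ovs-vsctl --may-exist add-br br-int -- set Bridge br-int datapath_type=system

# Add the tap port and set the external_ids OVN uses to bind the port,
# matching AddPortCommand + DbSetCommand in the log.
ovs-vsctl --may-exist add-port br-int tap6f4282c7-12 \
  -- set Interface tap6f4282c7-12 \
     external_ids:iface-id=6f4282c7-128e-4a36-ac72-16c3431d2be7 \
     external_ids:iface-status=active \
     external_ids:attached-mac='"fa:16:3e:6e:5f:f9"' \
     external_ids:vm-uuid=63f8497a-eaf6-45ec-a251-92e7903aa297
```

`ovs-vsctl list Interface tap6f4282c7-12` afterwards shows the same `external_ids` map that ovn-controller watches to report the port as bound, which is what ultimately triggers the `network-vif-plugged` event the compute manager is waiting for above.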
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.991 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.992 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.992 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No VIF found with MAC fa:16:3e:6e:5f:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:00:59 np0005539504 nova_compute[187152]: 2025-11-29 07:00:59.993 187156 INFO nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Using config drive#033[00m
Nov 29 02:01:00 np0005539504 nova_compute[187152]: 2025-11-29 07:01:00.273 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:00 np0005539504 podman[222389]: 2025-11-29 07:01:00.730581922 +0000 UTC m=+0.063330975 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.076 187156 INFO nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Creating config drive at /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk.config#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.082 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwbfjr249 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.207 187156 DEBUG oslo_concurrency.processutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwbfjr249" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:01 np0005539504 kernel: tap6f4282c7-12: entered promiscuous mode
Nov 29 02:01:01 np0005539504 NetworkManager[55210]: <info>  [1764399661.2721] manager: (tap6f4282c7-12): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Nov 29 02:01:01 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:01Z|00174|binding|INFO|Claiming lport 6f4282c7-128e-4a36-ac72-16c3431d2be7 for this chassis.
Nov 29 02:01:01 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:01Z|00175|binding|INFO|6f4282c7-128e-4a36-ac72-16c3431d2be7: Claiming fa:16:3e:6e:5f:f9 10.100.0.5
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.274 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.300 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:5f:f9 10.100.0.5'], port_security=['fa:16:3e:6e:5f:f9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '2', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6f4282c7-128e-4a36-ac72-16c3431d2be7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:01:01 np0005539504 systemd-udevd[222425]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.304 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6f4282c7-128e-4a36-ac72-16c3431d2be7 in datapath b97f3d85-11c0-4475-aea6-e8da158df42a bound to our chassis#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.305 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.317 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50af9ece-e173-4418-a211-63de691af3b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.318 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb97f3d85-11 in ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.320 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb97f3d85-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.320 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa38194-fa40-4165-bced-097b9b64e04d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.321 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8a200846-c775-449a-8547-4396f6b08668]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 NetworkManager[55210]: <info>  [1764399661.3259] device (tap6f4282c7-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:01:01 np0005539504 NetworkManager[55210]: <info>  [1764399661.3271] device (tap6f4282c7-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:01:01 np0005539504 systemd-machined[153423]: New machine qemu-28-instance-00000036.
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.337 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:01 np0005539504 systemd[1]: Started Virtual Machine qemu-28-instance-00000036.
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.337 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa9ee39-54e3-4921-aa5b-1cc1ffaba61f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:01Z|00176|binding|INFO|Setting lport 6f4282c7-128e-4a36-ac72-16c3431d2be7 ovn-installed in OVS
Nov 29 02:01:01 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:01Z|00177|binding|INFO|Setting lport 6f4282c7-128e-4a36-ac72-16c3431d2be7 up in Southbound
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.345 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.368 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fa5a938e-7593-4993-8d16-287d1415ce7c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.400 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f52f0b05-29d8-42be-9f94-b302606c6c03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.409 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[71d72ca1-de42-4fb1-a4b0-91894bf37347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 NetworkManager[55210]: <info>  [1764399661.4109] manager: (tapb97f3d85-10): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.443 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[395b0b32-7b2b-409d-bd75-668c05d5c26e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.446 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0d9d0d-d555-4092-be5b-1487848eca85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 NetworkManager[55210]: <info>  [1764399661.4702] device (tapb97f3d85-10): carrier: link connected
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.476 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[02f656d7-2781-4c80-92b7-ec11211c2897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.493 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eae7e9be-dccd-495e-bc0a-8a7f3f4dad3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512435, 'reachable_time': 28877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222460, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.508 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[421e5b89-0b79-4eb2-94d6-aa0a7bc5baf6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe53:e22d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512435, 'tstamp': 512435}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222461, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.525 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1012ed39-0056-4776-a913-6e068304d73b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512435, 'reachable_time': 28877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222462, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.558 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3db14de2-a10c-4a15-8a30-a98358c80091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.632 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[183e8b5b-f682-4b7b-9f27-ab85dd24c9ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.635 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.635 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.635 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.637 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:01 np0005539504 NetworkManager[55210]: <info>  [1764399661.6384] manager: (tapb97f3d85-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Nov 29 02:01:01 np0005539504 kernel: tapb97f3d85-10: entered promiscuous mode
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.641 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.642 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.643 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:01 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:01Z|00178|binding|INFO|Releasing lport e6d6aadc-4cde-4c62-a881-70607e3666f6 from this chassis (sb_readonly=0)
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.645 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b97f3d85-11c0-4475-aea6-e8da158df42a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b97f3d85-11c0-4475-aea6-e8da158df42a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.648 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[da6036ce-029f-40bd-9791-4ba4f3587db1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.652 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-b97f3d85-11c0-4475-aea6-e8da158df42a
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/b97f3d85-11c0-4475-aea6-e8da158df42a.pid.haproxy
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID b97f3d85-11c0-4475-aea6-e8da158df42a
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:01.653 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'env', 'PROCESS_TAG=haproxy-b97f3d85-11c0-4475-aea6-e8da158df42a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b97f3d85-11c0-4475-aea6-e8da158df42a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.657 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.678 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399661.6774614, 63f8497a-eaf6-45ec-a251-92e7903aa297 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.679 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] VM Started (Lifecycle Event)#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.707 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.713 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399661.6777077, 63f8497a-eaf6-45ec-a251-92e7903aa297 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.713 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.745 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.748 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.774 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.832 187156 DEBUG nova.compute.manager [req-519849f3-0a29-4a9c-8e95-0cb193afe23e req-e3502e2d-9822-466e-9659-54911d784098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Received event network-vif-plugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.833 187156 DEBUG oslo_concurrency.lockutils [req-519849f3-0a29-4a9c-8e95-0cb193afe23e req-e3502e2d-9822-466e-9659-54911d784098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.833 187156 DEBUG oslo_concurrency.lockutils [req-519849f3-0a29-4a9c-8e95-0cb193afe23e req-e3502e2d-9822-466e-9659-54911d784098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.833 187156 DEBUG oslo_concurrency.lockutils [req-519849f3-0a29-4a9c-8e95-0cb193afe23e req-e3502e2d-9822-466e-9659-54911d784098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.833 187156 DEBUG nova.compute.manager [req-519849f3-0a29-4a9c-8e95-0cb193afe23e req-e3502e2d-9822-466e-9659-54911d784098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Processing event network-vif-plugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.834 187156 DEBUG nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.837 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399661.8377028, 63f8497a-eaf6-45ec-a251-92e7903aa297 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.838 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.839 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.843 187156 INFO nova.virt.libvirt.driver [-] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Instance spawned successfully.#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.843 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.872 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.880 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.886 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.886 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.887 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.887 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.888 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.888 187156 DEBUG nova.virt.libvirt.driver [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:01 np0005539504 nova_compute[187152]: 2025-11-29 07:01:01.988 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.055 187156 INFO nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Took 13.41 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.055 187156 DEBUG nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:02 np0005539504 podman[222512]: 2025-11-29 07:01:02.089972093 +0000 UTC m=+0.073702926 container create 82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:01:02 np0005539504 systemd[1]: Started libpod-conmon-82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f.scope.
Nov 29 02:01:02 np0005539504 podman[222512]: 2025-11-29 07:01:02.047186385 +0000 UTC m=+0.030917238 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:01:02 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:01:02 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b9e7b4fd292f341fc731243a6fb28ddf112d3f5fde438d5c4753e5efe113521/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:01:02 np0005539504 podman[222512]: 2025-11-29 07:01:02.184566294 +0000 UTC m=+0.168297147 container init 82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:01:02 np0005539504 podman[222512]: 2025-11-29 07:01:02.190923545 +0000 UTC m=+0.174654378 container start 82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.198 187156 INFO nova.compute.manager [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Took 14.39 seconds to build instance.#033[00m
Nov 29 02:01:02 np0005539504 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222527]: [NOTICE]   (222531) : New worker (222533) forked
Nov 29 02:01:02 np0005539504 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222527]: [NOTICE]   (222531) : Loading success.
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.233 187156 DEBUG oslo_concurrency.lockutils [None req-3a1409ec-448c-4b1e-b07d-6a88811cbbb1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.790 187156 DEBUG nova.network.neutron [req-ed5b4fdf-9bd6-4e01-ae54-58b71a615881 req-0c0a8f0c-83cd-48c3-bcab-ca9eb0725bde 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Updated VIF entry in instance network info cache for port 6f4282c7-128e-4a36-ac72-16c3431d2be7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.790 187156 DEBUG nova.network.neutron [req-ed5b4fdf-9bd6-4e01-ae54-58b71a615881 req-0c0a8f0c-83cd-48c3-bcab-ca9eb0725bde 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Updating instance_info_cache with network_info: [{"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.879 187156 DEBUG oslo_concurrency.lockutils [req-ed5b4fdf-9bd6-4e01-ae54-58b71a615881 req-0c0a8f0c-83cd-48c3-bcab-ca9eb0725bde 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-63f8497a-eaf6-45ec-a251-92e7903aa297" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.880 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399647.879564, 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.880 187156 INFO nova.compute.manager [-] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:01:02 np0005539504 nova_compute[187152]: 2025-11-29 07:01:02.938 187156 DEBUG nova.compute.manager [None req-a4014637-4c26-4e4b-9d9b-f67f739bb627 - - - - - -] [instance: 7f54d428-a3ac-4f2f-b5f6-fbaa715502e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:04 np0005539504 nova_compute[187152]: 2025-11-29 07:01:04.199 187156 DEBUG nova.compute.manager [req-186c9bb1-3a30-4a30-903c-0f07d4e6c8c1 req-30f09e47-c685-4abe-9319-8ec271ba4805 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Received event network-vif-plugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:04 np0005539504 nova_compute[187152]: 2025-11-29 07:01:04.200 187156 DEBUG oslo_concurrency.lockutils [req-186c9bb1-3a30-4a30-903c-0f07d4e6c8c1 req-30f09e47-c685-4abe-9319-8ec271ba4805 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:04 np0005539504 nova_compute[187152]: 2025-11-29 07:01:04.200 187156 DEBUG oslo_concurrency.lockutils [req-186c9bb1-3a30-4a30-903c-0f07d4e6c8c1 req-30f09e47-c685-4abe-9319-8ec271ba4805 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:04 np0005539504 nova_compute[187152]: 2025-11-29 07:01:04.201 187156 DEBUG oslo_concurrency.lockutils [req-186c9bb1-3a30-4a30-903c-0f07d4e6c8c1 req-30f09e47-c685-4abe-9319-8ec271ba4805 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:04 np0005539504 nova_compute[187152]: 2025-11-29 07:01:04.201 187156 DEBUG nova.compute.manager [req-186c9bb1-3a30-4a30-903c-0f07d4e6c8c1 req-30f09e47-c685-4abe-9319-8ec271ba4805 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] No waiting events found dispatching network-vif-plugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:01:04 np0005539504 nova_compute[187152]: 2025-11-29 07:01:04.201 187156 WARNING nova.compute.manager [req-186c9bb1-3a30-4a30-903c-0f07d4e6c8c1 req-30f09e47-c685-4abe-9319-8ec271ba4805 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Received unexpected event network-vif-plugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:01:04 np0005539504 nova_compute[187152]: 2025-11-29 07:01:04.515 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:04 np0005539504 podman[222542]: 2025-11-29 07:01:04.722050446 +0000 UTC m=+0.057988811 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:01:04 np0005539504 podman[222543]: 2025-11-29 07:01:04.73399966 +0000 UTC m=+0.065564567 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter)
Nov 29 02:01:04 np0005539504 nova_compute[187152]: 2025-11-29 07:01:04.930 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:09 np0005539504 nova_compute[187152]: 2025-11-29 07:01:09.517 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:09 np0005539504 podman[222587]: 2025-11-29 07:01:09.707672752 +0000 UTC m=+0.052794961 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:01:09 np0005539504 podman[222588]: 2025-11-29 07:01:09.746258756 +0000 UTC m=+0.087028647 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 29 02:01:09 np0005539504 nova_compute[187152]: 2025-11-29 07:01:09.932 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:14 np0005539504 nova_compute[187152]: 2025-11-29 07:01:14.577 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:14 np0005539504 nova_compute[187152]: 2025-11-29 07:01:14.934 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:15Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6e:5f:f9 10.100.0.5
Nov 29 02:01:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:15Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6e:5f:f9 10.100.0.5
Nov 29 02:01:16 np0005539504 podman[222648]: 2025-11-29 07:01:16.726474098 +0000 UTC m=+0.061758243 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Nov 29 02:01:19 np0005539504 nova_compute[187152]: 2025-11-29 07:01:19.578 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:19 np0005539504 nova_compute[187152]: 2025-11-29 07:01:19.937 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:21 np0005539504 podman[222668]: 2025-11-29 07:01:21.739393492 +0000 UTC m=+0.070876551 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:01:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:22.915 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:22.916 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:22.917 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:24 np0005539504 nova_compute[187152]: 2025-11-29 07:01:24.580 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:24 np0005539504 nova_compute[187152]: 2025-11-29 07:01:24.938 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:29 np0005539504 nova_compute[187152]: 2025-11-29 07:01:29.581 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:29 np0005539504 nova_compute[187152]: 2025-11-29 07:01:29.941 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.345 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.346 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.363 187156 DEBUG nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.493 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.493 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.502 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.502 187156 INFO nova.compute.claims [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.684 187156 DEBUG nova.compute.provider_tree [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.704 187156 DEBUG nova.scheduler.client.report [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.729 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.731 187156 DEBUG nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:01:31 np0005539504 podman[222687]: 2025-11-29 07:01:31.747232066 +0000 UTC m=+0.090490832 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.805 187156 DEBUG nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.806 187156 DEBUG nova.network.neutron [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.826 187156 INFO nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.850 187156 DEBUG nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.993 187156 DEBUG nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.996 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:01:31 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.998 187156 INFO nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Creating image(s)#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:31.999 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "/var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.000 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.001 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.022 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.090 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.092 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.094 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.114 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.173 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.175 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.212 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.213 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.214 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.265 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.266 187156 DEBUG nova.virt.disk.api [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Checking if we can resize image /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.267 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.322 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.323 187156 DEBUG nova.virt.disk.api [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Cannot resize image /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.324 187156 DEBUG nova.objects.instance [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'migration_context' on Instance uuid 8f92b94f-11a8-44de-b605-397f29484586 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.341 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.341 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Ensure instance console log exists: /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.342 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.342 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.343 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:32 np0005539504 nova_compute[187152]: 2025-11-29 07:01:32.968 187156 DEBUG nova.policy [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:01:34 np0005539504 nova_compute[187152]: 2025-11-29 07:01:34.584 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:34 np0005539504 nova_compute[187152]: 2025-11-29 07:01:34.943 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:35 np0005539504 podman[222721]: 2025-11-29 07:01:35.707528938 +0000 UTC m=+0.051613595 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:01:35 np0005539504 podman[222722]: 2025-11-29 07:01:35.714809115 +0000 UTC m=+0.054622997 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Nov 29 02:01:35 np0005539504 nova_compute[187152]: 2025-11-29 07:01:35.935 187156 DEBUG nova.network.neutron [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Successfully created port: 8b71ee8e-ab95-47c7-a203-015aac168d4f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:01:38 np0005539504 nova_compute[187152]: 2025-11-29 07:01:38.103 187156 DEBUG nova.network.neutron [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Successfully updated port: 8b71ee8e-ab95-47c7-a203-015aac168d4f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:01:38 np0005539504 nova_compute[187152]: 2025-11-29 07:01:38.120 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:01:38 np0005539504 nova_compute[187152]: 2025-11-29 07:01:38.120 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquired lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:01:38 np0005539504 nova_compute[187152]: 2025-11-29 07:01:38.121 187156 DEBUG nova.network.neutron [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:01:38 np0005539504 nova_compute[187152]: 2025-11-29 07:01:38.893 187156 DEBUG nova.network.neutron [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:01:39 np0005539504 nova_compute[187152]: 2025-11-29 07:01:39.059 187156 DEBUG nova.compute.manager [req-3bf960f7-d178-4bf5-9ff7-486afb9721dd req-cc79ba6a-a9b4-47c7-b5b6-cc0be3e47290 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-changed-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:39 np0005539504 nova_compute[187152]: 2025-11-29 07:01:39.059 187156 DEBUG nova.compute.manager [req-3bf960f7-d178-4bf5-9ff7-486afb9721dd req-cc79ba6a-a9b4-47c7-b5b6-cc0be3e47290 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Refreshing instance network info cache due to event network-changed-8b71ee8e-ab95-47c7-a203-015aac168d4f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:01:39 np0005539504 nova_compute[187152]: 2025-11-29 07:01:39.059 187156 DEBUG oslo_concurrency.lockutils [req-3bf960f7-d178-4bf5-9ff7-486afb9721dd req-cc79ba6a-a9b4-47c7-b5b6-cc0be3e47290 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:01:39 np0005539504 nova_compute[187152]: 2025-11-29 07:01:39.586 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:39 np0005539504 nova_compute[187152]: 2025-11-29 07:01:39.946 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.163 187156 DEBUG nova.network.neutron [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Updating instance_info_cache with network_info: [{"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.188 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Releasing lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.188 187156 DEBUG nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Instance network_info: |[{"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.189 187156 DEBUG oslo_concurrency.lockutils [req-3bf960f7-d178-4bf5-9ff7-486afb9721dd req-cc79ba6a-a9b4-47c7-b5b6-cc0be3e47290 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.190 187156 DEBUG nova.network.neutron [req-3bf960f7-d178-4bf5-9ff7-486afb9721dd req-cc79ba6a-a9b4-47c7-b5b6-cc0be3e47290 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Refreshing network info cache for port 8b71ee8e-ab95-47c7-a203-015aac168d4f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.195 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Start _get_guest_xml network_info=[{"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.203 187156 WARNING nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.211 187156 DEBUG nova.virt.libvirt.host [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.213 187156 DEBUG nova.virt.libvirt.host [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.223 187156 DEBUG nova.virt.libvirt.host [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.223 187156 DEBUG nova.virt.libvirt.host [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.225 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.225 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.225 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.225 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.226 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.226 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.226 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.226 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.227 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.227 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.227 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.227 187156 DEBUG nova.virt.hardware [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.232 187156 DEBUG nova.virt.libvirt.vif [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1913994908',display_name='tempest-ServersAdminTestJSON-server-1913994908',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1913994908',id=58,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-nnaudwbx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-10
87744064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:31Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=8f92b94f-11a8-44de-b605-397f29484586,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.233 187156 DEBUG nova.network.os_vif_util [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.233 187156 DEBUG nova.network.os_vif_util [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:74:16,bridge_name='br-int',has_traffic_filtering=True,id=8b71ee8e-ab95-47c7-a203-015aac168d4f,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b71ee8e-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.234 187156 DEBUG nova.objects.instance [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8f92b94f-11a8-44de-b605-397f29484586 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.250 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <uuid>8f92b94f-11a8-44de-b605-397f29484586</uuid>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <name>instance-0000003a</name>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersAdminTestJSON-server-1913994908</nova:name>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:01:40</nova:creationTime>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:        <nova:user uuid="cd616d4c2eb44fe0a0da2df1690c0e21">tempest-ServersAdminTestJSON-1087744064-project-member</nova:user>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:        <nova:project uuid="80b4126e17a14d73b40158a57f19d091">tempest-ServersAdminTestJSON-1087744064</nova:project>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:        <nova:port uuid="8b71ee8e-ab95-47c7-a203-015aac168d4f">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <entry name="serial">8f92b94f-11a8-44de-b605-397f29484586</entry>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <entry name="uuid">8f92b94f-11a8-44de-b605-397f29484586</entry>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk.config"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:47:74:16"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <target dev="tap8b71ee8e-ab"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/console.log" append="off"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:01:40 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:01:40 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:01:40 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:01:40 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.252 187156 DEBUG nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Preparing to wait for external event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.252 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.253 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.253 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.254 187156 DEBUG nova.virt.libvirt.vif [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1913994908',display_name='tempest-ServersAdminTestJSON-server-1913994908',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1913994908',id=58,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-nnaudwbx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminT
estJSON-1087744064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:31Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=8f92b94f-11a8-44de-b605-397f29484586,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.254 187156 DEBUG nova.network.os_vif_util [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.255 187156 DEBUG nova.network.os_vif_util [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:74:16,bridge_name='br-int',has_traffic_filtering=True,id=8b71ee8e-ab95-47c7-a203-015aac168d4f,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b71ee8e-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.256 187156 DEBUG os_vif [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:74:16,bridge_name='br-int',has_traffic_filtering=True,id=8b71ee8e-ab95-47c7-a203-015aac168d4f,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b71ee8e-ab') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.256 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.257 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.257 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.262 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.262 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b71ee8e-ab, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.263 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8b71ee8e-ab, col_values=(('external_ids', {'iface-id': '8b71ee8e-ab95-47c7-a203-015aac168d4f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:74:16', 'vm-uuid': '8f92b94f-11a8-44de-b605-397f29484586'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.264 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:40 np0005539504 NetworkManager[55210]: <info>  [1764399700.2659] manager: (tap8b71ee8e-ab): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.266 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.274 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.275 187156 INFO os_vif [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:74:16,bridge_name='br-int',has_traffic_filtering=True,id=8b71ee8e-ab95-47c7-a203-015aac168d4f,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b71ee8e-ab')
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.335 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.335 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.336 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No VIF found with MAC fa:16:3e:47:74:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 02:01:40 np0005539504 nova_compute[187152]: 2025-11-29 07:01:40.336 187156 INFO nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Using config drive
Nov 29 02:01:40 np0005539504 podman[222768]: 2025-11-29 07:01:40.390702574 +0000 UTC m=+0.077160485 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:01:40 np0005539504 podman[222767]: 2025-11-29 07:01:40.397171209 +0000 UTC m=+0.086350794 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:01:41 np0005539504 nova_compute[187152]: 2025-11-29 07:01:41.089 187156 INFO nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Creating config drive at /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk.config
Nov 29 02:01:41 np0005539504 nova_compute[187152]: 2025-11-29 07:01:41.094 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3dwjklyo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:01:41 np0005539504 nova_compute[187152]: 2025-11-29 07:01:41.220 187156 DEBUG oslo_concurrency.processutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3dwjklyo" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:01:41 np0005539504 NetworkManager[55210]: <info>  [1764399701.2884] manager: (tap8b71ee8e-ab): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Nov 29 02:01:41 np0005539504 kernel: tap8b71ee8e-ab: entered promiscuous mode
Nov 29 02:01:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:41Z|00179|binding|INFO|Claiming lport 8b71ee8e-ab95-47c7-a203-015aac168d4f for this chassis.
Nov 29 02:01:41 np0005539504 nova_compute[187152]: 2025-11-29 07:01:41.318 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:41Z|00180|binding|INFO|8b71ee8e-ab95-47c7-a203-015aac168d4f: Claiming fa:16:3e:47:74:16 10.100.0.6
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.329 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:74:16 10.100.0.6'], port_security=['fa:16:3e:47:74:16 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8f92b94f-11a8-44de-b605-397f29484586', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '2', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=8b71ee8e-ab95-47c7-a203-015aac168d4f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.331 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 8b71ee8e-ab95-47c7-a203-015aac168d4f in datapath b97f3d85-11c0-4475-aea6-e8da158df42a bound to our chassis
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.333 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a
Nov 29 02:01:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:41Z|00181|binding|INFO|Setting lport 8b71ee8e-ab95-47c7-a203-015aac168d4f ovn-installed in OVS
Nov 29 02:01:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:41Z|00182|binding|INFO|Setting lport 8b71ee8e-ab95-47c7-a203-015aac168d4f up in Southbound
Nov 29 02:01:41 np0005539504 nova_compute[187152]: 2025-11-29 07:01:41.335 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:41 np0005539504 systemd-udevd[222834]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.354 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[86ae5074-a84f-4efe-a03e-4656b1f4ca89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:01:41 np0005539504 systemd-machined[153423]: New machine qemu-29-instance-0000003a.
Nov 29 02:01:41 np0005539504 NetworkManager[55210]: <info>  [1764399701.3661] device (tap8b71ee8e-ab): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:01:41 np0005539504 NetworkManager[55210]: <info>  [1764399701.3677] device (tap8b71ee8e-ab): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:01:41 np0005539504 systemd[1]: Started Virtual Machine qemu-29-instance-0000003a.
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.392 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e8fd3302-ba4e-49e7-8112-dc884b5e6550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.396 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[87c42b81-c56d-4358-9316-b0d991284be1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.422 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[31fed0ee-1f72-4449-a3d0-6cb3dcd99ba3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.440 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[912afb63-9f1c-4d0e-9ec4-caf56c4578f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512435, 'reachable_time': 28877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222847, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.458 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd26d82-be1b-4425-b832-a2ecccbe1bf7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512447, 'tstamp': 512447}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222848, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512450, 'tstamp': 512450}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222848, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.460 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:41 np0005539504 nova_compute[187152]: 2025-11-29 07:01:41.462 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:41 np0005539504 nova_compute[187152]: 2025-11-29 07:01:41.463 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.465 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.465 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.466 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:41.466 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.177 187156 DEBUG nova.compute.manager [req-7a2d9b31-cca9-4995-812e-7a1099fb3278 req-d16f53d0-dee1-4f65-b5a4-a60063b5cce3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.179 187156 DEBUG oslo_concurrency.lockutils [req-7a2d9b31-cca9-4995-812e-7a1099fb3278 req-d16f53d0-dee1-4f65-b5a4-a60063b5cce3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.179 187156 DEBUG oslo_concurrency.lockutils [req-7a2d9b31-cca9-4995-812e-7a1099fb3278 req-d16f53d0-dee1-4f65-b5a4-a60063b5cce3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.179 187156 DEBUG oslo_concurrency.lockutils [req-7a2d9b31-cca9-4995-812e-7a1099fb3278 req-d16f53d0-dee1-4f65-b5a4-a60063b5cce3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.180 187156 DEBUG nova.compute.manager [req-7a2d9b31-cca9-4995-812e-7a1099fb3278 req-d16f53d0-dee1-4f65-b5a4-a60063b5cce3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Processing event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.765 187156 DEBUG nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.766 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399702.765102, 8f92b94f-11a8-44de-b605-397f29484586 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.767 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] VM Started (Lifecycle Event)#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.770 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.773 187156 INFO nova.virt.libvirt.driver [-] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Instance spawned successfully.#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.774 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.792 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.799 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.805 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.805 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.806 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.806 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.807 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.807 187156 DEBUG nova.virt.libvirt.driver [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.843 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.843 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399702.765253, 8f92b94f-11a8-44de-b605-397f29484586 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.844 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.875 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.879 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399702.7694473, 8f92b94f-11a8-44de-b605-397f29484586 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.879 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.887 187156 DEBUG nova.network.neutron [req-3bf960f7-d178-4bf5-9ff7-486afb9721dd req-cc79ba6a-a9b4-47c7-b5b6-cc0be3e47290 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Updated VIF entry in instance network info cache for port 8b71ee8e-ab95-47c7-a203-015aac168d4f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.888 187156 DEBUG nova.network.neutron [req-3bf960f7-d178-4bf5-9ff7-486afb9721dd req-cc79ba6a-a9b4-47c7-b5b6-cc0be3e47290 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Updating instance_info_cache with network_info: [{"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.910 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.912 187156 DEBUG oslo_concurrency.lockutils [req-3bf960f7-d178-4bf5-9ff7-486afb9721dd req-cc79ba6a-a9b4-47c7-b5b6-cc0be3e47290 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.914 187156 INFO nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Took 10.92 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.914 187156 DEBUG nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.916 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:42 np0005539504 nova_compute[187152]: 2025-11-29 07:01:42.948 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:01:43 np0005539504 nova_compute[187152]: 2025-11-29 07:01:43.172 187156 INFO nova.compute.manager [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Took 11.74 seconds to build instance.#033[00m
Nov 29 02:01:43 np0005539504 nova_compute[187152]: 2025-11-29 07:01:43.208 187156 DEBUG oslo_concurrency.lockutils [None req-f865e68e-2a69-4eef-9a4e-0b139bb28a7c cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.862s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:43.519 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:01:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:43.521 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:01:43 np0005539504 nova_compute[187152]: 2025-11-29 07:01:43.562 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:43 np0005539504 nova_compute[187152]: 2025-11-29 07:01:43.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:44 np0005539504 nova_compute[187152]: 2025-11-29 07:01:44.301 187156 DEBUG nova.compute.manager [req-4f9c5ea2-0213-4ca5-a5fc-09f80cd6b589 req-9ddf567a-c183-4610-9c6e-f735b7b5a061 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:44 np0005539504 nova_compute[187152]: 2025-11-29 07:01:44.302 187156 DEBUG oslo_concurrency.lockutils [req-4f9c5ea2-0213-4ca5-a5fc-09f80cd6b589 req-9ddf567a-c183-4610-9c6e-f735b7b5a061 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:44 np0005539504 nova_compute[187152]: 2025-11-29 07:01:44.302 187156 DEBUG oslo_concurrency.lockutils [req-4f9c5ea2-0213-4ca5-a5fc-09f80cd6b589 req-9ddf567a-c183-4610-9c6e-f735b7b5a061 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:44 np0005539504 nova_compute[187152]: 2025-11-29 07:01:44.303 187156 DEBUG oslo_concurrency.lockutils [req-4f9c5ea2-0213-4ca5-a5fc-09f80cd6b589 req-9ddf567a-c183-4610-9c6e-f735b7b5a061 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:44 np0005539504 nova_compute[187152]: 2025-11-29 07:01:44.303 187156 DEBUG nova.compute.manager [req-4f9c5ea2-0213-4ca5-a5fc-09f80cd6b589 req-9ddf567a-c183-4610-9c6e-f735b7b5a061 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] No waiting events found dispatching network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:01:44 np0005539504 nova_compute[187152]: 2025-11-29 07:01:44.303 187156 WARNING nova.compute.manager [req-4f9c5ea2-0213-4ca5-a5fc-09f80cd6b589 req-9ddf567a-c183-4610-9c6e-f735b7b5a061 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received unexpected event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f for instance with vm_state active and task_state None.#033[00m
Nov 29 02:01:44 np0005539504 nova_compute[187152]: 2025-11-29 07:01:44.587 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:45 np0005539504 nova_compute[187152]: 2025-11-29 07:01:45.088 187156 DEBUG oslo_concurrency.lockutils [None req-65f489ae-cf60-40bf-a110-ca217fdb8281 b2f9d3dbf43b4ee094f4f19650a02efb 4805815c23534f53b2d27c447c3dc15e - - default default] Acquiring lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:01:45 np0005539504 nova_compute[187152]: 2025-11-29 07:01:45.089 187156 DEBUG oslo_concurrency.lockutils [None req-65f489ae-cf60-40bf-a110-ca217fdb8281 b2f9d3dbf43b4ee094f4f19650a02efb 4805815c23534f53b2d27c447c3dc15e - - default default] Acquired lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:01:45 np0005539504 nova_compute[187152]: 2025-11-29 07:01:45.089 187156 DEBUG nova.network.neutron [None req-65f489ae-cf60-40bf-a110-ca217fdb8281 b2f9d3dbf43b4ee094f4f19650a02efb 4805815c23534f53b2d27c447c3dc15e - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:01:45 np0005539504 nova_compute[187152]: 2025-11-29 07:01:45.265 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:45 np0005539504 nova_compute[187152]: 2025-11-29 07:01:45.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:45 np0005539504 nova_compute[187152]: 2025-11-29 07:01:45.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:45 np0005539504 nova_compute[187152]: 2025-11-29 07:01:45.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:01:46.523 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:47 np0005539504 podman[222857]: 2025-11-29 07:01:47.737416969 +0000 UTC m=+0.083317841 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 29 02:01:47 np0005539504 nova_compute[187152]: 2025-11-29 07:01:47.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:48 np0005539504 nova_compute[187152]: 2025-11-29 07:01:48.407 187156 DEBUG nova.network.neutron [None req-65f489ae-cf60-40bf-a110-ca217fdb8281 b2f9d3dbf43b4ee094f4f19650a02efb 4805815c23534f53b2d27c447c3dc15e - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Updating instance_info_cache with network_info: [{"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:48 np0005539504 nova_compute[187152]: 2025-11-29 07:01:48.481 187156 DEBUG oslo_concurrency.lockutils [None req-65f489ae-cf60-40bf-a110-ca217fdb8281 b2f9d3dbf43b4ee094f4f19650a02efb 4805815c23534f53b2d27c447c3dc15e - - default default] Releasing lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:01:48 np0005539504 nova_compute[187152]: 2025-11-29 07:01:48.482 187156 DEBUG nova.compute.manager [None req-65f489ae-cf60-40bf-a110-ca217fdb8281 b2f9d3dbf43b4ee094f4f19650a02efb 4805815c23534f53b2d27c447c3dc15e - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Nov 29 02:01:48 np0005539504 nova_compute[187152]: 2025-11-29 07:01:48.482 187156 DEBUG nova.compute.manager [None req-65f489ae-cf60-40bf-a110-ca217fdb8281 b2f9d3dbf43b4ee094f4f19650a02efb 4805815c23534f53b2d27c447c3dc15e - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] network_info to inject: |[{"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Nov 29 02:01:49 np0005539504 nova_compute[187152]: 2025-11-29 07:01:49.590 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:50 np0005539504 nova_compute[187152]: 2025-11-29 07:01:50.005 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:50 np0005539504 nova_compute[187152]: 2025-11-29 07:01:50.006 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:01:50 np0005539504 nova_compute[187152]: 2025-11-29 07:01:50.006 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:01:50 np0005539504 nova_compute[187152]: 2025-11-29 07:01:50.267 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:50 np0005539504 nova_compute[187152]: 2025-11-29 07:01:50.296 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-63f8497a-eaf6-45ec-a251-92e7903aa297" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:01:50 np0005539504 nova_compute[187152]: 2025-11-29 07:01:50.297 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-63f8497a-eaf6-45ec-a251-92e7903aa297" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:01:50 np0005539504 nova_compute[187152]: 2025-11-29 07:01:50.297 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:01:50 np0005539504 nova_compute[187152]: 2025-11-29 07:01:50.297 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 63f8497a-eaf6-45ec-a251-92e7903aa297 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:51 np0005539504 nova_compute[187152]: 2025-11-29 07:01:51.822 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Updating instance_info_cache with network_info: [{"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:51 np0005539504 nova_compute[187152]: 2025-11-29 07:01:51.855 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-63f8497a-eaf6-45ec-a251-92e7903aa297" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:01:51 np0005539504 nova_compute[187152]: 2025-11-29 07:01:51.855 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:01:51 np0005539504 nova_compute[187152]: 2025-11-29 07:01:51.856 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:51 np0005539504 nova_compute[187152]: 2025-11-29 07:01:51.857 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:01:51 np0005539504 nova_compute[187152]: 2025-11-29 07:01:51.952 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:51 np0005539504 nova_compute[187152]: 2025-11-29 07:01:51.953 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:01:52 np0005539504 podman[222879]: 2025-11-29 07:01:52.780343244 +0000 UTC m=+0.089579352 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:01:52 np0005539504 nova_compute[187152]: 2025-11-29 07:01:52.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.025 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.025 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.026 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.026 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.111 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.179 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.181 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.252 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.260 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.334 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.335 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.402 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.596 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.598 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5431MB free_disk=73.17096328735352GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.599 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.599 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.926 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 63f8497a-eaf6-45ec-a251-92e7903aa297 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.927 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 8f92b94f-11a8-44de-b605-397f29484586 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.927 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:01:53 np0005539504 nova_compute[187152]: 2025-11-29 07:01:53.927 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:01:54 np0005539504 nova_compute[187152]: 2025-11-29 07:01:54.429 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:01:54 np0005539504 nova_compute[187152]: 2025-11-29 07:01:54.447 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:01:54 np0005539504 nova_compute[187152]: 2025-11-29 07:01:54.471 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:01:54 np0005539504 nova_compute[187152]: 2025-11-29 07:01:54.471 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.872s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:54 np0005539504 nova_compute[187152]: 2025-11-29 07:01:54.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:54 np0005539504 nova_compute[187152]: 2025-11-29 07:01:54.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:54 np0005539504 nova_compute[187152]: 2025-11-29 07:01:54.970 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:54 np0005539504 nova_compute[187152]: 2025-11-29 07:01:54.971 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:54 np0005539504 nova_compute[187152]: 2025-11-29 07:01:54.971 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:01:55 np0005539504 nova_compute[187152]: 2025-11-29 07:01:55.096 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:01:55 np0005539504 nova_compute[187152]: 2025-11-29 07:01:55.271 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.142 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.142 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.158 187156 DEBUG nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.261 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.262 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.267 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.268 187156 INFO nova.compute.claims [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.455 187156 DEBUG nova.compute.provider_tree [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.471 187156 DEBUG nova.scheduler.client.report [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.495 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.496 187156 DEBUG nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.971 187156 DEBUG nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:01:56 np0005539504 nova_compute[187152]: 2025-11-29 07:01:56.971 187156 DEBUG nova.network.neutron [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:01:57 np0005539504 nova_compute[187152]: 2025-11-29 07:01:57.176 187156 INFO nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:01:57 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:57Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:47:74:16 10.100.0.6
Nov 29 02:01:57 np0005539504 ovn_controller[95182]: 2025-11-29T07:01:57Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:74:16 10.100.0.6
Nov 29 02:01:57 np0005539504 nova_compute[187152]: 2025-11-29 07:01:57.340 187156 DEBUG nova.policy [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:01:57 np0005539504 nova_compute[187152]: 2025-11-29 07:01:57.408 187156 DEBUG nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.011 187156 DEBUG nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.012 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.012 187156 INFO nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Creating image(s)#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.013 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.013 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.014 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.026 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.096 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.098 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.099 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.120 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.178 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.179 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.645 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.878 187156 DEBUG nova.network.neutron [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Successfully created port: 414eaffc-4b41-4a38-b224-37a122d319f3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.882 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk 1073741824" returned: 0 in 0.703s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.883 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.883 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.943 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.944 187156 DEBUG nova.virt.disk.api [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Checking if we can resize image /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:01:59 np0005539504 nova_compute[187152]: 2025-11-29 07:01:59.945 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:00 np0005539504 nova_compute[187152]: 2025-11-29 07:02:00.008 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:00 np0005539504 nova_compute[187152]: 2025-11-29 07:02:00.009 187156 DEBUG nova.virt.disk.api [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Cannot resize image /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:02:00 np0005539504 nova_compute[187152]: 2025-11-29 07:02:00.009 187156 DEBUG nova.objects.instance [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'migration_context' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:00 np0005539504 nova_compute[187152]: 2025-11-29 07:02:00.135 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:02:00 np0005539504 nova_compute[187152]: 2025-11-29 07:02:00.135 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Ensure instance console log exists: /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:02:00 np0005539504 nova_compute[187152]: 2025-11-29 07:02:00.136 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:00 np0005539504 nova_compute[187152]: 2025-11-29 07:02:00.136 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:00 np0005539504 nova_compute[187152]: 2025-11-29 07:02:00.136 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:00 np0005539504 nova_compute[187152]: 2025-11-29 07:02:00.272 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:02 np0005539504 podman[222953]: 2025-11-29 07:02:02.733810656 +0000 UTC m=+0.060017993 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:02:04 np0005539504 nova_compute[187152]: 2025-11-29 07:02:04.140 187156 DEBUG nova.network.neutron [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Successfully updated port: 414eaffc-4b41-4a38-b224-37a122d319f3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:02:04 np0005539504 nova_compute[187152]: 2025-11-29 07:02:04.169 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:02:04 np0005539504 nova_compute[187152]: 2025-11-29 07:02:04.169 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquired lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:02:04 np0005539504 nova_compute[187152]: 2025-11-29 07:02:04.169 187156 DEBUG nova.network.neutron [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:02:04 np0005539504 nova_compute[187152]: 2025-11-29 07:02:04.254 187156 DEBUG nova.compute.manager [req-089d99eb-b35d-4f72-8efb-eda0cef0823a req-4c281253-2f51-4e35-a0a3-1ea2260d984b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:04 np0005539504 nova_compute[187152]: 2025-11-29 07:02:04.254 187156 DEBUG nova.compute.manager [req-089d99eb-b35d-4f72-8efb-eda0cef0823a req-4c281253-2f51-4e35-a0a3-1ea2260d984b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing instance network info cache due to event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:02:04 np0005539504 nova_compute[187152]: 2025-11-29 07:02:04.254 187156 DEBUG oslo_concurrency.lockutils [req-089d99eb-b35d-4f72-8efb-eda0cef0823a req-4c281253-2f51-4e35-a0a3-1ea2260d984b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:02:04 np0005539504 nova_compute[187152]: 2025-11-29 07:02:04.647 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:04 np0005539504 nova_compute[187152]: 2025-11-29 07:02:04.861 187156 DEBUG nova.network.neutron [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:02:05 np0005539504 nova_compute[187152]: 2025-11-29 07:02:05.275 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:06 np0005539504 podman[222972]: 2025-11-29 07:02:06.7248222 +0000 UTC m=+0.060132645 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:02:06 np0005539504 podman[222973]: 2025-11-29 07:02:06.734487392 +0000 UTC m=+0.065127250 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6)
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.026 187156 DEBUG nova.network.neutron [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updating instance_info_cache with network_info: [{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.264 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Releasing lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.264 187156 DEBUG nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance network_info: |[{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.264 187156 DEBUG oslo_concurrency.lockutils [req-089d99eb-b35d-4f72-8efb-eda0cef0823a req-4c281253-2f51-4e35-a0a3-1ea2260d984b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.265 187156 DEBUG nova.network.neutron [req-089d99eb-b35d-4f72-8efb-eda0cef0823a req-4c281253-2f51-4e35-a0a3-1ea2260d984b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.267 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Start _get_guest_xml network_info=[{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.271 187156 WARNING nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.278 187156 DEBUG nova.virt.libvirt.host [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.279 187156 DEBUG nova.virt.libvirt.host [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.282 187156 DEBUG nova.virt.libvirt.host [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.282 187156 DEBUG nova.virt.libvirt.host [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.284 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.284 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.284 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.284 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.284 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.285 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.285 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.285 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.285 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.285 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.285 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.286 187156 DEBUG nova.virt.hardware [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.289 187156 DEBUG nova.virt.libvirt.vif [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-108965523',display_name='tempest-ServerRescueTestJSONUnderV235-server-108965523',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-108965523',id=60,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='633b56d013a1402db852e38aa4180fda',ramdisk_id='',reservation_id='r-9sepmc2z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-433902945',owner_user_name=
'tempest-ServerRescueTestJSONUnderV235-433902945-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:57Z,user_data=None,user_id='6bde44f29f11433fbe4f716d75f87b96',uuid=5f13f55a-3763-4482-9248-7c3a4cf2e40d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.289 187156 DEBUG nova.network.os_vif_util [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Converting VIF {"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.290 187156 DEBUG nova.network.os_vif_util [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=414eaffc-4b41-4a38-b224-37a122d319f3,network=Network(6a037afc-c347-4059-964e-c318838fb9e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap414eaffc-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:02:07 np0005539504 nova_compute[187152]: 2025-11-29 07:02:07.290 187156 DEBUG nova.objects.instance [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.091 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <uuid>5f13f55a-3763-4482-9248-7c3a4cf2e40d</uuid>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <name>instance-0000003c</name>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-108965523</nova:name>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:02:07</nova:creationTime>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:        <nova:user uuid="6bde44f29f11433fbe4f716d75f87b96">tempest-ServerRescueTestJSONUnderV235-433902945-project-member</nova:user>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:        <nova:project uuid="633b56d013a1402db852e38aa4180fda">tempest-ServerRescueTestJSONUnderV235-433902945</nova:project>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:        <nova:port uuid="414eaffc-4b41-4a38-b224-37a122d319f3">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <entry name="serial">5f13f55a-3763-4482-9248-7c3a4cf2e40d</entry>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <entry name="uuid">5f13f55a-3763-4482-9248-7c3a4cf2e40d</entry>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.config"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:64:4a:7f"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <target dev="tap414eaffc-4b"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/console.log" append="off"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:02:08 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:02:08 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:02:08 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:02:08 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.091 187156 DEBUG nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Preparing to wait for external event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.092 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.092 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.092 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.093 187156 DEBUG nova.virt.libvirt.vif [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-108965523',display_name='tempest-ServerRescueTestJSONUnderV235-server-108965523',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-108965523',id=60,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='633b56d013a1402db852e38aa4180fda',ramdisk_id='',reservation_id='r-9sepmc2z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-433902945',owner_
user_name='tempest-ServerRescueTestJSONUnderV235-433902945-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:57Z,user_data=None,user_id='6bde44f29f11433fbe4f716d75f87b96',uuid=5f13f55a-3763-4482-9248-7c3a4cf2e40d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.093 187156 DEBUG nova.network.os_vif_util [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Converting VIF {"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.093 187156 DEBUG nova.network.os_vif_util [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=414eaffc-4b41-4a38-b224-37a122d319f3,network=Network(6a037afc-c347-4059-964e-c318838fb9e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap414eaffc-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.094 187156 DEBUG os_vif [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=414eaffc-4b41-4a38-b224-37a122d319f3,network=Network(6a037afc-c347-4059-964e-c318838fb9e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap414eaffc-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.095 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.095 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.096 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.098 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.099 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap414eaffc-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.099 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap414eaffc-4b, col_values=(('external_ids', {'iface-id': '414eaffc-4b41-4a38-b224-37a122d319f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:4a:7f', 'vm-uuid': '5f13f55a-3763-4482-9248-7c3a4cf2e40d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.100 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:02:08 np0005539504 NetworkManager[55210]: <info>  [1764399728.1019] manager: (tap414eaffc-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.104 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.109 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:02:08 np0005539504 nova_compute[187152]: 2025-11-29 07:02:08.110 187156 INFO os_vif [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=414eaffc-4b41-4a38-b224-37a122d319f3,network=Network(6a037afc-c347-4059-964e-c318838fb9e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap414eaffc-4b')
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.205 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.206 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.206 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] No VIF found with MAC fa:16:3e:64:4a:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.207 187156 INFO nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Using config drive
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.348 187156 DEBUG nova.network.neutron [req-089d99eb-b35d-4f72-8efb-eda0cef0823a req-4c281253-2f51-4e35-a0a3-1ea2260d984b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updated VIF entry in instance network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.349 187156 DEBUG nova.network.neutron [req-089d99eb-b35d-4f72-8efb-eda0cef0823a req-4c281253-2f51-4e35-a0a3-1ea2260d984b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updating instance_info_cache with network_info: [{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.424 187156 DEBUG oslo_concurrency.lockutils [req-089d99eb-b35d-4f72-8efb-eda0cef0823a req-4c281253-2f51-4e35-a0a3-1ea2260d984b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.649 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.767 187156 INFO nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Creating config drive at /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.config
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.772 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvzkg_opi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.897 187156 DEBUG oslo_concurrency.processutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvzkg_opi" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:02:09 np0005539504 NetworkManager[55210]: <info>  [1764399729.9520] manager: (tap414eaffc-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Nov 29 02:02:09 np0005539504 kernel: tap414eaffc-4b: entered promiscuous mode
Nov 29 02:02:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:09Z|00183|binding|INFO|Claiming lport 414eaffc-4b41-4a38-b224-37a122d319f3 for this chassis.
Nov 29 02:02:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:09Z|00184|binding|INFO|414eaffc-4b41-4a38-b224-37a122d319f3: Claiming fa:16:3e:64:4a:7f 10.100.0.9
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.954 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:02:09 np0005539504 nova_compute[187152]: 2025-11-29 07:02:09.957 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:02:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:09.968 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:4a:7f 10.100.0.9'], port_security=['fa:16:3e:64:4a:7f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a037afc-c347-4059-964e-c318838fb9e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '633b56d013a1402db852e38aa4180fda', 'neutron:revision_number': '2', 'neutron:security_group_ids': '28fbae92-3ce1-46eb-a136-c9b63071c210', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=464f2dc0-d054-4bcf-a076-ac8d2e495428, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=414eaffc-4b41-4a38-b224-37a122d319f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:02:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:09.969 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 414eaffc-4b41-4a38-b224-37a122d319f3 in datapath 6a037afc-c347-4059-964e-c318838fb9e7 bound to our chassis
Nov 29 02:02:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:09.971 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6a037afc-c347-4059-964e-c318838fb9e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 29 02:02:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:09.973 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d6064442-6311-4fda-8777-df7e918d7523]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:02:09 np0005539504 systemd-udevd[223034]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:02:09 np0005539504 systemd-machined[153423]: New machine qemu-30-instance-0000003c.
Nov 29 02:02:10 np0005539504 NetworkManager[55210]: <info>  [1764399730.0030] device (tap414eaffc-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:02:10 np0005539504 NetworkManager[55210]: <info>  [1764399730.0039] device (tap414eaffc-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:02:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:10Z|00185|binding|INFO|Setting lport 414eaffc-4b41-4a38-b224-37a122d319f3 ovn-installed in OVS
Nov 29 02:02:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:10Z|00186|binding|INFO|Setting lport 414eaffc-4b41-4a38-b224-37a122d319f3 up in Southbound
Nov 29 02:02:10 np0005539504 systemd[1]: Started Virtual Machine qemu-30-instance-0000003c.
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.058 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.290 187156 DEBUG nova.compute.manager [req-bfb0e52d-d6cf-46c9-ae60-9fbd79f604e2 req-6041faf9-6f76-4b80-ba47-1e3ec2b7121f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.290 187156 DEBUG oslo_concurrency.lockutils [req-bfb0e52d-d6cf-46c9-ae60-9fbd79f604e2 req-6041faf9-6f76-4b80-ba47-1e3ec2b7121f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.291 187156 DEBUG oslo_concurrency.lockutils [req-bfb0e52d-d6cf-46c9-ae60-9fbd79f604e2 req-6041faf9-6f76-4b80-ba47-1e3ec2b7121f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.291 187156 DEBUG oslo_concurrency.lockutils [req-bfb0e52d-d6cf-46c9-ae60-9fbd79f604e2 req-6041faf9-6f76-4b80-ba47-1e3ec2b7121f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.292 187156 DEBUG nova.compute.manager [req-bfb0e52d-d6cf-46c9-ae60-9fbd79f604e2 req-6041faf9-6f76-4b80-ba47-1e3ec2b7121f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Processing event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.392 187156 DEBUG nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.393 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399730.391714, 5f13f55a-3763-4482-9248-7c3a4cf2e40d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.393 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] VM Started (Lifecycle Event)
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.397 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.402 187156 INFO nova.virt.libvirt.driver [-] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance spawned successfully.
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.402 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.426 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.431 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.434 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.434 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.435 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.435 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.436 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.436 187156 DEBUG nova.virt.libvirt.driver [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.473 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.473 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399730.3963346, 5f13f55a-3763-4482-9248-7c3a4cf2e40d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.474 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] VM Paused (Lifecycle Event)
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.509 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.513 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399730.3967505, 5f13f55a-3763-4482-9248-7c3a4cf2e40d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.514 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] VM Resumed (Lifecycle Event)
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.545 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.546 187156 INFO nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Took 11.53 seconds to spawn the instance on the hypervisor.
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.546 187156 DEBUG nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.551 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.578 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.640 187156 INFO nova.compute.manager [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Took 14.42 seconds to build instance.
Nov 29 02:02:10 np0005539504 nova_compute[187152]: 2025-11-29 07:02:10.662 187156 DEBUG oslo_concurrency.lockutils [None req-f967dab8-da33-4301-82d8-e0f1bf851416 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:02:10 np0005539504 podman[223051]: 2025-11-29 07:02:10.730787309 +0000 UTC m=+0.065648895 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:02:10 np0005539504 podman[223052]: 2025-11-29 07:02:10.763106642 +0000 UTC m=+0.095740517 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:02:11 np0005539504 nova_compute[187152]: 2025-11-29 07:02:11.839 187156 INFO nova.compute.manager [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Rescuing
Nov 29 02:02:11 np0005539504 nova_compute[187152]: 2025-11-29 07:02:11.839 187156 DEBUG oslo_concurrency.lockutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:02:11 np0005539504 nova_compute[187152]: 2025-11-29 07:02:11.840 187156 DEBUG oslo_concurrency.lockutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquired lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:02:11 np0005539504 nova_compute[187152]: 2025-11-29 07:02:11.840 187156 DEBUG nova.network.neutron [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 02:02:12 np0005539504 nova_compute[187152]: 2025-11-29 07:02:12.418 187156 DEBUG nova.compute.manager [req-d89a6c69-f484-4786-a0b1-ead1f62ad39e req-611acea8-e014-475d-988f-2e1b115ddaf1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:02:12 np0005539504 nova_compute[187152]: 2025-11-29 07:02:12.418 187156 DEBUG oslo_concurrency.lockutils [req-d89a6c69-f484-4786-a0b1-ead1f62ad39e req-611acea8-e014-475d-988f-2e1b115ddaf1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:02:12 np0005539504 nova_compute[187152]: 2025-11-29 07:02:12.419 187156 DEBUG oslo_concurrency.lockutils [req-d89a6c69-f484-4786-a0b1-ead1f62ad39e req-611acea8-e014-475d-988f-2e1b115ddaf1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:02:12 np0005539504 nova_compute[187152]: 2025-11-29 07:02:12.419 187156 DEBUG oslo_concurrency.lockutils [req-d89a6c69-f484-4786-a0b1-ead1f62ad39e req-611acea8-e014-475d-988f-2e1b115ddaf1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:02:12 np0005539504 nova_compute[187152]: 2025-11-29 07:02:12.419 187156 DEBUG nova.compute.manager [req-d89a6c69-f484-4786-a0b1-ead1f62ad39e req-611acea8-e014-475d-988f-2e1b115ddaf1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] No waiting events found dispatching network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:02:12 np0005539504 nova_compute[187152]: 2025-11-29 07:02:12.420 187156 WARNING nova.compute.manager [req-d89a6c69-f484-4786-a0b1-ead1f62ad39e req-611acea8-e014-475d-988f-2e1b115ddaf1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received unexpected event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 for instance with vm_state active and task_state rescuing.
Nov 29 02:02:13 np0005539504 nova_compute[187152]: 2025-11-29 07:02:13.101 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:13 np0005539504 nova_compute[187152]: 2025-11-29 07:02:13.381 187156 DEBUG nova.network.neutron [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updating instance_info_cache with network_info: [{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:13 np0005539504 nova_compute[187152]: 2025-11-29 07:02:13.414 187156 DEBUG oslo_concurrency.lockutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Releasing lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:02:13 np0005539504 nova_compute[187152]: 2025-11-29 07:02:13.745 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:02:14 np0005539504 nova_compute[187152]: 2025-11-29 07:02:14.650 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:18 np0005539504 nova_compute[187152]: 2025-11-29 07:02:18.104 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:18 np0005539504 podman[223100]: 2025-11-29 07:02:18.719104478 +0000 UTC m=+0.064246576 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:02:19 np0005539504 nova_compute[187152]: 2025-11-29 07:02:19.652 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:22.916 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:22.917 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:22.918 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:23 np0005539504 nova_compute[187152]: 2025-11-29 07:02:23.107 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:23 np0005539504 podman[223145]: 2025-11-29 07:02:23.735924607 +0000 UTC m=+0.071078981 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 29 02:02:24 np0005539504 nova_compute[187152]: 2025-11-29 07:02:24.079 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:02:24 np0005539504 nova_compute[187152]: 2025-11-29 07:02:24.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:28 np0005539504 nova_compute[187152]: 2025-11-29 07:02:28.110 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:29 np0005539504 nova_compute[187152]: 2025-11-29 07:02:29.655 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:30 np0005539504 kernel: tap414eaffc-4b (unregistering): left promiscuous mode
Nov 29 02:02:30 np0005539504 NetworkManager[55210]: <info>  [1764399750.4698] device (tap414eaffc-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:02:30 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:30Z|00187|binding|INFO|Releasing lport 414eaffc-4b41-4a38-b224-37a122d319f3 from this chassis (sb_readonly=0)
Nov 29 02:02:30 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:30Z|00188|binding|INFO|Setting lport 414eaffc-4b41-4a38-b224-37a122d319f3 down in Southbound
Nov 29 02:02:30 np0005539504 nova_compute[187152]: 2025-11-29 07:02:30.479 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:30 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:30Z|00189|binding|INFO|Removing iface tap414eaffc-4b ovn-installed in OVS
Nov 29 02:02:30 np0005539504 nova_compute[187152]: 2025-11-29 07:02:30.482 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:30 np0005539504 nova_compute[187152]: 2025-11-29 07:02:30.492 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:30 np0005539504 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Nov 29 02:02:30 np0005539504 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003c.scope: Consumed 13.820s CPU time.
Nov 29 02:02:30 np0005539504 systemd-machined[153423]: Machine qemu-30-instance-0000003c terminated.
Nov 29 02:02:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:30.739 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:4a:7f 10.100.0.9'], port_security=['fa:16:3e:64:4a:7f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a037afc-c347-4059-964e-c318838fb9e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '633b56d013a1402db852e38aa4180fda', 'neutron:revision_number': '4', 'neutron:security_group_ids': '28fbae92-3ce1-46eb-a136-c9b63071c210', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=464f2dc0-d054-4bcf-a076-ac8d2e495428, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=414eaffc-4b41-4a38-b224-37a122d319f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:02:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:30.740 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 414eaffc-4b41-4a38-b224-37a122d319f3 in datapath 6a037afc-c347-4059-964e-c318838fb9e7 unbound from our chassis#033[00m
Nov 29 02:02:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:30.741 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6a037afc-c347-4059-964e-c318838fb9e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:02:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:30.742 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[96fd1f43-924e-49b4-b601-c21b34aa0105]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.112 187156 INFO nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance shutdown successfully after 17 seconds.#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.117 187156 INFO nova.virt.libvirt.driver [-] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance destroyed successfully.#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.118 187156 DEBUG nova.objects.instance [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'numa_topology' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.134 187156 INFO nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Attempting rescue#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.135 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.141 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.142 187156 INFO nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Creating image(s)#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.143 187156 DEBUG oslo_concurrency.lockutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.144 187156 DEBUG oslo_concurrency.lockutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.145 187156 DEBUG oslo_concurrency.lockutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.145 187156 DEBUG nova.objects.instance [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.169 187156 DEBUG oslo_concurrency.lockutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.170 187156 DEBUG oslo_concurrency.lockutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.181 187156 DEBUG oslo_concurrency.processutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.250 187156 DEBUG oslo_concurrency.processutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.251 187156 DEBUG oslo_concurrency.processutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.290 187156 DEBUG oslo_concurrency.processutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.rescue" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.291 187156 DEBUG oslo_concurrency.lockutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.291 187156 DEBUG nova.objects.instance [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'migration_context' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.307 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.308 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Start _get_guest_xml network_info=[{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "vif_mac": "fa:16:3e:64:4a:7f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.308 187156 DEBUG nova.objects.instance [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'resources' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.329 187156 WARNING nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.338 187156 DEBUG nova.virt.libvirt.host [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.339 187156 DEBUG nova.virt.libvirt.host [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.344 187156 DEBUG nova.virt.libvirt.host [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.344 187156 DEBUG nova.virt.libvirt.host [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.346 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.346 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.347 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.347 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.347 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.348 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.348 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.348 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.348 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.349 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.349 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.349 187156 DEBUG nova.virt.hardware [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.349 187156 DEBUG nova.objects.instance [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.372 187156 DEBUG nova.virt.libvirt.vif [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:01:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-108965523',display_name='tempest-ServerRescueTestJSONUnderV235-server-108965523',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-108965523',id=60,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:02:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='633b56d013a1402db852e38aa4180fda',ramdisk_id='',reservation_id='r-9sepmc2z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-433902945',owner_user_name='tempest-ServerRescueTestJSONUnderV235-433902945-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:02:10Z,user_data=None,user_id='6bde44f29f11433fbe4f716d75f87b96',uuid=5f13f55a-3763-4482-9248-7c3a4cf2e40d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "vif_mac": "fa:16:3e:64:4a:7f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.373 187156 DEBUG nova.network.os_vif_util [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Converting VIF {"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "vif_mac": "fa:16:3e:64:4a:7f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.374 187156 DEBUG nova.network.os_vif_util [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=414eaffc-4b41-4a38-b224-37a122d319f3,network=Network(6a037afc-c347-4059-964e-c318838fb9e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap414eaffc-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.376 187156 DEBUG nova.objects.instance [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.391 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <uuid>5f13f55a-3763-4482-9248-7c3a4cf2e40d</uuid>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <name>instance-0000003c</name>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-108965523</nova:name>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:02:31</nova:creationTime>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:        <nova:user uuid="6bde44f29f11433fbe4f716d75f87b96">tempest-ServerRescueTestJSONUnderV235-433902945-project-member</nova:user>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:        <nova:project uuid="633b56d013a1402db852e38aa4180fda">tempest-ServerRescueTestJSONUnderV235-433902945</nova:project>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:        <nova:port uuid="414eaffc-4b41-4a38-b224-37a122d319f3">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <entry name="serial">5f13f55a-3763-4482-9248-7c3a4cf2e40d</entry>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <entry name="uuid">5f13f55a-3763-4482-9248-7c3a4cf2e40d</entry>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.rescue"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <target dev="vdb" bus="virtio"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.config.rescue"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:64:4a:7f"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <target dev="tap414eaffc-4b"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/console.log" append="off"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:02:31 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:02:31 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:02:31 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:02:31 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.398 187156 INFO nova.virt.libvirt.driver [-] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance destroyed successfully.#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.450 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.450 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.450 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.451 187156 DEBUG nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] No VIF found with MAC fa:16:3e:64:4a:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.451 187156 INFO nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Using config drive#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.465 187156 DEBUG nova.objects.instance [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.504 187156 DEBUG nova.objects.instance [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'keypairs' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.770 187156 DEBUG nova.compute.manager [req-6ff47170-d39b-45e8-bcdc-cc54949aa242 req-3fbfd9cd-f36f-4c5f-8578-a0150e1f68c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-unplugged-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.770 187156 DEBUG oslo_concurrency.lockutils [req-6ff47170-d39b-45e8-bcdc-cc54949aa242 req-3fbfd9cd-f36f-4c5f-8578-a0150e1f68c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.771 187156 DEBUG oslo_concurrency.lockutils [req-6ff47170-d39b-45e8-bcdc-cc54949aa242 req-3fbfd9cd-f36f-4c5f-8578-a0150e1f68c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.771 187156 DEBUG oslo_concurrency.lockutils [req-6ff47170-d39b-45e8-bcdc-cc54949aa242 req-3fbfd9cd-f36f-4c5f-8578-a0150e1f68c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.771 187156 DEBUG nova.compute.manager [req-6ff47170-d39b-45e8-bcdc-cc54949aa242 req-3fbfd9cd-f36f-4c5f-8578-a0150e1f68c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] No waiting events found dispatching network-vif-unplugged-414eaffc-4b41-4a38-b224-37a122d319f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.771 187156 WARNING nova.compute.manager [req-6ff47170-d39b-45e8-bcdc-cc54949aa242 req-3fbfd9cd-f36f-4c5f-8578-a0150e1f68c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received unexpected event network-vif-unplugged-414eaffc-4b41-4a38-b224-37a122d319f3 for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.960 187156 INFO nova.virt.libvirt.driver [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Creating config drive at /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.config.rescue#033[00m
Nov 29 02:02:31 np0005539504 nova_compute[187152]: 2025-11-29 07:02:31.965 187156 DEBUG oslo_concurrency.processutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbamo1d8z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.091 187156 DEBUG oslo_concurrency.processutils [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbamo1d8z" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:32 np0005539504 kernel: tap414eaffc-4b: entered promiscuous mode
Nov 29 02:02:32 np0005539504 NetworkManager[55210]: <info>  [1764399752.1521] manager: (tap414eaffc-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Nov 29 02:02:32 np0005539504 systemd-udevd[223168]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.152 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:32Z|00190|binding|INFO|Claiming lport 414eaffc-4b41-4a38-b224-37a122d319f3 for this chassis.
Nov 29 02:02:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:32Z|00191|binding|INFO|414eaffc-4b41-4a38-b224-37a122d319f3: Claiming fa:16:3e:64:4a:7f 10.100.0.9
Nov 29 02:02:32 np0005539504 NetworkManager[55210]: <info>  [1764399752.1648] device (tap414eaffc-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:02:32 np0005539504 NetworkManager[55210]: <info>  [1764399752.1656] device (tap414eaffc-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:02:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:32Z|00192|binding|INFO|Setting lport 414eaffc-4b41-4a38-b224-37a122d319f3 up in Southbound
Nov 29 02:02:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:32.166 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:4a:7f 10.100.0.9'], port_security=['fa:16:3e:64:4a:7f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a037afc-c347-4059-964e-c318838fb9e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '633b56d013a1402db852e38aa4180fda', 'neutron:revision_number': '5', 'neutron:security_group_ids': '28fbae92-3ce1-46eb-a136-c9b63071c210', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=464f2dc0-d054-4bcf-a076-ac8d2e495428, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=414eaffc-4b41-4a38-b224-37a122d319f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.166 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:32.167 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 414eaffc-4b41-4a38-b224-37a122d319f3 in datapath 6a037afc-c347-4059-964e-c318838fb9e7 bound to our chassis#033[00m
Nov 29 02:02:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:32.168 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6a037afc-c347-4059-964e-c318838fb9e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:02:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:32Z|00193|binding|INFO|Setting lport 414eaffc-4b41-4a38-b224-37a122d319f3 ovn-installed in OVS
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.169 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:32.169 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[88fb91c7-cd69-4211-8994-976c6f555dbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.172 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:32 np0005539504 systemd-machined[153423]: New machine qemu-31-instance-0000003c.
Nov 29 02:02:32 np0005539504 systemd[1]: Started Virtual Machine qemu-31-instance-0000003c.
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.499 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 5f13f55a-3763-4482-9248-7c3a4cf2e40d due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.499 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399752.498859, 5f13f55a-3763-4482-9248-7c3a4cf2e40d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.500 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.716 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.720 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.781 187156 DEBUG nova.compute.manager [None req-b6920942-18a7-4f85-ba6a-d97d9a160563 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.790 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.790 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399752.5042334, 5f13f55a-3763-4482-9248-7c3a4cf2e40d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.790 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] VM Started (Lifecycle Event)#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.817 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.820 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:02:32 np0005539504 nova_compute[187152]: 2025-11-29 07:02:32.874 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.111 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:33 np0005539504 podman[223230]: 2025-11-29 07:02:33.743062628 +0000 UTC m=+0.073724523 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.846 187156 DEBUG nova.compute.manager [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.847 187156 DEBUG oslo_concurrency.lockutils [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.847 187156 DEBUG oslo_concurrency.lockutils [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.847 187156 DEBUG oslo_concurrency.lockutils [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.847 187156 DEBUG nova.compute.manager [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] No waiting events found dispatching network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.848 187156 WARNING nova.compute.manager [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received unexpected event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 for instance with vm_state rescued and task_state None.#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.848 187156 DEBUG nova.compute.manager [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.848 187156 DEBUG oslo_concurrency.lockutils [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.848 187156 DEBUG oslo_concurrency.lockutils [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.848 187156 DEBUG oslo_concurrency.lockutils [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.848 187156 DEBUG nova.compute.manager [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] No waiting events found dispatching network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.849 187156 WARNING nova.compute.manager [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received unexpected event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 for instance with vm_state rescued and task_state None.#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.849 187156 DEBUG nova.compute.manager [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.849 187156 DEBUG oslo_concurrency.lockutils [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.849 187156 DEBUG oslo_concurrency.lockutils [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.849 187156 DEBUG oslo_concurrency.lockutils [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.849 187156 DEBUG nova.compute.manager [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] No waiting events found dispatching network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:02:33 np0005539504 nova_compute[187152]: 2025-11-29 07:02:33.850 187156 WARNING nova.compute.manager [req-9ffdca2b-ac1a-49c7-8db1-630fd4f5e937 req-39a2e3aa-9751-4d50-b6ee-905110382499 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received unexpected event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 for instance with vm_state rescued and task_state None.#033[00m
Nov 29 02:02:34 np0005539504 nova_compute[187152]: 2025-11-29 07:02:34.658 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:37 np0005539504 nova_compute[187152]: 2025-11-29 07:02:37.099 187156 DEBUG nova.compute.manager [req-f36e0c34-7062-4f4a-ba1c-98f068d5ddb9 req-911e99f2-c2aa-4c07-9244-297e6081a834 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:37 np0005539504 nova_compute[187152]: 2025-11-29 07:02:37.099 187156 DEBUG nova.compute.manager [req-f36e0c34-7062-4f4a-ba1c-98f068d5ddb9 req-911e99f2-c2aa-4c07-9244-297e6081a834 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing instance network info cache due to event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:02:37 np0005539504 nova_compute[187152]: 2025-11-29 07:02:37.100 187156 DEBUG oslo_concurrency.lockutils [req-f36e0c34-7062-4f4a-ba1c-98f068d5ddb9 req-911e99f2-c2aa-4c07-9244-297e6081a834 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:02:37 np0005539504 nova_compute[187152]: 2025-11-29 07:02:37.100 187156 DEBUG oslo_concurrency.lockutils [req-f36e0c34-7062-4f4a-ba1c-98f068d5ddb9 req-911e99f2-c2aa-4c07-9244-297e6081a834 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:02:37 np0005539504 nova_compute[187152]: 2025-11-29 07:02:37.100 187156 DEBUG nova.network.neutron [req-f36e0c34-7062-4f4a-ba1c-98f068d5ddb9 req-911e99f2-c2aa-4c07-9244-297e6081a834 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:02:37 np0005539504 podman[223250]: 2025-11-29 07:02:37.727245828 +0000 UTC m=+0.068592574 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Nov 29 02:02:37 np0005539504 podman[223249]: 2025-11-29 07:02:37.738829391 +0000 UTC m=+0.081573155 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:02:38 np0005539504 nova_compute[187152]: 2025-11-29 07:02:38.114 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:39 np0005539504 nova_compute[187152]: 2025-11-29 07:02:39.190 187156 DEBUG nova.compute.manager [req-3d04a158-ca22-4788-85e0-631f6bcebcf4 req-59d653ed-0122-4d7c-97c0-19d8b1460ebf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:39 np0005539504 nova_compute[187152]: 2025-11-29 07:02:39.191 187156 DEBUG nova.compute.manager [req-3d04a158-ca22-4788-85e0-631f6bcebcf4 req-59d653ed-0122-4d7c-97c0-19d8b1460ebf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing instance network info cache due to event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:02:39 np0005539504 nova_compute[187152]: 2025-11-29 07:02:39.191 187156 DEBUG oslo_concurrency.lockutils [req-3d04a158-ca22-4788-85e0-631f6bcebcf4 req-59d653ed-0122-4d7c-97c0-19d8b1460ebf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:02:39 np0005539504 nova_compute[187152]: 2025-11-29 07:02:39.427 187156 DEBUG nova.network.neutron [req-f36e0c34-7062-4f4a-ba1c-98f068d5ddb9 req-911e99f2-c2aa-4c07-9244-297e6081a834 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updated VIF entry in instance network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:02:39 np0005539504 nova_compute[187152]: 2025-11-29 07:02:39.427 187156 DEBUG nova.network.neutron [req-f36e0c34-7062-4f4a-ba1c-98f068d5ddb9 req-911e99f2-c2aa-4c07-9244-297e6081a834 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updating instance_info_cache with network_info: [{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:39 np0005539504 nova_compute[187152]: 2025-11-29 07:02:39.456 187156 DEBUG oslo_concurrency.lockutils [req-f36e0c34-7062-4f4a-ba1c-98f068d5ddb9 req-911e99f2-c2aa-4c07-9244-297e6081a834 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:02:39 np0005539504 nova_compute[187152]: 2025-11-29 07:02:39.457 187156 DEBUG oslo_concurrency.lockutils [req-3d04a158-ca22-4788-85e0-631f6bcebcf4 req-59d653ed-0122-4d7c-97c0-19d8b1460ebf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:02:39 np0005539504 nova_compute[187152]: 2025-11-29 07:02:39.457 187156 DEBUG nova.network.neutron [req-3d04a158-ca22-4788-85e0-631f6bcebcf4 req-59d653ed-0122-4d7c-97c0-19d8b1460ebf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:02:39 np0005539504 nova_compute[187152]: 2025-11-29 07:02:39.661 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:41 np0005539504 nova_compute[187152]: 2025-11-29 07:02:41.615 187156 DEBUG nova.network.neutron [req-3d04a158-ca22-4788-85e0-631f6bcebcf4 req-59d653ed-0122-4d7c-97c0-19d8b1460ebf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updated VIF entry in instance network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:02:41 np0005539504 nova_compute[187152]: 2025-11-29 07:02:41.616 187156 DEBUG nova.network.neutron [req-3d04a158-ca22-4788-85e0-631f6bcebcf4 req-59d653ed-0122-4d7c-97c0-19d8b1460ebf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updating instance_info_cache with network_info: [{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:41 np0005539504 nova_compute[187152]: 2025-11-29 07:02:41.632 187156 DEBUG oslo_concurrency.lockutils [req-3d04a158-ca22-4788-85e0-631f6bcebcf4 req-59d653ed-0122-4d7c-97c0-19d8b1460ebf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:02:41 np0005539504 podman[223292]: 2025-11-29 07:02:41.706019252 +0000 UTC m=+0.050231588 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:02:41 np0005539504 podman[223293]: 2025-11-29 07:02:41.781592704 +0000 UTC m=+0.119007847 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 02:02:42 np0005539504 nova_compute[187152]: 2025-11-29 07:02:42.372 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:42 np0005539504 NetworkManager[55210]: <info>  [1764399762.3732] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Nov 29 02:02:42 np0005539504 NetworkManager[55210]: <info>  [1764399762.3738] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Nov 29 02:02:42 np0005539504 nova_compute[187152]: 2025-11-29 07:02:42.463 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:42 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:42Z|00194|binding|INFO|Releasing lport e6d6aadc-4cde-4c62-a881-70607e3666f6 from this chassis (sb_readonly=0)
Nov 29 02:02:42 np0005539504 nova_compute[187152]: 2025-11-29 07:02:42.484 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:42 np0005539504 nova_compute[187152]: 2025-11-29 07:02:42.718 187156 DEBUG nova.compute.manager [req-b51b0d48-b4dd-4fff-ae58-91e1311de84a req-02c85956-17df-42fc-8e19-ce0f695059af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:42 np0005539504 nova_compute[187152]: 2025-11-29 07:02:42.719 187156 DEBUG nova.compute.manager [req-b51b0d48-b4dd-4fff-ae58-91e1311de84a req-02c85956-17df-42fc-8e19-ce0f695059af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing instance network info cache due to event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:02:42 np0005539504 nova_compute[187152]: 2025-11-29 07:02:42.719 187156 DEBUG oslo_concurrency.lockutils [req-b51b0d48-b4dd-4fff-ae58-91e1311de84a req-02c85956-17df-42fc-8e19-ce0f695059af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:02:42 np0005539504 nova_compute[187152]: 2025-11-29 07:02:42.719 187156 DEBUG oslo_concurrency.lockutils [req-b51b0d48-b4dd-4fff-ae58-91e1311de84a req-02c85956-17df-42fc-8e19-ce0f695059af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:02:42 np0005539504 nova_compute[187152]: 2025-11-29 07:02:42.719 187156 DEBUG nova.network.neutron [req-b51b0d48-b4dd-4fff-ae58-91e1311de84a req-02c85956-17df-42fc-8e19-ce0f695059af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:02:43 np0005539504 nova_compute[187152]: 2025-11-29 07:02:43.063 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:43 np0005539504 nova_compute[187152]: 2025-11-29 07:02:43.117 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:44 np0005539504 nova_compute[187152]: 2025-11-29 07:02:44.701 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:44 np0005539504 nova_compute[187152]: 2025-11-29 07:02:44.736 187156 DEBUG nova.network.neutron [req-b51b0d48-b4dd-4fff-ae58-91e1311de84a req-02c85956-17df-42fc-8e19-ce0f695059af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updated VIF entry in instance network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:02:44 np0005539504 nova_compute[187152]: 2025-11-29 07:02:44.736 187156 DEBUG nova.network.neutron [req-b51b0d48-b4dd-4fff-ae58-91e1311de84a req-02c85956-17df-42fc-8e19-ce0f695059af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updating instance_info_cache with network_info: [{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:44 np0005539504 nova_compute[187152]: 2025-11-29 07:02:44.920 187156 DEBUG oslo_concurrency.lockutils [req-b51b0d48-b4dd-4fff-ae58-91e1311de84a req-02c85956-17df-42fc-8e19-ce0f695059af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:02:44 np0005539504 nova_compute[187152]: 2025-11-29 07:02:44.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:46 np0005539504 nova_compute[187152]: 2025-11-29 07:02:46.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.052 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.115 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Triggering sync for uuid 63f8497a-eaf6-45ec-a251-92e7903aa297 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.115 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Triggering sync for uuid 8f92b94f-11a8-44de-b605-397f29484586 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.116 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Triggering sync for uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.116 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "63f8497a-eaf6-45ec-a251-92e7903aa297" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.116 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.117 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.117 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "8f92b94f-11a8-44de-b605-397f29484586" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.117 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.118 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.156 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.161 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "8f92b94f-11a8-44de-b605-397f29484586" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.168 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.246 187156 DEBUG nova.compute.manager [req-80a72866-e678-4fee-a68f-c206a7cd767d req-195811c3-ba79-48ab-8a27-9adf3aa3e43d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.247 187156 DEBUG nova.compute.manager [req-80a72866-e678-4fee-a68f-c206a7cd767d req-195811c3-ba79-48ab-8a27-9adf3aa3e43d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing instance network info cache due to event network-changed-414eaffc-4b41-4a38-b224-37a122d319f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.247 187156 DEBUG oslo_concurrency.lockutils [req-80a72866-e678-4fee-a68f-c206a7cd767d req-195811c3-ba79-48ab-8a27-9adf3aa3e43d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.247 187156 DEBUG oslo_concurrency.lockutils [req-80a72866-e678-4fee-a68f-c206a7cd767d req-195811c3-ba79-48ab-8a27-9adf3aa3e43d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:02:47 np0005539504 nova_compute[187152]: 2025-11-29 07:02:47.247 187156 DEBUG nova.network.neutron [req-80a72866-e678-4fee-a68f-c206a7cd767d req-195811c3-ba79-48ab-8a27-9adf3aa3e43d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Refreshing network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.962 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8f92b94f-11a8-44de-b605-397f29484586', 'name': 'tempest-ServersAdminTestJSON-server-1913994908', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '80b4126e17a14d73b40158a57f19d091', 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'hostId': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.965 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'name': 'tempest-ServersAdminTestJSON-server-684406623', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000036', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '80b4126e17a14d73b40158a57f19d091', 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'hostId': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.967 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003c', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '633b56d013a1402db852e38aa4180fda', 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'hostId': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.970 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8f92b94f-11a8-44de-b605-397f29484586 / tap8b71ee8e-ab inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.970 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.973 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 63f8497a-eaf6-45ec-a251-92e7903aa297 / tap6f4282c7-12 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.973 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.975 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5f13f55a-3763-4482-9248-7c3a4cf2e40d / tap414eaffc-4b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.975 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fbc8465-c689-4335-a24a-d570ce54fec3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:47.968264', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '699f3fd4-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': 'e23472c5eeac2bb175a18c9839ea9008f712837eac3eb7b8e2fa5b3e635bb397'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:47.968264', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '699fa064-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': 'ea92053286cd68dbf8dbff91c470bd1cb21b82f988753d769f54376deb474d43'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:47.968264', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '699ff956-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': 'd007b2a23ee770d858d757ae6152dcfc24dca2eb240b26073e7982f5db5781ed'}]}, 'timestamp': '2025-11-29 07:02:47.976429', '_unique_id': '959f38e3521846139b094e0860b50509'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.978 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.980 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.980 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.981 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.981 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1efc8363-fb7c-48f3-bee1-5954eb1a92ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:47.980734', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '69a0b31e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': 'fbe9adbe2230d7c9dc236a861649b62d2a3808d941309ab2dbb2be81ab6f23a6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:47.980734', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '69a0be54-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': '2d1da099328f0b6dc430f1f07e7958863f634b10983bb00ae620c3a95dab9805'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:47.980734', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '69a0c944-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': '0ff4294c5ab20cfb99e59f76b83370fd080226b276c4c086a763e4f85d94cc5d'}]}, 'timestamp': '2025-11-29 07:02:47.981609', '_unique_id': '3380e18138074af1a3ad1a03f9b42874'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.982 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.996 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:47.997 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 nova_compute[187152]: 2025-11-29 07:02:48.002 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:48 np0005539504 nova_compute[187152]: 2025-11-29 07:02:48.003 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.007 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.007 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.026 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.usage volume: 196616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.027 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.028 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa95e0d0-ece7-4edc-a872-f9f8e82ac378', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-vda', 'timestamp': '2025-11-29T07:02:47.983882', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69a32324-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.918573312, 'message_signature': '871574c4fd9147d98f181296b39cf2b15933e3b9461ca2d39071b951ae0c11c2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'8f92b94f-11a8-44de-b605-397f29484586-sda', 'timestamp': '2025-11-29T07:02:47.983882', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69a3301c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.918573312, 'message_signature': '47def8b6d370f6d0e89e24fc8bb5fb3bebf2fb71463c373b824911dcf26428ca'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-vda', 'timestamp': '2025-11-29T07:02:47.983882', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69a4be00-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.932050097, 'message_signature': '7a7d7e120c80c9e5c204421c51aa05bafffd5f0df3d297f0b805b2dbf0c45f78'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-sda', 'timestamp': '2025-11-29T07:02:47.983882', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69a4ca26-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.932050097, 'message_signature': '888dcf06d5869a20ac708f6244a5c666111409e130b047ac106a5f19142a9a61'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196616, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': 
'633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vda', 'timestamp': '2025-11-29T07:02:47.983882', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69a7cc1c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.942501748, 'message_signature': 'f672334580f3652dbcf963c5cb455fe2a2ce93cbeeb7a99424c1a5db10e551e0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vdb', 'timestamp': '2025-11-29T07:02:47.983882', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: message_id': '69a7dc48-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.942501748, 'message_signature': 'e50362f165887b1832ec18e348b8d66f9566b1e152fad2cbfdc30764ef23051c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-sda', 'timestamp': '2025-11-29T07:02:47.983882', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69a7e986-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.942501748, 'message_signature': '2d44a8908f1122b6971b9f8478cd0d29f646080870f59918be9084c8a6a9660d'}]}, 'timestamp': '2025-11-29 07:02:48.028364', '_unique_id': '6dd7efa44e71447a96fa2805c591ab73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.031 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.031 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.031 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1913994908>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-684406623>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-108965523>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1913994908>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-684406623>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-108965523>]
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.031 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.031 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.032 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.incoming.packets volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.032 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.incoming.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b3d75d2-4b96-4fbc-b491-89ddd59ec9df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:48.031925', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '69a884cc-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': '5083ab320e7cc4e80061258b54696af5e37285a950fde8696a53626a789a87af'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 19, 
'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:48.031925', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '69a894b2-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': 'c36e0668713827d8f29ae6ec43e710fccf3197ae788c0aca8aebd6c0668f9748'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:48.031925', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '69a89f66-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': 'fc050e5374b3c8a881e74b595657dfccf4c97ef511a90afc32cb7cf028fe2c78'}]}, 'timestamp': '2025-11-29 07:02:48.032967', '_unique_id': '6b4cfd88b078483cb1e6f0624770d9a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.033 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.035 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.065 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.065 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.089 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.090 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.133 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.134 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.write.bytes volume: 135168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.134 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f2ad73f-8437-451e-a245-7750182d1b7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72998912, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-vda', 'timestamp': '2025-11-29T07:02:48.035272', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69ada52e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': '202eec7651bbf40c426f5787e53e6df13d3bac28144adc0f268a7331d86c7e3e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 
'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-sda', 'timestamp': '2025-11-29T07:02:48.035272', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69adb280-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': 'ee8a7ab90177e03d53f4e9c1289d8efac0ec2e7728aa4c1e6b3d8bead556e20b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-vda', 'timestamp': '2025-11-29T07:02:48.035272', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69b15016-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': 'f3f869ff8b5490b44a5193ea6c1dd367b3e53c3fd98d06e850c9a58b73efebee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-sda', 'timestamp': '2025-11-29T07:02:48.035272', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69b16524-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': '310477df15f56b07e03b4b0551a6c26bbc9aa295198c6657f25fd2e8905c6137'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 
'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vda', 'timestamp': '2025-11-29T07:02:48.035272', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69b81b26-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': '44a26335651712781b0a353e9708bc4ec18653d341f0bf0c32b96332f952bed7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135168, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vdb', 'timestamp': '2025-11-29T07:02:48.035272', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: eral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '69b82a30-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': '14815bec80abc60c6921597c9af16fa0eb98c3b62de9f0726be5829d63b80a9d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-sda', 'timestamp': '2025-11-29T07:02:48.035272', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69b83700-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'f33e7c47a394eea8e7f5a1460223c4f0c4b5bafa3302a546a6c656e7a9c7101a'}]}, 'timestamp': '2025-11-29 07:02:48.135161', '_unique_id': 'e8e473100df2497bbab0ff18ada1a4e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:02:48 np0005539504 nova_compute[187152]: 2025-11-29 07:02:48.154 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.155 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/memory.usage volume: 42.59765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.171 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/memory.usage volume: 42.76953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.187 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/memory.usage volume: 40.40625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dff41fd1-c4b7-4783-8b25-b533fdfd518a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.59765625, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586', 'timestamp': '2025-11-29T07:02:48.137749', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '69bb5f98-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.089754317, 'message_signature': '65a67acf08759d441d446efa551e897274eace50487db9a41f8cfdc0e2a2c147'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.76953125, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'63f8497a-eaf6-45ec-a251-92e7903aa297', 'timestamp': '2025-11-29T07:02:48.137749', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '69bdd534-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.105933284, 'message_signature': 'dec724497b37cca62e1b01de262e777cbfed4511d9d6dec99862242c9f5954a8'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.40625, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'timestamp': '2025-11-29T07:02:48.137749', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '69c04dd2-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.122181243, 'message_signature': 'f259113b085ab786ac36f35af295c53c6d7357f875af3a14e47e32ad878272cb'}]}, 'timestamp': '2025-11-29 07:02:48.188230', '_unique_id': 'd3dd024fc39c4032a9c396c09ad9ece5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.190 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.read.bytes volume: 30435840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.191 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.191 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.read.bytes volume: 30513664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.191 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.191 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.read.bytes volume: 28287488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.192 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.read.bytes volume: 8269824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.192 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85d294d3-c6d5-4568-b366-9d59d62c4cd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30435840, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-vda', 'timestamp': '2025-11-29T07:02:48.190719', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c0bd4e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': 'ce360eaa1811205d98a4246266fe890ce482e9a6043a2242309c5e1f67f7ba05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 
'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-sda', 'timestamp': '2025-11-29T07:02:48.190719', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c0c802-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': 'e8b3d313c176be25a6338af747a4b9bfa32186f76318d9690da6efa751104f05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30513664, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-vda', 'timestamp': '2025-11-29T07:02:48.190719', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c0d248-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': '9268788aa927b98d5761d99ff162f0f9e4b1c493e617c4f842e1d6ebc16dc6a9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-sda', 'timestamp': '2025-11-29T07:02:48.190719', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c0dc84-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': '88ac1bab051f53a2ff8648259a87c5b15f5a89e39f6ae231e02169ca3fc1214b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28287488, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 
'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vda', 'timestamp': '2025-11-29T07:02:48.190719', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c0e72e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'c72c01fff25aa5e9220b9a3d7ffd8607bf963f48653eb7d715d410ddbbb9ab98'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8269824, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vdb', 'timestamp': '2025-11-29T07:02:48.190719', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_re
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: ': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '69c0f098-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'd3c9989891f002f652866b5daea43616c38aca63625a0e500ed28b33b0cb171b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-sda', 'timestamp': '2025-11-29T07:02:48.190719', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c0fa7a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'c648db1b781044e449785f9187e467aac637a293a5420643e13180f0058f6eb8'}]}, 'timestamp': '2025-11-29 07:02:48.192579', '_unique_id': '0d0f6fd394154180a69aa30ab9c118cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.195 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.195 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.195 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.outgoing.bytes volume: 858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9fb5808-4088-40e8-a7a8-85b03a06f332', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:48.194975', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '69c163de-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': 'ccf00af4c8bfacb2b070bd663d7a7283b94a65cbafc820aebfe1234fca0ad1a7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:48.194975', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '69c16f50-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': '9095fed4b359dae4f3df065ca001ae3fffba771e36efc41f836794cc7a17e3e8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 858, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:48.194975', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '69c1790a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': 'e567c23c04392cc6ff576d26e790e261f38c5f44d35c72753db7cf9d1e4c672e'}]}, 'timestamp': '2025-11-29 07:02:48.195821', '_unique_id': '5e6a123364664f36881a6054ee818f14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.197 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.197 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1913994908>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-684406623>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-108965523>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1913994908>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-684406623>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-108965523>]
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.198 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.198 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1913994908>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-684406623>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-108965523>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1913994908>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-684406623>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-108965523>]
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.198 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.read.latency volume: 205015962 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.198 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.read.latency volume: 26168005 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.199 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.read.latency volume: 219010808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.199 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.read.latency volume: 17832204 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.199 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.read.latency volume: 161034161 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.199 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.read.latency volume: 127373604 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.200 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.read.latency volume: 21600837 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0864df14-5777-471a-bf95-8179ce949d67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205015962, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-vda', 'timestamp': '2025-11-29T07:02:48.198510', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c1ed40-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': '6664bdf49dc88bff9ae748853ba1f5ff432f141844a6fe8d83a1c33c87a1f20e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26168005, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': 
None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-sda', 'timestamp': '2025-11-29T07:02:48.198510', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c1f7b8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': '51f6f7c7858b58e8b5d66f22741428be0884e3e89f9f8356bab2387a7fe6944c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 219010808, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-vda', 'timestamp': '2025-11-29T07:02:48.198510', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c20104-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': '98b23b9470541bce07b35fe7c3b7d398ddec6e436539828f9adf39eec22edad2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17832204, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-sda', 'timestamp': '2025-11-29T07:02:48.198510', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c20b22-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': '9be01dd9ed96227fb10ab0387d0e0dd96fe68d6973519f79d08eb53fe31b610b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 161034161, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 
'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vda', 'timestamp': '2025-11-29T07:02:48.198510', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c2146e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': '46b16dc50d5cce944cf4666ed74be17e44ee39777f77b3ad544411687085e327'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 127373604, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vdb', 'timestamp': '2025-11-29T07:02:48.198510', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-84
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '69c21d7e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': '94e564f06eed7c9bd330768b58276cb3022ab20bac9c1c76f72b186697c156b2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21600837, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-sda', 'timestamp': '2025-11-29T07:02:48.198510', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c226d4-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': '8a8604fccdfa2899b142bd4acc46d3e5cafa84fdbff2a324a15adad7c1d239f1'}]}, 'timestamp': '2025-11-29 07:02:48.200267', '_unique_id': '5fad754239d648dba69b67c149b1c430'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.202 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.write.latency volume: 9284147793 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.202 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.202 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.write.latency volume: 4520087687 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.203 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.203 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.203 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.write.latency volume: 6801767 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.203 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b900cdd-2d47-471e-9cdf-f95853b447a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9284147793, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-vda', 'timestamp': '2025-11-29T07:02:48.202297', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c2823c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': 'db6687ce4f471dc58b3ece97b5b6df376880d5e4ff71e3699766d5d6ed39ca2a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 
'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-sda', 'timestamp': '2025-11-29T07:02:48.202297', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c28cf0-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': '05eaf297fd51e96101b4f45aeec9dcea1fa19e0a0a8323ca447a19aa94c1e972'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4520087687, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-vda', 'timestamp': '2025-11-29T07:02:48.202297', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c2965a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': '1699cfbb6dc9778d0bbad6b05672f04e41e74986b9058478736f95cffb9212c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-sda', 'timestamp': '2025-11-29T07:02:48.202297', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c29fba-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': '4c4b420c29f2d20929a826ab5b213ee2eea5a134dd40438276e8b498c2c3a102'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 
'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vda', 'timestamp': '2025-11-29T07:02:48.202297', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c2ac44-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'bd835ab1fe5a670bb5531bfc4f412581a63100fc7044f7c5ee82f0cbb0cf9b9d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6801767, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vdb', 'timestamp': '2025-11-29T07:02:48.202297', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79'
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 8, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '69c2b82e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': '18d64655ec59e39c37eae6b40cd57a522eebf5cee0749e3ee099d66a19f5c789'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-sda', 'timestamp': '2025-11-29T07:02:48.202297', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c2c1ca-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'd587c56a7ca85c5570d666a13a669aafde1c94c4c5a5b0b07618a4c0d97b2102'}]}, 'timestamp': '2025-11-29 07:02:48.204229', '_unique_id': '26654e529b7d493386cb9233d8240381'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.206 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.206 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1913994908>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-684406623>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-108965523>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1913994908>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-684406623>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-108965523>]
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.207 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.207 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.207 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.208 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.208 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.capacity volume: 117440512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.208 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.209 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00235de5-6cb5-4511-8aaf-8a6f8e6ff4f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-vda', 'timestamp': '2025-11-29T07:02:48.207006', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c33aa6-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.918573312, 'message_signature': '595fa328a1cfc56eb253631f9122d145014071c9d7ecd36ccf76bc7d4886cfc0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'8f92b94f-11a8-44de-b605-397f29484586-sda', 'timestamp': '2025-11-29T07:02:48.207006', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c348e8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.918573312, 'message_signature': 'a074e0e549427a648acde31b3890e6d79be29756ac2501ca34680bc88e0e0b99'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-vda', 'timestamp': '2025-11-29T07:02:48.207006', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c35504-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.932050097, 'message_signature': '5a851e091fdcb461f6d89cb4aa0a8da84ac66ef0d95545fb1a98800d5de6e2b2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-sda', 'timestamp': '2025-11-29T07:02:48.207006', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c361e8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.932050097, 'message_signature': '98ee130e6ad5595d399645a9745f4baf925618f07e62c5dab619ab0c25d9917f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 117440512, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': 
'633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vda', 'timestamp': '2025-11-29T07:02:48.207006', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c36f9e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.942501748, 'message_signature': 'c9cba2e871409aca3d0be08e4bcf7c3a1532170951c62c08dc0ff35e04f63d07'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vdb', 'timestamp': '2025-11-29T07:02:48.207006', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: : 1, 'disk_name': 'vdb'}, 'message_id': '69c37c96-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.942501748, 'message_signature': 'c596ad4c905dceb02a1c469543484e3a8f2b591207004c3fe43d1c3f2acd8468'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-sda', 'timestamp': '2025-11-29T07:02:48.207006', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c388e4-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.942501748, 'message_signature': '379c3dcc23bd224248a846b0d096070b6841b2c8715dc9b68237641d6f38963a'}]}, 'timestamp': '2025-11-29 07:02:48.209369', '_unique_id': 'bc9ee77292f345369eac992983626438'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.212 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.212 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.213 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.outgoing.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3080ef78-6f69-4309-b047-050952f72299', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:48.212262', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '69c40a12-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': 'f1e6502425a53cfc87f1f979f541704f062461df2b858b8b75595749a07410b3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:48.212262', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '69c418ea-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': '77bc1efc3e280b96b603047c5392133b515ca8174cb3b8b44d1bd0f43319f62d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:48.212262', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '69c425ce-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': '8044b0076b76b7cc7addb8e410adf307b87c5a53d14f31c4bd6195d2f17da43e'}]}, 'timestamp': '2025-11-29 07:02:48.213395', '_unique_id': '6527ce7490144ea1934b96a854d0b595'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.214 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.215 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.216 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.216 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f243752-7796-49dc-a819-e53c6f48d6c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:48.215917', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '69c496ee-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': 'a6ecd6c72d6a41cfdccabce148a90bc737c947809a849e98928a81c4d50946b3'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:48.215917', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '69c4a440-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': 'ed8a449e82da25b49756b8f05fd8ea3299c7886e2e43b1c93c5bb575950b2ebb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:48.215917', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '69c4b21e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': '062d97a3fca6bde10bee66bb28d97318065117275295f0617a763c2117157e86'}]}, 'timestamp': '2025-11-29 07:02:48.216985', '_unique_id': 'ab0255cbe39141b09de848af4735b68c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.218 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.219 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.incoming.bytes volume: 1514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.219 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.incoming.bytes volume: 1772 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.219 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.incoming.bytes volume: 532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a52827db-aefa-486f-8295-838d3b0eed30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1514, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:48.219239', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '69c5183a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': '1335eec0109eb54cbfc5f8c78beebfb25943a81cb2c14846582cbf78550f8f0f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1772, 'user_id': 
'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:48.219239', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '69c52348-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': 'cf58760c5cea835be81ede38246168e8eeb74c45a5b2c78c42b6e1cf883b9b4a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 532, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:48.219239', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '69c52ce4-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': 'd4a576a33bb6dfdbb807b61a6595f517f1cb5246cffb612ea04317166d9c6715'}]}, 'timestamp': '2025-11-29 07:02:48.220090', '_unique_id': 'f8eca87beba54e4a8fcc72649d9f2d98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.222 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.222 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.222 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27b28f82-1718-4ee1-8274-ce5ee06ccdc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:48.222064', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '69c584c8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': '691f0da89662916e40c6e9a251017974bb49361a268569a7cd5dab2315b78795'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:48.222064', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '69c58fd6-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': 'e222e963373307e08ae78b0a2b4fb9468100b1c35f1321af057a777e80110480'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:48.222064', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '69c5997c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': '99850c0502b899b4b2f8f5bab41354ac5db24d003570a5c6082955b459d3e626'}]}, 'timestamp': '2025-11-29 07:02:48.222864', '_unique_id': '0b2cc272768a497fbe32e70931b405ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:02:48.029 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.224 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.224 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.225 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a65801d-4bc5-4651-bdd5-e0a62edd38b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:48.224486', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '69c5e35a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': '328e67b4f9d739bb87a80b9bb42c4dc59271920450a7e2fa3b8b54462335ecdd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:48.224486', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '69c5ed32-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': '5b08eaeff129ea38ef3446c43816d0ff5d619fb25bc22502b27c91d661b39b58'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:48.224486', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '69c5f732-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': '997d6f2c89df7592f51196b24e046bbea81ae8365496c05684ef646f07fb7425'}]}, 'timestamp': '2025-11-29 07:02:48.225315', '_unique_id': '715a20bd2bad49729ec1f212ca763686'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.227 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.227 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/cpu volume: 12830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.227 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/cpu volume: 12620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.227 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/cpu volume: 12050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f3767bb-30ee-4664-84b9-4447d8cae303', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12830000000, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586', 'timestamp': '2025-11-29T07:02:48.227177', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '69c64d9a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.089754317, 'message_signature': '2c6a94b6341059d1e462948ad92f4f15a642964dcd7568dbb38844134f1a5e28'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12620000000, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'63f8497a-eaf6-45ec-a251-92e7903aa297', 'timestamp': '2025-11-29T07:02:48.227177', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '69c65baa-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.105933284, 'message_signature': '156d140ba7253ba5e6f89575f045a01591049085694371cb0c3f049435b48571'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12050000000, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'timestamp': '2025-11-29T07:02:48.227177', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '69c66744-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.122181243, 'message_signature': 'e3849903b0d76f2c6155b8ef39e9ac7b0546450a02f62676b7128f82935d379a'}]}, 'timestamp': '2025-11-29 07:02:48.228157', '_unique_id': 'c1cb3c6a745940518d3960b745468b72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.228 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.230 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.read.requests volume: 1091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.230 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.230 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.read.requests volume: 1102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.230 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.231 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.read.requests volume: 960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.231 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.read.requests volume: 450 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.231 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad0a5e51-ac36-41c2-864e-36528759b3bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1091, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-vda', 'timestamp': '2025-11-29T07:02:48.230081', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c6bf32-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': '521bb7a579d556c8e6eea344e6fd98a83f17bc28f661aceb88dbde3a225a1041'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': 
None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-sda', 'timestamp': '2025-11-29T07:02:48.230081', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c6ca18-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': 'b856125a4c5298ce8493163570f9fa99c715503da545054d7d608163d39f726e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1102, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-vda', 'timestamp': '2025-11-29T07:02:48.230081', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c6d35a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': 'd3a0fbd6b08401caa13610e8d8d1a8d27482c34b3cb0b86ef7e49fa353bbfc00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-sda', 'timestamp': '2025-11-29T07:02:48.230081', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c6dc4c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': '18ca97a43d7338b3a2de95f00f44f2133953aef43c1f53a9816f3da80a08b70a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 960, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 
'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vda', 'timestamp': '2025-11-29T07:02:48.230081', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c6e584-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'b714e264e760665c1d8ab5cc71364dd44ac5f94326b2a9487b52e11421c68be5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 450, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vdb', 'timestamp': '2025-11-29T07:02:48.230081', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: : 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '69c6efde-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'c2d3858ebdf088a9c2dadf5c2f0a7f7c212e8a9eb4daa2dce848c52f17928c40'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-sda', 'timestamp': '2025-11-29T07:02:48.230081', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c6f8f8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'd4a16cacd32bf68a6ddee3529d5da0a2d1578e0e53c06d07bdeae82e341f05ae'}]}, 'timestamp': '2025-11-29 07:02:48.231863', '_unique_id': '83d6aeca8b7e47c6bf2d3627c2f55399'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.234 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.write.requests volume: 325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.234 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.234 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.write.requests volume: 324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.234 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.235 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.235 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.write.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.235 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7279e87-6b46-49ea-8a70-f167c32c578a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 325, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-vda', 'timestamp': '2025-11-29T07:02:48.234003', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c757e4-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': 'adb57f844b512797f15fd9504075646f8661ffa7a22add829519e18746767ef8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': 
None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-sda', 'timestamp': '2025-11-29T07:02:48.234003', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c762f2-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.969977021, 'message_signature': 'ae808283d2eae3ec84a36239b5b84afae8ed9ee63619bb3850a70f03b9fadcf9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 324, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-vda', 'timestamp': '2025-11-29T07:02:48.234003', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c76d24-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': 'f0055b35f57bedeb98984727c0c38cbc53b4f830e04fd54e2f6920db7974bec3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-sda', 'timestamp': '2025-11-29T07:02:48.234003', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c77666-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.000903367, 'message_signature': 'd99d579f3d4ceabebbaa8ced3bbdea917a2225b0103958f7278e70db6e57f340'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 
'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vda', 'timestamp': '2025-11-29T07:02:48.234003', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c77f80-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': '87c939a0aab4077e585ed6a0dffc55f97e43d605432ce199708c20b012c5df16'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vdb', 'timestamp': '2025-11-29T07:02:48.234003', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-8
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: , 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '69c78a52-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': 'c2d1a24967671ccbf8ba1156c647e40fbf095e8c5de0604c5e6ab879cdc00093'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-sda', 'timestamp': '2025-11-29T07:02:48.234003', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c7939e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5231.025205563, 'message_signature': '72c929ef23efebdba76920168cbca22a73040947bd53c29a7272bcff0e00176c'}]}, 'timestamp': '2025-11-29 07:02:48.235821', '_unique_id': 'b490df1fa7734f67ab8f9158f4ae615c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.238 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.238 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.238 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.238 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.239 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.239 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.239 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e61479d9-492b-4814-864a-cfedd155a01b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '8f92b94f-11a8-44de-b605-397f29484586-vda', 'timestamp': '2025-11-29T07:02:48.238149', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c7f9d8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.918573312, 'message_signature': '3159f9a5f6c22d1e69a76570585a6ba161e109c425abb432e01fca6d38820286'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'8f92b94f-11a8-44de-b605-397f29484586-sda', 'timestamp': '2025-11-29T07:02:48.238149', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'instance-0000003a', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c805fe-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.918573312, 'message_signature': '18c5043f488464245e350a1083d5ae0dfbb700e2f5b40bfa93d4da4cbb32f356'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-vda', 'timestamp': '2025-11-29T07:02:48.238149', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c80fcc-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.932050097, 'message_signature': 'd59e08c5b6450ba0a2df6805502bf5fa51bcb4d03ae5efd9fdbaf69076b5dd51'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '63f8497a-eaf6-45ec-a251-92e7903aa297-sda', 'timestamp': '2025-11-29T07:02:48.238149', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'instance-00000036', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c81936-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.932050097, 'message_signature': '1b6bc9fa22dc2af1b166979973f4608ca628e7ad6c99a9d1163adc12cc40b426'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 
'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vda', 'timestamp': '2025-11-29T07:02:48.238149', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c823b8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.942501748, 'message_signature': '0445e7d22bde75b096264441db7eb316fe25437a5ef20ae4e3b72431da0509aa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-vdb', 'timestamp': '2025-11-29T07:02:48.238149', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': '
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: gb': 1, 'disk_name': 'vdb'}, 'message_id': '69c82d0e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.942501748, 'message_signature': '2a3fa91458d2fb54678fc1ef2185d9f235b36a766d91fc87ec7d26359098a5a4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d-sda', 'timestamp': '2025-11-29T07:02:48.238149', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'instance-0000003c', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': '7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c83614-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.942501748, 'message_signature': 'd15b6ccd1711faed1698a1e7881ded68e0f2469a44941ef57afc3adeeffd4e0d'}]}, 'timestamp': '2025-11-29 07:02:48.239979', '_unique_id': '0f5f2681e131417ca8179f550ae4a9b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:02:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.242 12 DEBUG ceilometer.compute.pollsters [-] 8f92b94f-11a8-44de-b605-397f29484586/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.242 12 DEBUG ceilometer.compute.pollsters [-] 63f8497a-eaf6-45ec-a251-92e7903aa297/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.242 12 DEBUG ceilometer.compute.pollsters [-] 5f13f55a-3763-4482-9248-7c3a4cf2e40d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6feaeff-93c6-4c13-8d82-d4668ad3f97a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-0000003a-8f92b94f-11a8-44de-b605-397f29484586-tap8b71ee8e-ab', 'timestamp': '2025-11-29T07:02:48.242105', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1913994908', 'name': 'tap8b71ee8e-ab', 'instance_id': '8f92b94f-11a8-44de-b605-397f29484586', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:47:74:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b71ee8e-ab'}, 'message_id': '69c89474-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.90294859, 'message_signature': '2279b8eae1d81c4d482ff4d6e363c4a183fd3182b5408b2eb2855d1e3fc99712'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000036-63f8497a-eaf6-45ec-a251-92e7903aa297-tap6f4282c7-12', 'timestamp': '2025-11-29T07:02:48.242105', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-684406623', 'name': 'tap6f4282c7-12', 'instance_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'instance_type': 'm1.nano', 'host': '56fd35b39793119b2a6e96bba8711e50e145442ebb8df1b601955994', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:5f:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6f4282c7-12'}, 'message_id': '69c89fe6-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.906361492, 'message_signature': '003eb60f3ea8db87495f5a6cc075a576cb3f7fa639407c357e92e93fa17bbdb7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6bde44f29f11433fbe4f716d75f87b96', 'user_name': None, 'project_id': '633b56d013a1402db852e38aa4180fda', 'project_name': None, 'resource_id': 'instance-0000003c-5f13f55a-3763-4482-9248-7c3a4cf2e40d-tap414eaffc-4b', 'timestamp': '2025-11-29T07:02:48.242105', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-108965523', 'name': 'tap414eaffc-4b', 'instance_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'instance_type': 'm1.nano', 'host': 
'7dee81dcd85aa986fd6daae8b213ece4ce4b42492cd838e0405511dc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:64:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap414eaffc-4b'}, 'message_id': '69c8a9a0-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5230.908674995, 'message_signature': '450ed8341e552e8547cb609352e7e3713e0eec14091f651309cd8ade81ea72dd'}]}, 'timestamp': '2025-11-29 07:02:48.242943', '_unique_id': '41d177ba950843b8bb1527bd0bff92b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:02:48.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:02:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:02:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:02:48.204 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:02:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:02:48.210 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:02:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:02:48.232 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:02:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:02:48.236 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:02:48 np0005539504 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:02:48.240 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:02:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:48.532 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:02:48 np0005539504 nova_compute[187152]: 2025-11-29 07:02:48.533 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:48.534 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:02:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:48Z|00195|binding|INFO|Releasing lport e6d6aadc-4cde-4c62-a881-70607e3666f6 from this chassis (sb_readonly=0)
Nov 29 02:02:48 np0005539504 nova_compute[187152]: 2025-11-29 07:02:48.719 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:48 np0005539504 nova_compute[187152]: 2025-11-29 07:02:48.774 187156 DEBUG nova.network.neutron [req-80a72866-e678-4fee-a68f-c206a7cd767d req-195811c3-ba79-48ab-8a27-9adf3aa3e43d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updated VIF entry in instance network info cache for port 414eaffc-4b41-4a38-b224-37a122d319f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:02:48 np0005539504 nova_compute[187152]: 2025-11-29 07:02:48.775 187156 DEBUG nova.network.neutron [req-80a72866-e678-4fee-a68f-c206a7cd767d req-195811c3-ba79-48ab-8a27-9adf3aa3e43d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updating instance_info_cache with network_info: [{"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:48 np0005539504 nova_compute[187152]: 2025-11-29 07:02:48.796 187156 DEBUG oslo_concurrency.lockutils [req-80a72866-e678-4fee-a68f-c206a7cd767d req-195811c3-ba79-48ab-8a27-9adf3aa3e43d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5f13f55a-3763-4482-9248-7c3a4cf2e40d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:02:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:48Z|00196|binding|INFO|Releasing lport e6d6aadc-4cde-4c62-a881-70607e3666f6 from this chassis (sb_readonly=0)
Nov 29 02:02:48 np0005539504 nova_compute[187152]: 2025-11-29 07:02:48.876 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:49 np0005539504 nova_compute[187152]: 2025-11-29 07:02:49.704 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:49 np0005539504 podman[223361]: 2025-11-29 07:02:49.727425916 +0000 UTC m=+0.062078778 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Nov 29 02:02:50 np0005539504 nova_compute[187152]: 2025-11-29 07:02:50.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:50 np0005539504 nova_compute[187152]: 2025-11-29 07:02:50.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:02:51 np0005539504 nova_compute[187152]: 2025-11-29 07:02:51.140 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:02:51 np0005539504 nova_compute[187152]: 2025-11-29 07:02:51.140 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:02:51 np0005539504 nova_compute[187152]: 2025-11-29 07:02:51.141 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:02:52 np0005539504 nova_compute[187152]: 2025-11-29 07:02:52.935 187156 DEBUG oslo_concurrency.lockutils [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:52 np0005539504 nova_compute[187152]: 2025-11-29 07:02:52.935 187156 DEBUG oslo_concurrency.lockutils [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:52 np0005539504 nova_compute[187152]: 2025-11-29 07:02:52.936 187156 DEBUG oslo_concurrency.lockutils [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:52 np0005539504 nova_compute[187152]: 2025-11-29 07:02:52.936 187156 DEBUG oslo_concurrency.lockutils [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:52 np0005539504 nova_compute[187152]: 2025-11-29 07:02:52.936 187156 DEBUG oslo_concurrency.lockutils [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:52 np0005539504 nova_compute[187152]: 2025-11-29 07:02:52.949 187156 INFO nova.compute.manager [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Terminating instance#033[00m
Nov 29 02:02:52 np0005539504 nova_compute[187152]: 2025-11-29 07:02:52.961 187156 DEBUG nova.compute.manager [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:02:52 np0005539504 kernel: tap414eaffc-4b (unregistering): left promiscuous mode
Nov 29 02:02:52 np0005539504 NetworkManager[55210]: <info>  [1764399772.9845] device (tap414eaffc-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:02:52 np0005539504 nova_compute[187152]: 2025-11-29 07:02:52.993 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:52Z|00197|binding|INFO|Releasing lport 414eaffc-4b41-4a38-b224-37a122d319f3 from this chassis (sb_readonly=0)
Nov 29 02:02:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:52Z|00198|binding|INFO|Setting lport 414eaffc-4b41-4a38-b224-37a122d319f3 down in Southbound
Nov 29 02:02:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:02:52Z|00199|binding|INFO|Removing iface tap414eaffc-4b ovn-installed in OVS
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.004 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:53 np0005539504 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Nov 29 02:02:53 np0005539504 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000003c.scope: Consumed 13.317s CPU time.
Nov 29 02:02:53 np0005539504 systemd-machined[153423]: Machine qemu-31-instance-0000003c terminated.
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.156 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.179 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.184 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:53.193 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:4a:7f 10.100.0.9'], port_security=['fa:16:3e:64:4a:7f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5f13f55a-3763-4482-9248-7c3a4cf2e40d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a037afc-c347-4059-964e-c318838fb9e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '633b56d013a1402db852e38aa4180fda', 'neutron:revision_number': '8', 'neutron:security_group_ids': '28fbae92-3ce1-46eb-a136-c9b63071c210', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=464f2dc0-d054-4bcf-a076-ac8d2e495428, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=414eaffc-4b41-4a38-b224-37a122d319f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:02:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:53.194 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 414eaffc-4b41-4a38-b224-37a122d319f3 in datapath 6a037afc-c347-4059-964e-c318838fb9e7 unbound from our chassis#033[00m
Nov 29 02:02:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:53.195 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6a037afc-c347-4059-964e-c318838fb9e7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:02:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:53.198 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e7b5b4f8-9ce5-48a0-af1a-8247eff02ab0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.242 187156 INFO nova.virt.libvirt.driver [-] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Instance destroyed successfully.#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.244 187156 DEBUG nova.objects.instance [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lazy-loading 'resources' on Instance uuid 5f13f55a-3763-4482-9248-7c3a4cf2e40d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.292 187156 DEBUG nova.virt.libvirt.vif [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:01:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-108965523',display_name='tempest-ServerRescueTestJSONUnderV235-server-108965523',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-108965523',id=60,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:02:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='633b56d013a1402db852e38aa4180fda',ramdisk_id='',reservation_id='r-9sepmc2z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-433902945',owner_user_name='tempest-ServerRescueTestJSONUnderV235-433902945-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:02:32Z,user_data=None,user_id='6bde44f29f11433fbe4f716d75f87b96',uuid=5f13f55a-3763-4482-9248-7c3a4cf2e40d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.293 187156 DEBUG nova.network.os_vif_util [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Converting VIF {"id": "414eaffc-4b41-4a38-b224-37a122d319f3", "address": "fa:16:3e:64:4a:7f", "network": {"id": "6a037afc-c347-4059-964e-c318838fb9e7", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1081519173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "633b56d013a1402db852e38aa4180fda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap414eaffc-4b", "ovs_interfaceid": "414eaffc-4b41-4a38-b224-37a122d319f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.293 187156 DEBUG nova.network.os_vif_util [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=414eaffc-4b41-4a38-b224-37a122d319f3,network=Network(6a037afc-c347-4059-964e-c318838fb9e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap414eaffc-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.294 187156 DEBUG os_vif [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=414eaffc-4b41-4a38-b224-37a122d319f3,network=Network(6a037afc-c347-4059-964e-c318838fb9e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap414eaffc-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.296 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.296 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap414eaffc-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.298 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.299 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.304 187156 INFO os_vif [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=414eaffc-4b41-4a38-b224-37a122d319f3,network=Network(6a037afc-c347-4059-964e-c318838fb9e7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap414eaffc-4b')#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.304 187156 INFO nova.virt.libvirt.driver [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Deleting instance files /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d_del#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.305 187156 INFO nova.virt.libvirt.driver [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Deletion of /var/lib/nova/instances/5f13f55a-3763-4482-9248-7c3a4cf2e40d_del complete#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.449 187156 INFO nova.compute.manager [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Took 0.49 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.450 187156 DEBUG oslo.service.loopingcall [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.450 187156 DEBUG nova.compute.manager [-] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.450 187156 DEBUG nova.network.neutron [-] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.703 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Updating instance_info_cache with network_info: [{"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.719 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-8f92b94f-11a8-44de-b605-397f29484586" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.720 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.720 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.720 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.947 187156 DEBUG nova.compute.manager [req-0664c381-0b09-4a0e-aa0e-b32792544b23 req-88e6b6f0-5608-4640-abc3-adce879dd650 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-unplugged-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.948 187156 DEBUG oslo_concurrency.lockutils [req-0664c381-0b09-4a0e-aa0e-b32792544b23 req-88e6b6f0-5608-4640-abc3-adce879dd650 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.948 187156 DEBUG oslo_concurrency.lockutils [req-0664c381-0b09-4a0e-aa0e-b32792544b23 req-88e6b6f0-5608-4640-abc3-adce879dd650 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.948 187156 DEBUG oslo_concurrency.lockutils [req-0664c381-0b09-4a0e-aa0e-b32792544b23 req-88e6b6f0-5608-4640-abc3-adce879dd650 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.948 187156 DEBUG nova.compute.manager [req-0664c381-0b09-4a0e-aa0e-b32792544b23 req-88e6b6f0-5608-4640-abc3-adce879dd650 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] No waiting events found dispatching network-vif-unplugged-414eaffc-4b41-4a38-b224-37a122d319f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:02:53 np0005539504 nova_compute[187152]: 2025-11-29 07:02:53.949 187156 DEBUG nova.compute.manager [req-0664c381-0b09-4a0e-aa0e-b32792544b23 req-88e6b6f0-5608-4640-abc3-adce879dd650 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-unplugged-414eaffc-4b41-4a38-b224-37a122d319f3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:02:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:02:54.536 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:54 np0005539504 nova_compute[187152]: 2025-11-29 07:02:54.705 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:54 np0005539504 podman[223407]: 2025-11-29 07:02:54.72707279 +0000 UTC m=+0.061173704 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:02:54 np0005539504 nova_compute[187152]: 2025-11-29 07:02:54.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:54 np0005539504 nova_compute[187152]: 2025-11-29 07:02:54.960 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:54 np0005539504 nova_compute[187152]: 2025-11-29 07:02:54.960 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:54 np0005539504 nova_compute[187152]: 2025-11-29 07:02:54.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:54 np0005539504 nova_compute[187152]: 2025-11-29 07:02:54.961 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.795 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.855 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.856 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.903 187156 DEBUG nova.network.neutron [-] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.912 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.917 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.969 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.970 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.989 187156 INFO nova.compute.manager [-] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Took 2.54 seconds to deallocate network for instance.#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.991 187156 DEBUG nova.compute.manager [req-ce8f27ac-73cb-4b76-ab27-70fea6a0f88b req-f6045731-493e-4291-9e12-0da28b56b3b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-deleted-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.992 187156 INFO nova.compute.manager [req-ce8f27ac-73cb-4b76-ab27-70fea6a0f88b req-f6045731-493e-4291-9e12-0da28b56b3b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Neutron deleted interface 414eaffc-4b41-4a38-b224-37a122d319f3; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:02:55 np0005539504 nova_compute[187152]: 2025-11-29 07:02:55.992 187156 DEBUG nova.network.neutron [req-ce8f27ac-73cb-4b76-ab27-70fea6a0f88b req-f6045731-493e-4291-9e12-0da28b56b3b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.024 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.064 187156 DEBUG nova.compute.manager [req-ce8f27ac-73cb-4b76-ab27-70fea6a0f88b req-f6045731-493e-4291-9e12-0da28b56b3b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Detach interface failed, port_id=414eaffc-4b41-4a38-b224-37a122d319f3, reason: Instance 5f13f55a-3763-4482-9248-7c3a4cf2e40d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.073 187156 DEBUG nova.compute.manager [req-8ab5cedb-6609-4767-bd07-338d7f021973 req-80ae3456-76e0-4f9f-a99c-ddf8bf532098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.073 187156 DEBUG oslo_concurrency.lockutils [req-8ab5cedb-6609-4767-bd07-338d7f021973 req-80ae3456-76e0-4f9f-a99c-ddf8bf532098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.074 187156 DEBUG oslo_concurrency.lockutils [req-8ab5cedb-6609-4767-bd07-338d7f021973 req-80ae3456-76e0-4f9f-a99c-ddf8bf532098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.074 187156 DEBUG oslo_concurrency.lockutils [req-8ab5cedb-6609-4767-bd07-338d7f021973 req-80ae3456-76e0-4f9f-a99c-ddf8bf532098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.074 187156 DEBUG nova.compute.manager [req-8ab5cedb-6609-4767-bd07-338d7f021973 req-80ae3456-76e0-4f9f-a99c-ddf8bf532098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] No waiting events found dispatching network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.075 187156 WARNING nova.compute.manager [req-8ab5cedb-6609-4767-bd07-338d7f021973 req-80ae3456-76e0-4f9f-a99c-ddf8bf532098 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Received unexpected event network-vif-plugged-414eaffc-4b41-4a38-b224-37a122d319f3 for instance with vm_state rescued and task_state deleting.#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.187 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.189 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5385MB free_disk=73.14356231689453GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.189 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.189 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.321 187156 DEBUG oslo_concurrency.lockutils [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.403 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 63f8497a-eaf6-45ec-a251-92e7903aa297 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.404 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 8f92b94f-11a8-44de-b605-397f29484586 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.404 187156 WARNING nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 5f13f55a-3763-4482-9248-7c3a4cf2e40d is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.404 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.405 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.492 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.509 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.538 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.539 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.539 187156 DEBUG oslo_concurrency.lockutils [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.558 187156 DEBUG oslo_concurrency.lockutils [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:56 np0005539504 nova_compute[187152]: 2025-11-29 07:02:56.993 187156 INFO nova.scheduler.client.report [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Deleted allocations for instance 5f13f55a-3763-4482-9248-7c3a4cf2e40d#033[00m
Nov 29 02:02:57 np0005539504 nova_compute[187152]: 2025-11-29 07:02:57.084 187156 DEBUG oslo_concurrency.lockutils [None req-85f44b92-770b-4952-82a4-df2f0025e555 6bde44f29f11433fbe4f716d75f87b96 633b56d013a1402db852e38aa4180fda - - default default] Lock "5f13f55a-3763-4482-9248-7c3a4cf2e40d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:58 np0005539504 nova_compute[187152]: 2025-11-29 07:02:58.300 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:59 np0005539504 nova_compute[187152]: 2025-11-29 07:02:59.707 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:03 np0005539504 nova_compute[187152]: 2025-11-29 07:03:03.303 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:04 np0005539504 nova_compute[187152]: 2025-11-29 07:03:04.710 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:04 np0005539504 podman[223441]: 2025-11-29 07:03:04.74676679 +0000 UTC m=+0.065606284 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:03:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:05Z|00200|binding|INFO|Releasing lport e6d6aadc-4cde-4c62-a881-70607e3666f6 from this chassis (sb_readonly=0)
Nov 29 02:03:05 np0005539504 nova_compute[187152]: 2025-11-29 07:03:05.181 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:08 np0005539504 nova_compute[187152]: 2025-11-29 07:03:08.242 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399773.2396586, 5f13f55a-3763-4482-9248-7c3a4cf2e40d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:08 np0005539504 nova_compute[187152]: 2025-11-29 07:03:08.242 187156 INFO nova.compute.manager [-] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:03:08 np0005539504 nova_compute[187152]: 2025-11-29 07:03:08.264 187156 DEBUG nova.compute.manager [None req-e506b7bf-8b06-4e91-b4c7-49b6ddfb2f7d - - - - - -] [instance: 5f13f55a-3763-4482-9248-7c3a4cf2e40d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:08 np0005539504 nova_compute[187152]: 2025-11-29 07:03:08.307 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:08 np0005539504 podman[223460]: 2025-11-29 07:03:08.719480731 +0000 UTC m=+0.056579820 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:03:08 np0005539504 podman[223461]: 2025-11-29 07:03:08.721221188 +0000 UTC m=+0.057607057 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, vcs-type=git)
Nov 29 02:03:09 np0005539504 nova_compute[187152]: 2025-11-29 07:03:09.711 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:12 np0005539504 podman[223503]: 2025-11-29 07:03:12.717991468 +0000 UTC m=+0.049807366 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:03:12 np0005539504 podman[223504]: 2025-11-29 07:03:12.761596416 +0000 UTC m=+0.088922483 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.226 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "de7b880e-4675-4881-a025-e6663ea2477f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.227 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.259 187156 DEBUG nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.310 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.391 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.392 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.403 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.404 187156 INFO nova.compute.claims [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.571 187156 DEBUG nova.compute.provider_tree [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.590 187156 DEBUG nova.scheduler.client.report [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.615 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.616 187156 DEBUG nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.676 187156 DEBUG nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.677 187156 DEBUG nova.network.neutron [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.710 187156 INFO nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.731 187156 DEBUG nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.855 187156 DEBUG nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.857 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.858 187156 INFO nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Creating image(s)#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.859 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.859 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.860 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.875 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.932 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.933 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.934 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.948 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:13 np0005539504 nova_compute[187152]: 2025-11-29 07:03:13.966 187156 DEBUG nova.policy [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e9e3e9e61ce2488b9054c4600ce9414e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8dcf86bfd19147f7bdf78ae3ae8da3dc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.001 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.001 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.030 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.032 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.032 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.083 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.084 187156 DEBUG nova.virt.disk.api [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Checking if we can resize image /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.084 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.141 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.142 187156 DEBUG nova.virt.disk.api [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Cannot resize image /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.142 187156 DEBUG nova.objects.instance [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lazy-loading 'migration_context' on Instance uuid de7b880e-4675-4881-a025-e6663ea2477f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.156 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.157 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Ensure instance console log exists: /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.157 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.158 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.158 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:14 np0005539504 nova_compute[187152]: 2025-11-29 07:03:14.714 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:15 np0005539504 nova_compute[187152]: 2025-11-29 07:03:15.149 187156 DEBUG nova.network.neutron [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Successfully created port: d46aa91e-6bad-43f6-8140-e425e73f9e24 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:03:16 np0005539504 nova_compute[187152]: 2025-11-29 07:03:16.858 187156 DEBUG nova.network.neutron [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Successfully updated port: d46aa91e-6bad-43f6-8140-e425e73f9e24 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:03:16 np0005539504 nova_compute[187152]: 2025-11-29 07:03:16.878 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "refresh_cache-de7b880e-4675-4881-a025-e6663ea2477f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:03:16 np0005539504 nova_compute[187152]: 2025-11-29 07:03:16.879 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquired lock "refresh_cache-de7b880e-4675-4881-a025-e6663ea2477f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:03:16 np0005539504 nova_compute[187152]: 2025-11-29 07:03:16.879 187156 DEBUG nova.network.neutron [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:03:17 np0005539504 nova_compute[187152]: 2025-11-29 07:03:17.108 187156 DEBUG nova.network.neutron [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:03:17 np0005539504 nova_compute[187152]: 2025-11-29 07:03:17.646 187156 DEBUG nova.compute.manager [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Received event network-changed-d46aa91e-6bad-43f6-8140-e425e73f9e24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:17 np0005539504 nova_compute[187152]: 2025-11-29 07:03:17.647 187156 DEBUG nova.compute.manager [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Refreshing instance network info cache due to event network-changed-d46aa91e-6bad-43f6-8140-e425e73f9e24. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:03:17 np0005539504 nova_compute[187152]: 2025-11-29 07:03:17.647 187156 DEBUG oslo_concurrency.lockutils [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-de7b880e-4675-4881-a025-e6663ea2477f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.246 187156 DEBUG nova.network.neutron [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Updating instance_info_cache with network_info: [{"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.314 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.453 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Releasing lock "refresh_cache-de7b880e-4675-4881-a025-e6663ea2477f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.454 187156 DEBUG nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Instance network_info: |[{"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.454 187156 DEBUG oslo_concurrency.lockutils [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-de7b880e-4675-4881-a025-e6663ea2477f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.455 187156 DEBUG nova.network.neutron [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Refreshing network info cache for port d46aa91e-6bad-43f6-8140-e425e73f9e24 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.458 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Start _get_guest_xml network_info=[{"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.462 187156 WARNING nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.467 187156 DEBUG nova.virt.libvirt.host [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.468 187156 DEBUG nova.virt.libvirt.host [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.473 187156 DEBUG nova.virt.libvirt.host [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.474 187156 DEBUG nova.virt.libvirt.host [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.475 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.476 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.476 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.476 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.477 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.477 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.477 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.477 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.478 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.478 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.478 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.478 187156 DEBUG nova.virt.hardware [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.483 187156 DEBUG nova.virt.libvirt.vif [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1476174431',display_name='tempest-InstanceActionsTestJSON-server-1476174431',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1476174431',id=62,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8dcf86bfd19147f7bdf78ae3ae8da3dc',ramdisk_id='',reservation_id='r-0z0sylhn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-2017952800',owner_user_name='tempest-InstanceActionsTestJSON-2017952800-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:03:13Z,user_data=None,user_id='e9e3e9e61ce2488b9054c4600ce9414e',uuid=de7b880e-4675-4881-a025-e6663ea2477f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.483 187156 DEBUG nova.network.os_vif_util [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converting VIF {"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.484 187156 DEBUG nova.network.os_vif_util [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.485 187156 DEBUG nova.objects.instance [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lazy-loading 'pci_devices' on Instance uuid de7b880e-4675-4881-a025-e6663ea2477f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.503 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <uuid>de7b880e-4675-4881-a025-e6663ea2477f</uuid>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <name>instance-0000003e</name>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <nova:name>tempest-InstanceActionsTestJSON-server-1476174431</nova:name>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:03:18</nova:creationTime>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:        <nova:user uuid="e9e3e9e61ce2488b9054c4600ce9414e">tempest-InstanceActionsTestJSON-2017952800-project-member</nova:user>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:        <nova:project uuid="8dcf86bfd19147f7bdf78ae3ae8da3dc">tempest-InstanceActionsTestJSON-2017952800</nova:project>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:        <nova:port uuid="d46aa91e-6bad-43f6-8140-e425e73f9e24">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <entry name="serial">de7b880e-4675-4881-a025-e6663ea2477f</entry>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <entry name="uuid">de7b880e-4675-4881-a025-e6663ea2477f</entry>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.config"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:7b:44:f1"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <target dev="tapd46aa91e-6b"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/console.log" append="off"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:03:18 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:03:18 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:03:18 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:03:18 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.505 187156 DEBUG nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Preparing to wait for external event network-vif-plugged-d46aa91e-6bad-43f6-8140-e425e73f9e24 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.505 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "de7b880e-4675-4881-a025-e6663ea2477f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.505 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.506 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.507 187156 DEBUG nova.virt.libvirt.vif [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1476174431',display_name='tempest-InstanceActionsTestJSON-server-1476174431',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1476174431',id=62,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8dcf86bfd19147f7bdf78ae3ae8da3dc',ramdisk_id='',reservation_id='r-0z0sylhn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-2017952800',owner_user_name='tempest-I
nstanceActionsTestJSON-2017952800-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:03:13Z,user_data=None,user_id='e9e3e9e61ce2488b9054c4600ce9414e',uuid=de7b880e-4675-4881-a025-e6663ea2477f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.507 187156 DEBUG nova.network.os_vif_util [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converting VIF {"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.508 187156 DEBUG nova.network.os_vif_util [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.508 187156 DEBUG os_vif [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.509 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.509 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.510 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.513 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.514 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd46aa91e-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.514 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd46aa91e-6b, col_values=(('external_ids', {'iface-id': 'd46aa91e-6bad-43f6-8140-e425e73f9e24', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:44:f1', 'vm-uuid': 'de7b880e-4675-4881-a025-e6663ea2477f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.515 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:18 np0005539504 NetworkManager[55210]: <info>  [1764399798.5176] manager: (tapd46aa91e-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.518 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.524 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:18 np0005539504 nova_compute[187152]: 2025-11-29 07:03:18.526 187156 INFO os_vif [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b')#033[00m
Nov 29 02:03:19 np0005539504 nova_compute[187152]: 2025-11-29 07:03:19.175 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:03:19 np0005539504 nova_compute[187152]: 2025-11-29 07:03:19.177 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:03:19 np0005539504 nova_compute[187152]: 2025-11-29 07:03:19.177 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] No VIF found with MAC fa:16:3e:7b:44:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:03:19 np0005539504 nova_compute[187152]: 2025-11-29 07:03:19.178 187156 INFO nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Using config drive#033[00m
Nov 29 02:03:19 np0005539504 nova_compute[187152]: 2025-11-29 07:03:19.761 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:20 np0005539504 podman[223567]: 2025-11-29 07:03:20.730339855 +0000 UTC m=+0.067599767 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:03:21 np0005539504 nova_compute[187152]: 2025-11-29 07:03:21.652 187156 INFO nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Creating config drive at /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.config#033[00m
Nov 29 02:03:21 np0005539504 nova_compute[187152]: 2025-11-29 07:03:21.659 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj9j8n7pf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:21 np0005539504 nova_compute[187152]: 2025-11-29 07:03:21.790 187156 DEBUG oslo_concurrency.processutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj9j8n7pf" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:21 np0005539504 kernel: tapd46aa91e-6b: entered promiscuous mode
Nov 29 02:03:21 np0005539504 nova_compute[187152]: 2025-11-29 07:03:21.873 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:21Z|00201|binding|INFO|Claiming lport d46aa91e-6bad-43f6-8140-e425e73f9e24 for this chassis.
Nov 29 02:03:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:21Z|00202|binding|INFO|d46aa91e-6bad-43f6-8140-e425e73f9e24: Claiming fa:16:3e:7b:44:f1 10.100.0.4
Nov 29 02:03:21 np0005539504 NetworkManager[55210]: <info>  [1764399801.8757] manager: (tapd46aa91e-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Nov 29 02:03:21 np0005539504 nova_compute[187152]: 2025-11-29 07:03:21.878 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:21 np0005539504 systemd-udevd[223604]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:03:21 np0005539504 NetworkManager[55210]: <info>  [1764399801.9239] device (tapd46aa91e-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:03:21 np0005539504 NetworkManager[55210]: <info>  [1764399801.9250] device (tapd46aa91e-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:03:21 np0005539504 systemd-machined[153423]: New machine qemu-32-instance-0000003e.
Nov 29 02:03:21 np0005539504 nova_compute[187152]: 2025-11-29 07:03:21.930 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:21Z|00203|binding|INFO|Setting lport d46aa91e-6bad-43f6-8140-e425e73f9e24 ovn-installed in OVS
Nov 29 02:03:21 np0005539504 nova_compute[187152]: 2025-11-29 07:03:21.935 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:21 np0005539504 systemd[1]: Started Virtual Machine qemu-32-instance-0000003e.
Nov 29 02:03:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:21Z|00204|binding|INFO|Setting lport d46aa91e-6bad-43f6-8140-e425e73f9e24 up in Southbound
Nov 29 02:03:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:21.946 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:44:f1 10.100.0.4'], port_security=['fa:16:3e:7b:44:f1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'de7b880e-4675-4881-a025-e6663ea2477f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8dcf86bfd19147f7bdf78ae3ae8da3dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b84a64cd-1934-4e00-978d-310080bf5c18', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be9674f6-9c2a-4960-9c2d-8409ecc561c9, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=d46aa91e-6bad-43f6-8140-e425e73f9e24) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:21.948 104164 INFO neutron.agent.ovn.metadata.agent [-] Port d46aa91e-6bad-43f6-8140-e425e73f9e24 in datapath 89b1e105-b8a0-4492-b6a7-459a78b991cf bound to our chassis#033[00m
Nov 29 02:03:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:21.951 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89b1e105-b8a0-4492-b6a7-459a78b991cf#033[00m
Nov 29 02:03:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:21.965 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c06e53b1-e7e0-4807-8dab-e95fdb9fe4cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:21.966 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89b1e105-b1 in ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:03:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:21.970 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89b1e105-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:03:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:21.970 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd4b752-8672-4671-a7fe-8a99ee7db2fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:21.971 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b26866b5-ffd2-4d94-9311-2f5fb564e010]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:21.988 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[3e47beef-e5f0-4eeb-ae5c-0500e97f3e97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.006 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ce868645-1e88-47be-b091-bdf976a953ec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.048 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1b012f9f-45e9-4488-83af-5048380f6fc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 NetworkManager[55210]: <info>  [1764399802.0582] manager: (tap89b1e105-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.057 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[72f94150-94f0-4934-bdc7-5ff62ff8e0f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.093 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[d6efff6c-cd6b-4723-9382-4955d87c78db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.097 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfc6638-6ed4-47f8-b33f-3b4b15237167]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 NetworkManager[55210]: <info>  [1764399802.1243] device (tap89b1e105-b0): carrier: link connected
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.129 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[63b0970b-a819-484c-9027-41722ac50ec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.160 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[96c8fcf5-2f39-49eb-a288-29e36d290bb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89b1e105-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:c6:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526500, 'reachable_time': 25764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223638, 'error': None, 'target': 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.184 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[56446467-74f2-4504-8f5b-2f15e64fead2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe23:c6c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526500, 'tstamp': 526500}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223639, 'error': None, 'target': 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.206 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9597f95a-578e-41b8-86dc-b118ad9fcfa7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89b1e105-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:c6:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526500, 'reachable_time': 25764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223640, 'error': None, 'target': 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.244 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[398f0f7e-d9b0-4f17-936e-75ce595db14d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.311 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[be2d6427-4103-4db2-a328-5df24f8e75d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.313 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89b1e105-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.314 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.314 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89b1e105-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:22 np0005539504 nova_compute[187152]: 2025-11-29 07:03:22.317 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:22 np0005539504 NetworkManager[55210]: <info>  [1764399802.3177] manager: (tap89b1e105-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Nov 29 02:03:22 np0005539504 kernel: tap89b1e105-b0: entered promiscuous mode
Nov 29 02:03:22 np0005539504 nova_compute[187152]: 2025-11-29 07:03:22.319 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.325 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89b1e105-b0, col_values=(('external_ids', {'iface-id': '00b5991a-a160-4f28-9e01-9845268b548e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:22 np0005539504 nova_compute[187152]: 2025-11-29 07:03:22.327 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:22 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:22Z|00205|binding|INFO|Releasing lport 00b5991a-a160-4f28-9e01-9845268b548e from this chassis (sb_readonly=0)
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.331 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89b1e105-b8a0-4492-b6a7-459a78b991cf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89b1e105-b8a0-4492-b6a7-459a78b991cf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.336 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[26b8a253-d8af-4a76-a6d7-cc6d8ce63502]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.337 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-89b1e105-b8a0-4492-b6a7-459a78b991cf
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/89b1e105-b8a0-4492-b6a7-459a78b991cf.pid.haproxy
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 89b1e105-b8a0-4492-b6a7-459a78b991cf
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:03:22 np0005539504 nova_compute[187152]: 2025-11-29 07:03:22.341 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.340 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'env', 'PROCESS_TAG=haproxy-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89b1e105-b8a0-4492-b6a7-459a78b991cf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:03:22 np0005539504 podman[223671]: 2025-11-29 07:03:22.788063729 +0000 UTC m=+0.065364267 container create 12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:03:22 np0005539504 systemd[1]: Started libpod-conmon-12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d.scope.
Nov 29 02:03:22 np0005539504 nova_compute[187152]: 2025-11-29 07:03:22.824 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399802.8233175, de7b880e-4675-4881-a025-e6663ea2477f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:22 np0005539504 nova_compute[187152]: 2025-11-29 07:03:22.825 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] VM Started (Lifecycle Event)#033[00m
Nov 29 02:03:22 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:03:22 np0005539504 podman[223671]: 2025-11-29 07:03:22.757829402 +0000 UTC m=+0.035129960 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:03:22 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42505bfb5b005d050bee5813d4cea4ed5e87c365abf45fa68ffee848c8d04d7f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:03:22 np0005539504 podman[223671]: 2025-11-29 07:03:22.86397668 +0000 UTC m=+0.141277238 container init 12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 02:03:22 np0005539504 podman[223671]: 2025-11-29 07:03:22.86879506 +0000 UTC m=+0.146095598 container start 12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:03:22 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223693]: [NOTICE]   (223697) : New worker (223699) forked
Nov 29 02:03:22 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223693]: [NOTICE]   (223697) : Loading success.
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.917 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.918 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:22.919 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:23 np0005539504 nova_compute[187152]: 2025-11-29 07:03:23.438 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:23 np0005539504 nova_compute[187152]: 2025-11-29 07:03:23.444 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399802.8242123, de7b880e-4675-4881-a025-e6663ea2477f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:23 np0005539504 nova_compute[187152]: 2025-11-29 07:03:23.444 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:03:23 np0005539504 nova_compute[187152]: 2025-11-29 07:03:23.489 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:23 np0005539504 nova_compute[187152]: 2025-11-29 07:03:23.493 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:03:23 np0005539504 nova_compute[187152]: 2025-11-29 07:03:23.518 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:23 np0005539504 nova_compute[187152]: 2025-11-29 07:03:23.535 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:03:24 np0005539504 nova_compute[187152]: 2025-11-29 07:03:24.388 187156 DEBUG nova.network.neutron [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Updated VIF entry in instance network info cache for port d46aa91e-6bad-43f6-8140-e425e73f9e24. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:03:24 np0005539504 nova_compute[187152]: 2025-11-29 07:03:24.389 187156 DEBUG nova.network.neutron [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Updating instance_info_cache with network_info: [{"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:03:24 np0005539504 nova_compute[187152]: 2025-11-29 07:03:24.412 187156 DEBUG oslo_concurrency.lockutils [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-de7b880e-4675-4881-a025-e6663ea2477f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:03:24 np0005539504 nova_compute[187152]: 2025-11-29 07:03:24.764 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.528 187156 DEBUG nova.compute.manager [req-46b34ffe-6dd6-4d1a-aca8-005572d9b499 req-27ac3267-debb-4e76-a714-d9e02b068b9e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Received event network-vif-plugged-d46aa91e-6bad-43f6-8140-e425e73f9e24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.529 187156 DEBUG oslo_concurrency.lockutils [req-46b34ffe-6dd6-4d1a-aca8-005572d9b499 req-27ac3267-debb-4e76-a714-d9e02b068b9e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "de7b880e-4675-4881-a025-e6663ea2477f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.529 187156 DEBUG oslo_concurrency.lockutils [req-46b34ffe-6dd6-4d1a-aca8-005572d9b499 req-27ac3267-debb-4e76-a714-d9e02b068b9e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.529 187156 DEBUG oslo_concurrency.lockutils [req-46b34ffe-6dd6-4d1a-aca8-005572d9b499 req-27ac3267-debb-4e76-a714-d9e02b068b9e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.530 187156 DEBUG nova.compute.manager [req-46b34ffe-6dd6-4d1a-aca8-005572d9b499 req-27ac3267-debb-4e76-a714-d9e02b068b9e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Processing event network-vif-plugged-d46aa91e-6bad-43f6-8140-e425e73f9e24 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.530 187156 DEBUG nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.536 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399805.5354092, de7b880e-4675-4881-a025-e6663ea2477f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.536 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.539 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.544 187156 INFO nova.virt.libvirt.driver [-] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Instance spawned successfully.#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.545 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.560 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.567 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.578 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.579 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.579 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.580 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.580 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.580 187156 DEBUG nova.virt.libvirt.driver [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.587 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.660 187156 INFO nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Took 11.80 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.661 187156 DEBUG nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:25.722 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:25.723 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:03:25 np0005539504 podman[223708]: 2025-11-29 07:03:25.733608947 +0000 UTC m=+0.073752983 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.766 187156 INFO nova.compute.manager [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Took 12.42 seconds to build instance.#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.783 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:25 np0005539504 nova_compute[187152]: 2025-11-29 07:03:25.796 187156 DEBUG oslo_concurrency.lockutils [None req-9a62c3e5-ec54-43fb-8723-c568fc4dee06 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.660 187156 DEBUG oslo_concurrency.lockutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "de7b880e-4675-4881-a025-e6663ea2477f" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.661 187156 DEBUG oslo_concurrency.lockutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.661 187156 INFO nova.compute.manager [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Rebooting instance#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.683 187156 DEBUG oslo_concurrency.lockutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "refresh_cache-de7b880e-4675-4881-a025-e6663ea2477f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.684 187156 DEBUG oslo_concurrency.lockutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquired lock "refresh_cache-de7b880e-4675-4881-a025-e6663ea2477f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.684 187156 DEBUG nova.network.neutron [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.707 187156 DEBUG nova.compute.manager [req-2bc3e491-7c85-467b-8b30-4e4ec0d9b9ea req-985bf219-a99b-45ce-8792-f42d3d60f660 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Received event network-vif-plugged-d46aa91e-6bad-43f6-8140-e425e73f9e24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.708 187156 DEBUG oslo_concurrency.lockutils [req-2bc3e491-7c85-467b-8b30-4e4ec0d9b9ea req-985bf219-a99b-45ce-8792-f42d3d60f660 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "de7b880e-4675-4881-a025-e6663ea2477f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.709 187156 DEBUG oslo_concurrency.lockutils [req-2bc3e491-7c85-467b-8b30-4e4ec0d9b9ea req-985bf219-a99b-45ce-8792-f42d3d60f660 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.709 187156 DEBUG oslo_concurrency.lockutils [req-2bc3e491-7c85-467b-8b30-4e4ec0d9b9ea req-985bf219-a99b-45ce-8792-f42d3d60f660 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.709 187156 DEBUG nova.compute.manager [req-2bc3e491-7c85-467b-8b30-4e4ec0d9b9ea req-985bf219-a99b-45ce-8792-f42d3d60f660 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] No waiting events found dispatching network-vif-plugged-d46aa91e-6bad-43f6-8140-e425e73f9e24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:27 np0005539504 nova_compute[187152]: 2025-11-29 07:03:27.710 187156 WARNING nova.compute.manager [req-2bc3e491-7c85-467b-8b30-4e4ec0d9b9ea req-985bf219-a99b-45ce-8792-f42d3d60f660 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Received unexpected event network-vif-plugged-d46aa91e-6bad-43f6-8140-e425e73f9e24 for instance with vm_state active and task_state rebooting_hard.#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.369 187156 DEBUG oslo_concurrency.lockutils [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.370 187156 DEBUG oslo_concurrency.lockutils [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.370 187156 DEBUG oslo_concurrency.lockutils [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.371 187156 DEBUG oslo_concurrency.lockutils [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.371 187156 DEBUG oslo_concurrency.lockutils [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.389 187156 INFO nova.compute.manager [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Terminating instance#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.404 187156 DEBUG nova.compute.manager [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:03:28 np0005539504 kernel: tap8b71ee8e-ab (unregistering): left promiscuous mode
Nov 29 02:03:28 np0005539504 NetworkManager[55210]: <info>  [1764399808.4330] device (tap8b71ee8e-ab): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00206|binding|INFO|Releasing lport 8b71ee8e-ab95-47c7-a203-015aac168d4f from this chassis (sb_readonly=0)
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00207|binding|INFO|Setting lport 8b71ee8e-ab95-47c7-a203-015aac168d4f down in Southbound
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00208|binding|INFO|Removing iface tap8b71ee8e-ab ovn-installed in OVS
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.442 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.449 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:74:16 10.100.0.6'], port_security=['fa:16:3e:47:74:16 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8f92b94f-11a8-44de-b605-397f29484586', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=8b71ee8e-ab95-47c7-a203-015aac168d4f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.451 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 8b71ee8e-ab95-47c7-a203-015aac168d4f in datapath b97f3d85-11c0-4475-aea6-e8da158df42a unbound from our chassis#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.454 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.456 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.472 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[42ef2ee0-2753-467a-b5bd-b5f60a0f4940]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Nov 29 02:03:28 np0005539504 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003a.scope: Consumed 18.794s CPU time.
Nov 29 02:03:28 np0005539504 systemd-machined[153423]: Machine qemu-29-instance-0000003a terminated.
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.508 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2174ca45-8aca-4dc3-8972-26338044eb55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.512 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[52899283-d0ac-48b1-a162-ca2d89c1ffa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.519 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.549 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0a98fb2a-cfd5-4606-8c44-b62690bb4a34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.566 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[415f6a82-577a-45e1-8e44-102c4bfc6f7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512435, 'reachable_time': 30864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223742, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.584 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[afbbc5f1-35e5-451e-96c0-b634263859f7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512447, 'tstamp': 512447}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223743, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512450, 'tstamp': 512450}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223743, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.587 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.590 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.594 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.595 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.595 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.596 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.596 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:28 np0005539504 kernel: tap8b71ee8e-ab: entered promiscuous mode
Nov 29 02:03:28 np0005539504 NetworkManager[55210]: <info>  [1764399808.6269] manager: (tap8b71ee8e-ab): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Nov 29 02:03:28 np0005539504 kernel: tap8b71ee8e-ab (unregistering): left promiscuous mode
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.630 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00209|binding|INFO|Claiming lport 8b71ee8e-ab95-47c7-a203-015aac168d4f for this chassis.
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00210|binding|INFO|8b71ee8e-ab95-47c7-a203-015aac168d4f: Claiming fa:16:3e:47:74:16 10.100.0.6
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.642 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:74:16 10.100.0.6'], port_security=['fa:16:3e:47:74:16 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8f92b94f-11a8-44de-b605-397f29484586', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=8b71ee8e-ab95-47c7-a203-015aac168d4f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.644 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 8b71ee8e-ab95-47c7-a203-015aac168d4f in datapath b97f3d85-11c0-4475-aea6-e8da158df42a bound to our chassis#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.646 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00211|binding|INFO|Setting lport 8b71ee8e-ab95-47c7-a203-015aac168d4f ovn-installed in OVS
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00212|binding|INFO|Setting lport 8b71ee8e-ab95-47c7-a203-015aac168d4f up in Southbound
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00213|binding|INFO|Releasing lport 8b71ee8e-ab95-47c7-a203-015aac168d4f from this chassis (sb_readonly=1)
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00214|if_status|INFO|Not setting lport 8b71ee8e-ab95-47c7-a203-015aac168d4f down as sb is readonly
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00215|binding|INFO|Removing iface tap8b71ee8e-ab ovn-installed in OVS
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.650 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00216|binding|INFO|Releasing lport 8b71ee8e-ab95-47c7-a203-015aac168d4f from this chassis (sb_readonly=0)
Nov 29 02:03:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:28Z|00217|binding|INFO|Setting lport 8b71ee8e-ab95-47c7-a203-015aac168d4f down in Southbound
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.664 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:74:16 10.100.0.6'], port_security=['fa:16:3e:47:74:16 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8f92b94f-11a8-44de-b605-397f29484586', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=8b71ee8e-ab95-47c7-a203-015aac168d4f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.667 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.672 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[16b27d7f-185f-400f-a3fb-7e4fcd56f230]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.689 187156 INFO nova.virt.libvirt.driver [-] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Instance destroyed successfully.#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.690 187156 DEBUG nova.objects.instance [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'resources' on Instance uuid 8f92b94f-11a8-44de-b605-397f29484586 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.709 187156 DEBUG nova.virt.libvirt.vif [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1913994908',display_name='tempest-ServersAdminTestJSON-server-1913994908',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1913994908',id=58,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:01:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-nnaudwbx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:01:42Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=8f92b94f-11a8-44de-b605-397f29484586,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.710 187156 DEBUG nova.network.os_vif_util [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "address": "fa:16:3e:47:74:16", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b71ee8e-ab", "ovs_interfaceid": "8b71ee8e-ab95-47c7-a203-015aac168d4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.711 187156 DEBUG nova.network.os_vif_util [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:47:74:16,bridge_name='br-int',has_traffic_filtering=True,id=8b71ee8e-ab95-47c7-a203-015aac168d4f,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b71ee8e-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.711 187156 DEBUG os_vif [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:74:16,bridge_name='br-int',has_traffic_filtering=True,id=8b71ee8e-ab95-47c7-a203-015aac168d4f,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b71ee8e-ab') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.713 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.715 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b71ee8e-ab, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.714 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[862d48bc-ebbe-4b2e-a62a-fce4f681617d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.717 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.718 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.719 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0783386a-4a43-45b1-b582-c719c1f25ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.721 187156 INFO os_vif [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:74:16,bridge_name='br-int',has_traffic_filtering=True,id=8b71ee8e-ab95-47c7-a203-015aac168d4f,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b71ee8e-ab')#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.722 187156 INFO nova.virt.libvirt.driver [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Deleting instance files /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586_del#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.723 187156 INFO nova.virt.libvirt.driver [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Deletion of /var/lib/nova/instances/8f92b94f-11a8-44de-b605-397f29484586_del complete#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.754 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0f90c09c-75dd-415e-8de4-353d7c2fe0fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.771 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8d308768-9183-41d8-a281-a508d00673de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512435, 'reachable_time': 30864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223764, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.776 187156 DEBUG nova.compute.manager [req-1889a825-6664-429a-a780-f3af0281a9da req-0f4ce57b-5a84-46da-ad5b-36d122bba377 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-unplugged-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.777 187156 DEBUG oslo_concurrency.lockutils [req-1889a825-6664-429a-a780-f3af0281a9da req-0f4ce57b-5a84-46da-ad5b-36d122bba377 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.777 187156 DEBUG oslo_concurrency.lockutils [req-1889a825-6664-429a-a780-f3af0281a9da req-0f4ce57b-5a84-46da-ad5b-36d122bba377 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.777 187156 DEBUG oslo_concurrency.lockutils [req-1889a825-6664-429a-a780-f3af0281a9da req-0f4ce57b-5a84-46da-ad5b-36d122bba377 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.777 187156 DEBUG nova.compute.manager [req-1889a825-6664-429a-a780-f3af0281a9da req-0f4ce57b-5a84-46da-ad5b-36d122bba377 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] No waiting events found dispatching network-vif-unplugged-8b71ee8e-ab95-47c7-a203-015aac168d4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.778 187156 DEBUG nova.compute.manager [req-1889a825-6664-429a-a780-f3af0281a9da req-0f4ce57b-5a84-46da-ad5b-36d122bba377 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-unplugged-8b71ee8e-ab95-47c7-a203-015aac168d4f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.790 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4fe536-3cad-4c16-9530-1ac06f2a0d75]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512447, 'tstamp': 512447}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223765, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512450, 'tstamp': 512450}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223765, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.792 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.794 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.795 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.796 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.797 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.797 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.799 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 8b71ee8e-ab95-47c7-a203-015aac168d4f in datapath b97f3d85-11c0-4475-aea6-e8da158df42a unbound from our chassis#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.801 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.820 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1f10fc01-5e1a-40f0-b745-fbbc20baec1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.856 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ece654da-236c-4c05-a4f8-b681b4456ef9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.862 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[668498d8-72be-43e8-ab10-b4196e09913c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.897 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ff202d42-bee3-4f55-827c-0169df97d841]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.915 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dc10facf-808c-4aed-93ee-9715079fe54a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512435, 'reachable_time': 30864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223771, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.934 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[decbc6b3-d8ac-4cc0-bbd7-061bb8b32d34]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512447, 'tstamp': 512447}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223772, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 512450, 'tstamp': 512450}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223772, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.937 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.939 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.941 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.942 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.943 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.943 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:28.944 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.991 187156 INFO nova.compute.manager [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.991 187156 DEBUG oslo.service.loopingcall [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.992 187156 DEBUG nova.compute.manager [-] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:03:28 np0005539504 nova_compute[187152]: 2025-11-29 07:03:28.992 187156 DEBUG nova.network.neutron [-] [instance: 8f92b94f-11a8-44de-b605-397f29484586] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:03:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:29.724 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:29 np0005539504 nova_compute[187152]: 2025-11-29 07:03:29.766 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.631 187156 DEBUG nova.network.neutron [-] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.655 187156 INFO nova.compute.manager [-] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Took 1.66 seconds to deallocate network for instance.#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.709 187156 DEBUG nova.compute.manager [req-26201590-1a07-489c-9ad0-51097f5d9fd4 req-842e424f-474c-446a-8c8d-116957bf1dd0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-deleted-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.746 187156 DEBUG oslo_concurrency.lockutils [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.748 187156 DEBUG oslo_concurrency.lockutils [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.782 187156 DEBUG nova.scheduler.client.report [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.802 187156 DEBUG nova.scheduler.client.report [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.803 187156 DEBUG nova.compute.provider_tree [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.820 187156 DEBUG nova.scheduler.client.report [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.858 187156 DEBUG nova.scheduler.client.report [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.898 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.899 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.900 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.900 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.900 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] No waiting events found dispatching network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.901 187156 WARNING nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received unexpected event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.901 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.901 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.902 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.902 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.902 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] No waiting events found dispatching network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.903 187156 WARNING nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received unexpected event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.903 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.903 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.904 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.904 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.904 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] No waiting events found dispatching network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.905 187156 WARNING nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received unexpected event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.905 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-unplugged-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.905 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.906 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.906 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.906 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] No waiting events found dispatching network-vif-unplugged-8b71ee8e-ab95-47c7-a203-015aac168d4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.907 187156 WARNING nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received unexpected event network-vif-unplugged-8b71ee8e-ab95-47c7-a203-015aac168d4f for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.907 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.907 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8f92b94f-11a8-44de-b605-397f29484586-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.908 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.908 187156 DEBUG oslo_concurrency.lockutils [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.908 187156 DEBUG nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] No waiting events found dispatching network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.909 187156 WARNING nova.compute.manager [req-29f93eae-9b36-499d-86ff-834570ec1ea2 req-d5e6af04-90d1-42c7-9659-76f42ec4d3c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Received unexpected event network-vif-plugged-8b71ee8e-ab95-47c7-a203-015aac168d4f for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.910 187156 DEBUG nova.network.neutron [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Updating instance_info_cache with network_info: [{"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.957 187156 DEBUG oslo_concurrency.lockutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Releasing lock "refresh_cache-de7b880e-4675-4881-a025-e6663ea2477f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.973 187156 DEBUG nova.compute.provider_tree [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.979 187156 DEBUG nova.compute.manager [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:30 np0005539504 nova_compute[187152]: 2025-11-29 07:03:30.993 187156 DEBUG nova.scheduler.client.report [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.016 187156 DEBUG oslo_concurrency.lockutils [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.044 187156 INFO nova.scheduler.client.report [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Deleted allocations for instance 8f92b94f-11a8-44de-b605-397f29484586#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.152 187156 DEBUG oslo_concurrency.lockutils [None req-17f5e79e-a615-4e40-9aae-c879720b671a cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "8f92b94f-11a8-44de-b605-397f29484586" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:31 np0005539504 kernel: tapd46aa91e-6b (unregistering): left promiscuous mode
Nov 29 02:03:31 np0005539504 NetworkManager[55210]: <info>  [1764399811.1820] device (tapd46aa91e-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.189 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:31Z|00218|binding|INFO|Releasing lport d46aa91e-6bad-43f6-8140-e425e73f9e24 from this chassis (sb_readonly=0)
Nov 29 02:03:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:31Z|00219|binding|INFO|Setting lport d46aa91e-6bad-43f6-8140-e425e73f9e24 down in Southbound
Nov 29 02:03:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:31Z|00220|binding|INFO|Removing iface tapd46aa91e-6b ovn-installed in OVS
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.192 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.200 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:44:f1 10.100.0.4'], port_security=['fa:16:3e:7b:44:f1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'de7b880e-4675-4881-a025-e6663ea2477f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8dcf86bfd19147f7bdf78ae3ae8da3dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b84a64cd-1934-4e00-978d-310080bf5c18', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be9674f6-9c2a-4960-9c2d-8409ecc561c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=d46aa91e-6bad-43f6-8140-e425e73f9e24) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.201 104164 INFO neutron.agent.ovn.metadata.agent [-] Port d46aa91e-6bad-43f6-8140-e425e73f9e24 in datapath 89b1e105-b8a0-4492-b6a7-459a78b991cf unbound from our chassis#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.202 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89b1e105-b8a0-4492-b6a7-459a78b991cf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.203 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d37b69c2-0b4c-4a8a-b5e2-0e18730bc32e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.204 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf namespace which is not needed anymore#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.204 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Nov 29 02:03:31 np0005539504 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000003e.scope: Consumed 6.558s CPU time.
Nov 29 02:03:31 np0005539504 systemd-machined[153423]: Machine qemu-32-instance-0000003e terminated.
Nov 29 02:03:31 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223693]: [NOTICE]   (223697) : haproxy version is 2.8.14-c23fe91
Nov 29 02:03:31 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223693]: [NOTICE]   (223697) : path to executable is /usr/sbin/haproxy
Nov 29 02:03:31 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223693]: [WARNING]  (223697) : Exiting Master process...
Nov 29 02:03:31 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223693]: [ALERT]    (223697) : Current worker (223699) exited with code 143 (Terminated)
Nov 29 02:03:31 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223693]: [WARNING]  (223697) : All workers exited. Exiting... (0)
Nov 29 02:03:31 np0005539504 systemd[1]: libpod-12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d.scope: Deactivated successfully.
Nov 29 02:03:31 np0005539504 podman[223794]: 2025-11-29 07:03:31.34506083 +0000 UTC m=+0.046906358 container died 12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:03:31 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d-userdata-shm.mount: Deactivated successfully.
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.380 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 systemd[1]: var-lib-containers-storage-overlay-42505bfb5b005d050bee5813d4cea4ed5e87c365abf45fa68ffee848c8d04d7f-merged.mount: Deactivated successfully.
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.387 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 podman[223794]: 2025-11-29 07:03:31.397083125 +0000 UTC m=+0.098928633 container cleanup 12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:03:31 np0005539504 systemd[1]: libpod-conmon-12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d.scope: Deactivated successfully.
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.419 187156 INFO nova.virt.libvirt.driver [-] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Instance destroyed successfully.#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.420 187156 DEBUG nova.objects.instance [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lazy-loading 'resources' on Instance uuid de7b880e-4675-4881-a025-e6663ea2477f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.438 187156 DEBUG nova.virt.libvirt.vif [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1476174431',display_name='tempest-InstanceActionsTestJSON-server-1476174431',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1476174431',id=62,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8dcf86bfd19147f7bdf78ae3ae8da3dc',ramdisk_id='',reservation_id='r-0z0sylhn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-2017952800',owner_user_name='tempest-InstanceActionsTestJSON-2017952800-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:03:31Z,user_data=None,user_id='e9e3e9e61ce2488b9054c4600ce9414e',uuid=de7b880e-4675-4881-a025-e6663ea2477f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.439 187156 DEBUG nova.network.os_vif_util [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converting VIF {"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.439 187156 DEBUG nova.network.os_vif_util [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.440 187156 DEBUG os_vif [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.441 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.441 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd46aa91e-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.445 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.448 187156 INFO os_vif [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b')#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.453 187156 DEBUG nova.virt.libvirt.driver [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Start _get_guest_xml network_info=[{"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.458 187156 WARNING nova.virt.libvirt.driver [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:03:31 np0005539504 podman[223834]: 2025-11-29 07:03:31.461944107 +0000 UTC m=+0.043193507 container remove 12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.462 187156 DEBUG nova.virt.libvirt.host [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.463 187156 DEBUG nova.virt.libvirt.host [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.466 187156 DEBUG nova.virt.libvirt.host [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.467 187156 DEBUG nova.virt.libvirt.host [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.466 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d18bb01e-4bda-404f-bd44-225d41466933]: (4, ('Sat Nov 29 07:03:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf (12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d)\n12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d\nSat Nov 29 07:03:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf (12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d)\n12ced2c76539be20b049093046cf30949a4ad1a3558d9b3d5ed72de36d90c13d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.468 187156 DEBUG nova.virt.libvirt.driver [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.468 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.468 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.469 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.469 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.469 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7743fb83-fd53-475a-8c3a-e83c3106a118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.469 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.469 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.470 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.470 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89b1e105-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.470 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.470 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.470 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.471 187156 DEBUG nova.virt.hardware [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.471 187156 DEBUG nova.objects.instance [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lazy-loading 'vcpu_model' on Instance uuid de7b880e-4675-4881-a025-e6663ea2477f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:31 np0005539504 kernel: tap89b1e105-b0: left promiscuous mode
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.473 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.483 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.486 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4df6226a-f383-4d1c-831b-70b8c961f52f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.490 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.507 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f7c5d4d1-693e-4e8f-8be0-25449a0abd4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.509 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a03ad5ec-5662-4d9f-adf1-b2614a7e1d92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.525 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c7656b18-5049-4ff0-b827-5489017ab348]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526492, 'reachable_time': 28237, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223847, 'error': None, 'target': 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 systemd[1]: run-netns-ovnmeta\x2d89b1e105\x2db8a0\x2d4492\x2db6a7\x2d459a78b991cf.mount: Deactivated successfully.
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.532 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.532 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[5d4432ff-340f-44e1-b29a-f9eb1cc4310c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.556 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.config --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.557 187156 DEBUG oslo_concurrency.lockutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.557 187156 DEBUG oslo_concurrency.lockutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.558 187156 DEBUG oslo_concurrency.lockutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.559 187156 DEBUG nova.virt.libvirt.vif [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1476174431',display_name='tempest-InstanceActionsTestJSON-server-1476174431',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1476174431',id=62,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8dcf86bfd19147f7bdf78ae3ae8da3dc',ramdisk_id='',reservation_id='r-0z0sylhn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-2017952800',owner_user_name='tempest-InstanceActionsTestJSON-2017952800-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:03:31Z,user_data=None,user_id='e9e3e9e61ce2488b9054c4600ce9414e',uuid=de7b880e-4675-4881-a025-e6663ea2477f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.560 187156 DEBUG nova.network.os_vif_util [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converting VIF {"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.561 187156 DEBUG nova.network.os_vif_util [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.562 187156 DEBUG nova.objects.instance [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lazy-loading 'pci_devices' on Instance uuid de7b880e-4675-4881-a025-e6663ea2477f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.581 187156 DEBUG nova.virt.libvirt.driver [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <uuid>de7b880e-4675-4881-a025-e6663ea2477f</uuid>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <name>instance-0000003e</name>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <nova:name>tempest-InstanceActionsTestJSON-server-1476174431</nova:name>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:03:31</nova:creationTime>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:        <nova:user uuid="e9e3e9e61ce2488b9054c4600ce9414e">tempest-InstanceActionsTestJSON-2017952800-project-member</nova:user>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:        <nova:project uuid="8dcf86bfd19147f7bdf78ae3ae8da3dc">tempest-InstanceActionsTestJSON-2017952800</nova:project>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:        <nova:port uuid="d46aa91e-6bad-43f6-8140-e425e73f9e24">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <entry name="serial">de7b880e-4675-4881-a025-e6663ea2477f</entry>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <entry name="uuid">de7b880e-4675-4881-a025-e6663ea2477f</entry>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk.config"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:7b:44:f1"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <target dev="tapd46aa91e-6b"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/console.log" append="off"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:03:31 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:03:31 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:03:31 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:03:31 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.583 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.640 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.642 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.698 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.701 187156 DEBUG nova.objects.instance [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lazy-loading 'trusted_certs' on Instance uuid de7b880e-4675-4881-a025-e6663ea2477f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.719 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.776 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.777 187156 DEBUG nova.virt.disk.api [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Checking if we can resize image /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.777 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.829 187156 DEBUG oslo_concurrency.processutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.830 187156 DEBUG nova.virt.disk.api [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Cannot resize image /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.830 187156 DEBUG nova.objects.instance [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lazy-loading 'migration_context' on Instance uuid de7b880e-4675-4881-a025-e6663ea2477f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.849 187156 DEBUG nova.virt.libvirt.vif [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1476174431',display_name='tempest-InstanceActionsTestJSON-server-1476174431',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1476174431',id=62,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='8dcf86bfd19147f7bdf78ae3ae8da3dc',ramdisk_id='',reservation_id='r-0z0sylhn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-2017952800',owner_user_name='tempest-InstanceActionsTestJSON-2017952800-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:03:31Z,user_data=None,user_id='e9e3e9e61ce2488b9054c4600ce9414e',uuid=de7b880e-4675-4881-a025-e6663ea2477f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.850 187156 DEBUG nova.network.os_vif_util [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converting VIF {"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.851 187156 DEBUG nova.network.os_vif_util [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.851 187156 DEBUG os_vif [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.852 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.852 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.853 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.855 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.855 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd46aa91e-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.856 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd46aa91e-6b, col_values=(('external_ids', {'iface-id': 'd46aa91e-6bad-43f6-8140-e425e73f9e24', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:44:f1', 'vm-uuid': 'de7b880e-4675-4881-a025-e6663ea2477f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.857 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 NetworkManager[55210]: <info>  [1764399811.8585] manager: (tapd46aa91e-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.864 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.864 187156 INFO os_vif [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b')#033[00m
Nov 29 02:03:31 np0005539504 kernel: tapd46aa91e-6b: entered promiscuous mode
Nov 29 02:03:31 np0005539504 NetworkManager[55210]: <info>  [1764399811.9331] manager: (tapd46aa91e-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.932 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:31Z|00221|binding|INFO|Claiming lport d46aa91e-6bad-43f6-8140-e425e73f9e24 for this chassis.
Nov 29 02:03:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:31Z|00222|binding|INFO|d46aa91e-6bad-43f6-8140-e425e73f9e24: Claiming fa:16:3e:7b:44:f1 10.100.0.4
Nov 29 02:03:31 np0005539504 systemd-udevd[223849]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:03:31 np0005539504 NetworkManager[55210]: <info>  [1764399811.9442] device (tapd46aa91e-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:03:31 np0005539504 NetworkManager[55210]: <info>  [1764399811.9453] device (tapd46aa91e-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.946 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.947 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:44:f1 10.100.0.4'], port_security=['fa:16:3e:7b:44:f1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'de7b880e-4675-4881-a025-e6663ea2477f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8dcf86bfd19147f7bdf78ae3ae8da3dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b84a64cd-1934-4e00-978d-310080bf5c18', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be9674f6-9c2a-4960-9c2d-8409ecc561c9, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=d46aa91e-6bad-43f6-8140-e425e73f9e24) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:31Z|00223|binding|INFO|Setting lport d46aa91e-6bad-43f6-8140-e425e73f9e24 ovn-installed in OVS
Nov 29 02:03:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:31Z|00224|binding|INFO|Setting lport d46aa91e-6bad-43f6-8140-e425e73f9e24 up in Southbound
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.948 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.948 104164 INFO neutron.agent.ovn.metadata.agent [-] Port d46aa91e-6bad-43f6-8140-e425e73f9e24 in datapath 89b1e105-b8a0-4492-b6a7-459a78b991cf bound to our chassis#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.949 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.950 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89b1e105-b8a0-4492-b6a7-459a78b991cf#033[00m
Nov 29 02:03:31 np0005539504 nova_compute[187152]: 2025-11-29 07:03:31.951 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.961 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fc970166-51a1-4a69-b39d-b1daabab508d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.962 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89b1e105-b1 in ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.965 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89b1e105-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.965 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5f8dc3df-cf16-4520-aae8-8ddf7c326a09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.966 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[654a39ca-b360-47d5-a601-d3d484691a5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 systemd-machined[153423]: New machine qemu-33-instance-0000003e.
Nov 29 02:03:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:31.978 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[c36f8880-54ad-4cee-8130-83d3b641dd6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:31 np0005539504 systemd[1]: Started Virtual Machine qemu-33-instance-0000003e.
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.001 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[97501047-002e-4249-9b68-8998844c8968]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.027 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5fa345ea-cefd-46e2-943e-93e94630e700]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.034 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[41f0bd25-64fa-45d0-97a6-e6d8134fa5e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 NetworkManager[55210]: <info>  [1764399812.0357] manager: (tap89b1e105-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.072 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[6b97d81b-f140-46d1-ab18-e44d0fc745f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.075 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a13f00-9a4c-4555-8962-7d786d9c4b34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 NetworkManager[55210]: <info>  [1764399812.0981] device (tap89b1e105-b0): carrier: link connected
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.103 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[cf2b46bf-9674-4a0c-a8ba-b2bf7019f9b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.120 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[33f1dba1-b68c-4b94-a3ba-8f912e77989c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89b1e105-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:c6:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527497, 'reachable_time': 42507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223913, 'error': None, 'target': 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.135 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[38491069-47c5-4dbb-9684-35db7e154732]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe23:c6c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527497, 'tstamp': 527497}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223914, 'error': None, 'target': 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.158 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ae97ce-7579-4b96-8a75-023f3dc3f26b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89b1e105-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:c6:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527497, 'reachable_time': 42507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223915, 'error': None, 'target': 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.192 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[913b1e72-7d52-4422-a3af-a8e99d46294b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.252 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb30cd3-d5ab-4f86-a2d9-f539258246ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 kernel: tap89b1e105-b0: entered promiscuous mode
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.254 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89b1e105-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.254 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.255 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89b1e105-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.257 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539504 NetworkManager[55210]: <info>  [1764399812.2579] manager: (tap89b1e105-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.259 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89b1e105-b0, col_values=(('external_ids', {'iface-id': '00b5991a-a160-4f28-9e01-9845268b548e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:32Z|00225|binding|INFO|Releasing lport 00b5991a-a160-4f28-9e01-9845268b548e from this chassis (sb_readonly=0)
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.261 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.272 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.273 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89b1e105-b8a0-4492-b6a7-459a78b991cf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89b1e105-b8a0-4492-b6a7-459a78b991cf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.274 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[71101908-9397-4162-a67c-9d56dbb4ad40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.275 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-89b1e105-b8a0-4492-b6a7-459a78b991cf
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/89b1e105-b8a0-4492-b6a7-459a78b991cf.pid.haproxy
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 89b1e105-b8a0-4492-b6a7-459a78b991cf
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:03:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:32.275 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'env', 'PROCESS_TAG=haproxy-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89b1e105-b8a0-4492-b6a7-459a78b991cf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.318 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for de7b880e-4675-4881-a025-e6663ea2477f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.318 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399812.3175187, de7b880e-4675-4881-a025-e6663ea2477f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.318 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.322 187156 DEBUG nova.compute.manager [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.331 187156 INFO nova.virt.libvirt.driver [-] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Instance rebooted successfully.#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.332 187156 DEBUG nova.compute.manager [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.344 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.349 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.378 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.379 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399812.3218732, de7b880e-4675-4881-a025-e6663ea2477f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.379 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] VM Started (Lifecycle Event)#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.413 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.417 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:03:32 np0005539504 nova_compute[187152]: 2025-11-29 07:03:32.434 187156 DEBUG oslo_concurrency.lockutils [None req-8767b118-bab2-4bde-a026-2808a5be8597 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:32 np0005539504 podman[223954]: 2025-11-29 07:03:32.672106993 +0000 UTC m=+0.056967740 container create 34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:03:32 np0005539504 systemd[1]: Started libpod-conmon-34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5.scope.
Nov 29 02:03:32 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:03:32 np0005539504 podman[223954]: 2025-11-29 07:03:32.637654012 +0000 UTC m=+0.022514789 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:03:32 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5110ab070a94f1baea2451084c42086e396c2b8d2cb193fe7ab32a8021841734/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:03:32 np0005539504 podman[223954]: 2025-11-29 07:03:32.756038851 +0000 UTC m=+0.140899618 container init 34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:03:32 np0005539504 podman[223954]: 2025-11-29 07:03:32.761498798 +0000 UTC m=+0.146359545 container start 34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:03:32 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223969]: [NOTICE]   (223973) : New worker (223975) forked
Nov 29 02:03:32 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223969]: [NOTICE]   (223973) : Loading success.
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.176 187156 DEBUG oslo_concurrency.lockutils [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "de7b880e-4675-4881-a025-e6663ea2477f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.176 187156 DEBUG oslo_concurrency.lockutils [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.176 187156 DEBUG oslo_concurrency.lockutils [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "de7b880e-4675-4881-a025-e6663ea2477f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.177 187156 DEBUG oslo_concurrency.lockutils [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.177 187156 DEBUG oslo_concurrency.lockutils [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.187 187156 INFO nova.compute.manager [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Terminating instance#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.201 187156 DEBUG nova.compute.manager [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:03:34 np0005539504 kernel: tapd46aa91e-6b (unregistering): left promiscuous mode
Nov 29 02:03:34 np0005539504 NetworkManager[55210]: <info>  [1764399814.2225] device (tapd46aa91e-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:03:34 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:34Z|00226|binding|INFO|Releasing lport d46aa91e-6bad-43f6-8140-e425e73f9e24 from this chassis (sb_readonly=0)
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.231 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:34 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:34Z|00227|binding|INFO|Setting lport d46aa91e-6bad-43f6-8140-e425e73f9e24 down in Southbound
Nov 29 02:03:34 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:34Z|00228|binding|INFO|Removing iface tapd46aa91e-6b ovn-installed in OVS
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.234 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.241 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:44:f1 10.100.0.4'], port_security=['fa:16:3e:7b:44:f1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'de7b880e-4675-4881-a025-e6663ea2477f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8dcf86bfd19147f7bdf78ae3ae8da3dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b84a64cd-1934-4e00-978d-310080bf5c18', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be9674f6-9c2a-4960-9c2d-8409ecc561c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=d46aa91e-6bad-43f6-8140-e425e73f9e24) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.242 104164 INFO neutron.agent.ovn.metadata.agent [-] Port d46aa91e-6bad-43f6-8140-e425e73f9e24 in datapath 89b1e105-b8a0-4492-b6a7-459a78b991cf unbound from our chassis#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.244 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89b1e105-b8a0-4492-b6a7-459a78b991cf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.245 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.245 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2c86eb9a-4cd0-4be0-8e04-ea7cfbbcff26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.246 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf namespace which is not needed anymore#033[00m
Nov 29 02:03:34 np0005539504 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Nov 29 02:03:34 np0005539504 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000003e.scope: Consumed 2.266s CPU time.
Nov 29 02:03:34 np0005539504 systemd-machined[153423]: Machine qemu-33-instance-0000003e terminated.
Nov 29 02:03:34 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223969]: [NOTICE]   (223973) : haproxy version is 2.8.14-c23fe91
Nov 29 02:03:34 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223969]: [NOTICE]   (223973) : path to executable is /usr/sbin/haproxy
Nov 29 02:03:34 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223969]: [WARNING]  (223973) : Exiting Master process...
Nov 29 02:03:34 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223969]: [ALERT]    (223973) : Current worker (223975) exited with code 143 (Terminated)
Nov 29 02:03:34 np0005539504 neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf[223969]: [WARNING]  (223973) : All workers exited. Exiting... (0)
Nov 29 02:03:34 np0005539504 systemd[1]: libpod-34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5.scope: Deactivated successfully.
Nov 29 02:03:34 np0005539504 podman[224006]: 2025-11-29 07:03:34.389035899 +0000 UTC m=+0.050937247 container died 34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:03:34 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5-userdata-shm.mount: Deactivated successfully.
Nov 29 02:03:34 np0005539504 systemd[1]: var-lib-containers-storage-overlay-5110ab070a94f1baea2451084c42086e396c2b8d2cb193fe7ab32a8021841734-merged.mount: Deactivated successfully.
Nov 29 02:03:34 np0005539504 podman[224006]: 2025-11-29 07:03:34.431505226 +0000 UTC m=+0.093406574 container cleanup 34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 02:03:34 np0005539504 systemd[1]: libpod-conmon-34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5.scope: Deactivated successfully.
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.479 187156 INFO nova.virt.libvirt.driver [-] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Instance destroyed successfully.#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.480 187156 DEBUG nova.objects.instance [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lazy-loading 'resources' on Instance uuid de7b880e-4675-4881-a025-e6663ea2477f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.498 187156 DEBUG nova.virt.libvirt.vif [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1476174431',display_name='tempest-InstanceActionsTestJSON-server-1476174431',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1476174431',id=62,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8dcf86bfd19147f7bdf78ae3ae8da3dc',ramdisk_id='',reservation_id='r-0z0sylhn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-2017952800',owner_user_name='tempest-InstanceActionsTestJSON-2017952800-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:03:32Z,user_data=None,user_id='e9e3e9e61ce2488b9054c4600ce9414e',uuid=de7b880e-4675-4881-a025-e6663ea2477f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.499 187156 DEBUG nova.network.os_vif_util [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converting VIF {"id": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "address": "fa:16:3e:7b:44:f1", "network": {"id": "89b1e105-b8a0-4492-b6a7-459a78b991cf", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1232224496-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dcf86bfd19147f7bdf78ae3ae8da3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd46aa91e-6b", "ovs_interfaceid": "d46aa91e-6bad-43f6-8140-e425e73f9e24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.500 187156 DEBUG nova.network.os_vif_util [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.500 187156 DEBUG os_vif [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.502 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.503 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd46aa91e-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.504 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.505 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.507 187156 INFO os_vif [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:44:f1,bridge_name='br-int',has_traffic_filtering=True,id=d46aa91e-6bad-43f6-8140-e425e73f9e24,network=Network(89b1e105-b8a0-4492-b6a7-459a78b991cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd46aa91e-6b')#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.508 187156 INFO nova.virt.libvirt.driver [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Deleting instance files /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f_del#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.509 187156 INFO nova.virt.libvirt.driver [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Deletion of /var/lib/nova/instances/de7b880e-4675-4881-a025-e6663ea2477f_del complete#033[00m
Nov 29 02:03:34 np0005539504 podman[224046]: 2025-11-29 07:03:34.510126831 +0000 UTC m=+0.052126210 container remove 34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.515 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[08c17663-7f35-4d14-a743-4edc71f7db6b]: (4, ('Sat Nov 29 07:03:34 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf (34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5)\n34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5\nSat Nov 29 07:03:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf (34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5)\n34a74e9aa8c9fc50a137391aab62553e214f866854c2fcd29ec342273576d4e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.517 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6fcdc29c-3b82-49c6-88d9-b32eebcc6d58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.518 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89b1e105-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.520 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:34 np0005539504 kernel: tap89b1e105-b0: left promiscuous mode
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.532 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.536 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a3958905-f934-4189-9dcc-70862fef0a41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.549 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ca921c06-98d8-4763-af14-ae4a1be0f852]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.550 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c38dfa2f-bcb6-454d-8aa8-aacd18af213a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.570 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[148033d8-5cbc-451f-b92c-df61e15c5d6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527490, 'reachable_time': 33072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224065, 'error': None, 'target': 'ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:34 np0005539504 systemd[1]: run-netns-ovnmeta\x2d89b1e105\x2db8a0\x2d4492\x2db6a7\x2d459a78b991cf.mount: Deactivated successfully.
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.574 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89b1e105-b8a0-4492-b6a7-459a78b991cf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:03:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:34.575 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[89485825-7e4e-42b6-9a73-d3bfdaf46cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.606 187156 INFO nova.compute.manager [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.607 187156 DEBUG oslo.service.loopingcall [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.607 187156 DEBUG nova.compute.manager [-] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.607 187156 DEBUG nova.network.neutron [-] [instance: de7b880e-4675-4881-a025-e6663ea2477f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:03:34 np0005539504 nova_compute[187152]: 2025-11-29 07:03:34.769 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.017 187156 DEBUG oslo_concurrency.lockutils [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "63f8497a-eaf6-45ec-a251-92e7903aa297" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.018 187156 DEBUG oslo_concurrency.lockutils [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.018 187156 DEBUG oslo_concurrency.lockutils [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.018 187156 DEBUG oslo_concurrency.lockutils [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.018 187156 DEBUG oslo_concurrency.lockutils [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.041 187156 INFO nova.compute.manager [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Terminating instance#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.057 187156 DEBUG nova.compute.manager [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:03:35 np0005539504 kernel: tap6f4282c7-12 (unregistering): left promiscuous mode
Nov 29 02:03:35 np0005539504 NetworkManager[55210]: <info>  [1764399815.0830] device (tap6f4282c7-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.091 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:35Z|00229|binding|INFO|Releasing lport 6f4282c7-128e-4a36-ac72-16c3431d2be7 from this chassis (sb_readonly=0)
Nov 29 02:03:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:35Z|00230|binding|INFO|Setting lport 6f4282c7-128e-4a36-ac72-16c3431d2be7 down in Southbound
Nov 29 02:03:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:03:35Z|00231|binding|INFO|Removing iface tap6f4282c7-12 ovn-installed in OVS
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.103 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:5f:f9 10.100.0.5'], port_security=['fa:16:3e:6e:5f:f9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '63f8497a-eaf6-45ec-a251-92e7903aa297', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6f4282c7-128e-4a36-ac72-16c3431d2be7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.104 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6f4282c7-128e-4a36-ac72-16c3431d2be7 in datapath b97f3d85-11c0-4475-aea6-e8da158df42a unbound from our chassis#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.105 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97f3d85-11c0-4475-aea6-e8da158df42a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.106 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6b94c483-6f85-476b-afb9-500d6554b4a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.107 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a namespace which is not needed anymore#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.114 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000036.scope: Deactivated successfully.
Nov 29 02:03:35 np0005539504 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000036.scope: Consumed 19.201s CPU time.
Nov 29 02:03:35 np0005539504 systemd-machined[153423]: Machine qemu-28-instance-00000036 terminated.
Nov 29 02:03:35 np0005539504 podman[224073]: 2025-11-29 07:03:35.226937276 +0000 UTC m=+0.099513429 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:03:35 np0005539504 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222527]: [NOTICE]   (222531) : haproxy version is 2.8.14-c23fe91
Nov 29 02:03:35 np0005539504 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222527]: [NOTICE]   (222531) : path to executable is /usr/sbin/haproxy
Nov 29 02:03:35 np0005539504 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222527]: [WARNING]  (222531) : Exiting Master process...
Nov 29 02:03:35 np0005539504 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222527]: [WARNING]  (222531) : Exiting Master process...
Nov 29 02:03:35 np0005539504 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222527]: [ALERT]    (222531) : Current worker (222533) exited with code 143 (Terminated)
Nov 29 02:03:35 np0005539504 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222527]: [WARNING]  (222531) : All workers exited. Exiting... (0)
Nov 29 02:03:35 np0005539504 systemd[1]: libpod-82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f.scope: Deactivated successfully.
Nov 29 02:03:35 np0005539504 podman[224109]: 2025-11-29 07:03:35.273893325 +0000 UTC m=+0.053674561 container died 82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:03:35 np0005539504 NetworkManager[55210]: <info>  [1764399815.2913] manager: (tap6f4282c7-12): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.293 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.301 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f-userdata-shm.mount: Deactivated successfully.
Nov 29 02:03:35 np0005539504 systemd[1]: var-lib-containers-storage-overlay-3b9e7b4fd292f341fc731243a6fb28ddf112d3f5fde438d5c4753e5efe113521-merged.mount: Deactivated successfully.
Nov 29 02:03:35 np0005539504 podman[224109]: 2025-11-29 07:03:35.329758895 +0000 UTC m=+0.109540121 container cleanup 82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:03:35 np0005539504 systemd[1]: libpod-conmon-82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f.scope: Deactivated successfully.
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.348 187156 INFO nova.virt.libvirt.driver [-] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Instance destroyed successfully.#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.349 187156 DEBUG nova.objects.instance [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'resources' on Instance uuid 63f8497a-eaf6-45ec-a251-92e7903aa297 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.370 187156 DEBUG nova.virt.libvirt.vif [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:00:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-684406623',display_name='tempest-ServersAdminTestJSON-server-684406623',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-684406623',id=54,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:01:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-8t85hl1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:01:02Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=63f8497a-eaf6-45ec-a251-92e7903aa297,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.371 187156 DEBUG nova.network.os_vif_util [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "address": "fa:16:3e:6e:5f:f9", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f4282c7-12", "ovs_interfaceid": "6f4282c7-128e-4a36-ac72-16c3431d2be7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.372 187156 DEBUG nova.network.os_vif_util [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6e:5f:f9,bridge_name='br-int',has_traffic_filtering=True,id=6f4282c7-128e-4a36-ac72-16c3431d2be7,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f4282c7-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.372 187156 DEBUG os_vif [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:5f:f9,bridge_name='br-int',has_traffic_filtering=True,id=6f4282c7-128e-4a36-ac72-16c3431d2be7,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f4282c7-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.376 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.377 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f4282c7-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.379 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.381 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.385 187156 INFO os_vif [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:5f:f9,bridge_name='br-int',has_traffic_filtering=True,id=6f4282c7-128e-4a36-ac72-16c3431d2be7,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f4282c7-12')#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.386 187156 INFO nova.virt.libvirt.driver [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Deleting instance files /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297_del#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.387 187156 INFO nova.virt.libvirt.driver [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Deletion of /var/lib/nova/instances/63f8497a-eaf6-45ec-a251-92e7903aa297_del complete#033[00m
Nov 29 02:03:35 np0005539504 podman[224153]: 2025-11-29 07:03:35.413580149 +0000 UTC m=+0.055957102 container remove 82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.419 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c897dbcb-36d1-4aca-b7c7-8300434bb325]: (4, ('Sat Nov 29 07:03:35 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a (82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f)\n82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f\nSat Nov 29 07:03:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a (82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f)\n82b5f5b0db7421412f52b1852660c51e5cc7ee2493627e099ca5e7ad3e897e5f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.421 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fa5da249-e6df-4711-99cb-3f8dd9701168]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.422 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.424 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 kernel: tapb97f3d85-10: left promiscuous mode
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.437 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.440 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1ceb9f9a-0c3a-4229-9e63-2f916377ba5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.467 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ce107b86-3fe8-436e-93d7-58395146fda2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.469 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[186b8d00-3d05-4c9a-840b-5bece75f2b0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.469 187156 INFO nova.compute.manager [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.470 187156 DEBUG oslo.service.loopingcall [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.470 187156 DEBUG nova.compute.manager [-] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.471 187156 DEBUG nova.network.neutron [-] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.486 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[88b38a4f-0c16-4129-8696-431284448f44]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 512427, 'reachable_time': 28644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224167, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.489 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:03:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:03:35.489 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[95d0ac07-5559-44e9-9106-764836f63dac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:35 np0005539504 systemd[1]: run-netns-ovnmeta\x2db97f3d85\x2d11c0\x2d4475\x2daea6\x2de8da158df42a.mount: Deactivated successfully.
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.770 187156 DEBUG nova.network.neutron [-] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.801 187156 INFO nova.compute.manager [-] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Took 1.19 seconds to deallocate network for instance.#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.864 187156 DEBUG nova.compute.manager [req-754d1237-9c5a-4bd7-9ddd-f0922e950d75 req-f46b14e8-9e63-4544-a441-2ea45bd5b731 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Received event network-vif-deleted-d46aa91e-6bad-43f6-8140-e425e73f9e24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.925 187156 DEBUG oslo_concurrency.lockutils [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:35 np0005539504 nova_compute[187152]: 2025-11-29 07:03:35.926 187156 DEBUG oslo_concurrency.lockutils [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.041 187156 DEBUG nova.compute.provider_tree [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.067 187156 DEBUG nova.scheduler.client.report [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.090 187156 DEBUG nova.compute.manager [req-90c097d7-7948-4833-b9f5-cc692b450151 req-e8df3448-5016-4848-9fe1-2e7b630e0e75 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Received event network-vif-plugged-d46aa91e-6bad-43f6-8140-e425e73f9e24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.090 187156 DEBUG oslo_concurrency.lockutils [req-90c097d7-7948-4833-b9f5-cc692b450151 req-e8df3448-5016-4848-9fe1-2e7b630e0e75 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "de7b880e-4675-4881-a025-e6663ea2477f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.090 187156 DEBUG oslo_concurrency.lockutils [req-90c097d7-7948-4833-b9f5-cc692b450151 req-e8df3448-5016-4848-9fe1-2e7b630e0e75 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.091 187156 DEBUG oslo_concurrency.lockutils [req-90c097d7-7948-4833-b9f5-cc692b450151 req-e8df3448-5016-4848-9fe1-2e7b630e0e75 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.091 187156 DEBUG nova.compute.manager [req-90c097d7-7948-4833-b9f5-cc692b450151 req-e8df3448-5016-4848-9fe1-2e7b630e0e75 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] No waiting events found dispatching network-vif-plugged-d46aa91e-6bad-43f6-8140-e425e73f9e24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.091 187156 WARNING nova.compute.manager [req-90c097d7-7948-4833-b9f5-cc692b450151 req-e8df3448-5016-4848-9fe1-2e7b630e0e75 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Received unexpected event network-vif-plugged-d46aa91e-6bad-43f6-8140-e425e73f9e24 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.094 187156 DEBUG oslo_concurrency.lockutils [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.128 187156 INFO nova.scheduler.client.report [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Deleted allocations for instance de7b880e-4675-4881-a025-e6663ea2477f#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.170 187156 DEBUG nova.compute.manager [req-bd5fa0d1-79b5-4e9d-9fb7-265790e1a628 req-eb5ceaf9-a9b5-4fe1-a9f0-b66f18d5ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Received event network-vif-unplugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.171 187156 DEBUG oslo_concurrency.lockutils [req-bd5fa0d1-79b5-4e9d-9fb7-265790e1a628 req-eb5ceaf9-a9b5-4fe1-a9f0-b66f18d5ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.171 187156 DEBUG oslo_concurrency.lockutils [req-bd5fa0d1-79b5-4e9d-9fb7-265790e1a628 req-eb5ceaf9-a9b5-4fe1-a9f0-b66f18d5ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.171 187156 DEBUG oslo_concurrency.lockutils [req-bd5fa0d1-79b5-4e9d-9fb7-265790e1a628 req-eb5ceaf9-a9b5-4fe1-a9f0-b66f18d5ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.171 187156 DEBUG nova.compute.manager [req-bd5fa0d1-79b5-4e9d-9fb7-265790e1a628 req-eb5ceaf9-a9b5-4fe1-a9f0-b66f18d5ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] No waiting events found dispatching network-vif-unplugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.171 187156 DEBUG nova.compute.manager [req-bd5fa0d1-79b5-4e9d-9fb7-265790e1a628 req-eb5ceaf9-a9b5-4fe1-a9f0-b66f18d5ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Received event network-vif-unplugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.232 187156 DEBUG oslo_concurrency.lockutils [None req-fc233b4c-ffa6-4f6c-8f42-cfd0a3a3ea38 e9e3e9e61ce2488b9054c4600ce9414e 8dcf86bfd19147f7bdf78ae3ae8da3dc - - default default] Lock "de7b880e-4675-4881-a025-e6663ea2477f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.832 187156 DEBUG nova.network.neutron [-] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.854 187156 INFO nova.compute.manager [-] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Took 1.38 seconds to deallocate network for instance.#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.977 187156 DEBUG oslo_concurrency.lockutils [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:36 np0005539504 nova_compute[187152]: 2025-11-29 07:03:36.978 187156 DEBUG oslo_concurrency.lockutils [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:37 np0005539504 nova_compute[187152]: 2025-11-29 07:03:37.040 187156 DEBUG nova.compute.provider_tree [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:03:37 np0005539504 nova_compute[187152]: 2025-11-29 07:03:37.056 187156 DEBUG nova.scheduler.client.report [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:03:37 np0005539504 nova_compute[187152]: 2025-11-29 07:03:37.088 187156 DEBUG oslo_concurrency.lockutils [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:37 np0005539504 nova_compute[187152]: 2025-11-29 07:03:37.110 187156 INFO nova.scheduler.client.report [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Deleted allocations for instance 63f8497a-eaf6-45ec-a251-92e7903aa297#033[00m
Nov 29 02:03:37 np0005539504 nova_compute[187152]: 2025-11-29 07:03:37.219 187156 DEBUG oslo_concurrency.lockutils [None req-ddab1978-aff4-42c5-9edb-53a69acf12aa cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:38 np0005539504 nova_compute[187152]: 2025-11-29 07:03:38.284 187156 DEBUG nova.compute.manager [req-743971de-a278-4000-b6ff-8fd2b9582672 req-ddf1391a-ea04-4aed-b8e3-4674280b8d2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Received event network-vif-deleted-6f4282c7-128e-4a36-ac72-16c3431d2be7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:38 np0005539504 nova_compute[187152]: 2025-11-29 07:03:38.290 187156 DEBUG nova.compute.manager [req-4b770952-1f93-42ad-94aa-eb25c2663cc5 req-01c31f18-0dce-46b5-b379-3f51931bde23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Received event network-vif-plugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:38 np0005539504 nova_compute[187152]: 2025-11-29 07:03:38.291 187156 DEBUG oslo_concurrency.lockutils [req-4b770952-1f93-42ad-94aa-eb25c2663cc5 req-01c31f18-0dce-46b5-b379-3f51931bde23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:38 np0005539504 nova_compute[187152]: 2025-11-29 07:03:38.291 187156 DEBUG oslo_concurrency.lockutils [req-4b770952-1f93-42ad-94aa-eb25c2663cc5 req-01c31f18-0dce-46b5-b379-3f51931bde23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:38 np0005539504 nova_compute[187152]: 2025-11-29 07:03:38.291 187156 DEBUG oslo_concurrency.lockutils [req-4b770952-1f93-42ad-94aa-eb25c2663cc5 req-01c31f18-0dce-46b5-b379-3f51931bde23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "63f8497a-eaf6-45ec-a251-92e7903aa297-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:38 np0005539504 nova_compute[187152]: 2025-11-29 07:03:38.292 187156 DEBUG nova.compute.manager [req-4b770952-1f93-42ad-94aa-eb25c2663cc5 req-01c31f18-0dce-46b5-b379-3f51931bde23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] No waiting events found dispatching network-vif-plugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:38 np0005539504 nova_compute[187152]: 2025-11-29 07:03:38.292 187156 WARNING nova.compute.manager [req-4b770952-1f93-42ad-94aa-eb25c2663cc5 req-01c31f18-0dce-46b5-b379-3f51931bde23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Received unexpected event network-vif-plugged-6f4282c7-128e-4a36-ac72-16c3431d2be7 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:03:39 np0005539504 podman[224168]: 2025-11-29 07:03:39.721976818 +0000 UTC m=+0.061160664 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:03:39 np0005539504 podman[224169]: 2025-11-29 07:03:39.758732161 +0000 UTC m=+0.097312200 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, name=ubi9-minimal, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:03:39 np0005539504 nova_compute[187152]: 2025-11-29 07:03:39.773 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:40 np0005539504 nova_compute[187152]: 2025-11-29 07:03:40.380 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:42 np0005539504 nova_compute[187152]: 2025-11-29 07:03:42.084 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:43 np0005539504 nova_compute[187152]: 2025-11-29 07:03:43.686 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399808.685147, 8f92b94f-11a8-44de-b605-397f29484586 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:43 np0005539504 nova_compute[187152]: 2025-11-29 07:03:43.686 187156 INFO nova.compute.manager [-] [instance: 8f92b94f-11a8-44de-b605-397f29484586] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:03:43 np0005539504 nova_compute[187152]: 2025-11-29 07:03:43.710 187156 DEBUG nova.compute.manager [None req-5aa2e8b1-dc13-4210-b21a-d57cacfe1ed2 - - - - - -] [instance: 8f92b94f-11a8-44de-b605-397f29484586] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:43 np0005539504 podman[224213]: 2025-11-29 07:03:43.712325995 +0000 UTC m=+0.052192441 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:03:43 np0005539504 podman[224214]: 2025-11-29 07:03:43.743510787 +0000 UTC m=+0.077442763 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 02:03:44 np0005539504 nova_compute[187152]: 2025-11-29 07:03:44.773 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:45 np0005539504 nova_compute[187152]: 2025-11-29 07:03:45.382 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:45 np0005539504 nova_compute[187152]: 2025-11-29 07:03:45.539 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:45 np0005539504 nova_compute[187152]: 2025-11-29 07:03:45.540 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:46 np0005539504 nova_compute[187152]: 2025-11-29 07:03:46.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:47 np0005539504 nova_compute[187152]: 2025-11-29 07:03:47.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:49 np0005539504 nova_compute[187152]: 2025-11-29 07:03:49.475 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399814.4741683, de7b880e-4675-4881-a025-e6663ea2477f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:49 np0005539504 nova_compute[187152]: 2025-11-29 07:03:49.476 187156 INFO nova.compute.manager [-] [instance: de7b880e-4675-4881-a025-e6663ea2477f] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:03:49 np0005539504 nova_compute[187152]: 2025-11-29 07:03:49.776 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:49 np0005539504 nova_compute[187152]: 2025-11-29 07:03:49.845 187156 DEBUG nova.compute.manager [None req-f0d51045-71c8-448a-ad9d-fbaa2adc0f49 - - - - - -] [instance: de7b880e-4675-4881-a025-e6663ea2477f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:49 np0005539504 nova_compute[187152]: 2025-11-29 07:03:49.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:50 np0005539504 nova_compute[187152]: 2025-11-29 07:03:50.345 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399815.3441715, 63f8497a-eaf6-45ec-a251-92e7903aa297 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:50 np0005539504 nova_compute[187152]: 2025-11-29 07:03:50.346 187156 INFO nova.compute.manager [-] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:03:50 np0005539504 nova_compute[187152]: 2025-11-29 07:03:50.383 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:50 np0005539504 nova_compute[187152]: 2025-11-29 07:03:50.478 187156 DEBUG nova.compute.manager [None req-ccd3dc2c-2d93-4442-b7c5-34932b1474d9 - - - - - -] [instance: 63f8497a-eaf6-45ec-a251-92e7903aa297] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:51 np0005539504 podman[224262]: 2025-11-29 07:03:51.740037559 +0000 UTC m=+0.083227049 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 29 02:03:51 np0005539504 nova_compute[187152]: 2025-11-29 07:03:51.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:51 np0005539504 nova_compute[187152]: 2025-11-29 07:03:51.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:03:51 np0005539504 nova_compute[187152]: 2025-11-29 07:03:51.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:03:52 np0005539504 nova_compute[187152]: 2025-11-29 07:03:52.075 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:03:53 np0005539504 nova_compute[187152]: 2025-11-29 07:03:53.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:53 np0005539504 nova_compute[187152]: 2025-11-29 07:03:53.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:03:54 np0005539504 nova_compute[187152]: 2025-11-29 07:03:54.708 187156 DEBUG nova.compute.manager [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 29 02:03:54 np0005539504 nova_compute[187152]: 2025-11-29 07:03:54.777 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:54 np0005539504 nova_compute[187152]: 2025-11-29 07:03:54.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.221 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.320 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.320 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.321 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.321 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.385 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.438 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.439 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.512 187156 DEBUG nova.objects.instance [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'pci_requests' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.541 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.543 187156 INFO nova.compute.claims [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.544 187156 DEBUG nova.objects.instance [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'resources' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.560 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.561 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5713MB free_disk=73.20154190063477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.562 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.566 187156 DEBUG nova.objects.instance [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'pci_devices' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.626 187156 INFO nova.compute.resource_tracker [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating resource usage from migration b2f30ee9-093d-4a50-9511-730851938837#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.627 187156 DEBUG nova.compute.resource_tracker [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Starting to track incoming migration b2f30ee9-093d-4a50-9511-730851938837 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 02:03:55 np0005539504 nova_compute[187152]: 2025-11-29 07:03:55.709 187156 DEBUG nova.compute.provider_tree [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:03:56 np0005539504 nova_compute[187152]: 2025-11-29 07:03:56.086 187156 DEBUG nova.scheduler.client.report [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:03:56 np0005539504 nova_compute[187152]: 2025-11-29 07:03:56.317 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.877s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:56 np0005539504 nova_compute[187152]: 2025-11-29 07:03:56.317 187156 INFO nova.compute.manager [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Migrating#033[00m
Nov 29 02:03:56 np0005539504 nova_compute[187152]: 2025-11-29 07:03:56.324 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:56 np0005539504 nova_compute[187152]: 2025-11-29 07:03:56.488 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Migration for instance 690daf8f-6151-4de9-85f6-b8a9fe51ea02 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 02:03:56 np0005539504 nova_compute[187152]: 2025-11-29 07:03:56.645 187156 INFO nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating resource usage from migration b2f30ee9-093d-4a50-9511-730851938837#033[00m
Nov 29 02:03:56 np0005539504 nova_compute[187152]: 2025-11-29 07:03:56.645 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Starting to track incoming migration b2f30ee9-093d-4a50-9511-730851938837 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 02:03:56 np0005539504 podman[224283]: 2025-11-29 07:03:56.715714627 +0000 UTC m=+0.049734024 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 29 02:03:57 np0005539504 nova_compute[187152]: 2025-11-29 07:03:57.272 187156 WARNING nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 690daf8f-6151-4de9-85f6-b8a9fe51ea02 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.#033[00m
Nov 29 02:03:57 np0005539504 nova_compute[187152]: 2025-11-29 07:03:57.273 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:03:57 np0005539504 nova_compute[187152]: 2025-11-29 07:03:57.273 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:03:57 np0005539504 nova_compute[187152]: 2025-11-29 07:03:57.329 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:03:57 np0005539504 nova_compute[187152]: 2025-11-29 07:03:57.648 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:03:58 np0005539504 nova_compute[187152]: 2025-11-29 07:03:58.131 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:03:58 np0005539504 nova_compute[187152]: 2025-11-29 07:03:58.132 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.807s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:58 np0005539504 nova_compute[187152]: 2025-11-29 07:03:58.847 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:59 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 02:03:59 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 02:03:59 np0005539504 systemd-logind[783]: New session 52 of user nova.
Nov 29 02:03:59 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 02:03:59 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 02:03:59 np0005539504 nova_compute[187152]: 2025-11-29 07:03:59.779 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:59 np0005539504 systemd[224305]: Queued start job for default target Main User Target.
Nov 29 02:03:59 np0005539504 systemd[224305]: Created slice User Application Slice.
Nov 29 02:03:59 np0005539504 systemd[224305]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:03:59 np0005539504 systemd[224305]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 02:03:59 np0005539504 systemd[224305]: Reached target Paths.
Nov 29 02:03:59 np0005539504 systemd[224305]: Reached target Timers.
Nov 29 02:03:59 np0005539504 systemd[224305]: Starting D-Bus User Message Bus Socket...
Nov 29 02:03:59 np0005539504 systemd[224305]: Starting Create User's Volatile Files and Directories...
Nov 29 02:03:59 np0005539504 systemd[224305]: Finished Create User's Volatile Files and Directories.
Nov 29 02:03:59 np0005539504 systemd[224305]: Listening on D-Bus User Message Bus Socket.
Nov 29 02:03:59 np0005539504 systemd[224305]: Reached target Sockets.
Nov 29 02:03:59 np0005539504 systemd[224305]: Reached target Basic System.
Nov 29 02:03:59 np0005539504 systemd[224305]: Reached target Main User Target.
Nov 29 02:03:59 np0005539504 systemd[224305]: Startup finished in 140ms.
Nov 29 02:03:59 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 02:03:59 np0005539504 systemd[1]: Started Session 52 of User nova.
Nov 29 02:04:00 np0005539504 systemd-logind[783]: Session 52 logged out. Waiting for processes to exit.
Nov 29 02:04:00 np0005539504 systemd[1]: session-52.scope: Deactivated successfully.
Nov 29 02:04:00 np0005539504 systemd-logind[783]: Removed session 52.
Nov 29 02:04:00 np0005539504 systemd-logind[783]: New session 54 of user nova.
Nov 29 02:04:00 np0005539504 systemd[1]: Started Session 54 of User nova.
Nov 29 02:04:00 np0005539504 systemd[1]: session-54.scope: Deactivated successfully.
Nov 29 02:04:00 np0005539504 systemd-logind[783]: Session 54 logged out. Waiting for processes to exit.
Nov 29 02:04:00 np0005539504 systemd-logind[783]: Removed session 54.
Nov 29 02:04:00 np0005539504 nova_compute[187152]: 2025-11-29 07:04:00.388 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:04 np0005539504 nova_compute[187152]: 2025-11-29 07:04:04.781 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:05 np0005539504 nova_compute[187152]: 2025-11-29 07:04:05.395 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:05 np0005539504 podman[224327]: 2025-11-29 07:04:05.719202573 +0000 UTC m=+0.057075712 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:04:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:09.110 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:04:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:09.111 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:04:09 np0005539504 nova_compute[187152]: 2025-11-29 07:04:09.112 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:09 np0005539504 nova_compute[187152]: 2025-11-29 07:04:09.783 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:10 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 02:04:10 np0005539504 nova_compute[187152]: 2025-11-29 07:04:10.401 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:10 np0005539504 systemd[224305]: Activating special unit Exit the Session...
Nov 29 02:04:10 np0005539504 systemd[224305]: Stopped target Main User Target.
Nov 29 02:04:10 np0005539504 systemd[224305]: Stopped target Basic System.
Nov 29 02:04:10 np0005539504 systemd[224305]: Stopped target Paths.
Nov 29 02:04:10 np0005539504 systemd[224305]: Stopped target Sockets.
Nov 29 02:04:10 np0005539504 systemd[224305]: Stopped target Timers.
Nov 29 02:04:10 np0005539504 systemd[224305]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:04:10 np0005539504 systemd[224305]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 02:04:10 np0005539504 systemd[224305]: Closed D-Bus User Message Bus Socket.
Nov 29 02:04:10 np0005539504 systemd[224305]: Stopped Create User's Volatile Files and Directories.
Nov 29 02:04:10 np0005539504 systemd[224305]: Removed slice User Application Slice.
Nov 29 02:04:10 np0005539504 systemd[224305]: Reached target Shutdown.
Nov 29 02:04:10 np0005539504 systemd[224305]: Finished Exit the Session.
Nov 29 02:04:10 np0005539504 systemd[224305]: Reached target Exit the Session.
Nov 29 02:04:10 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 02:04:10 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 02:04:10 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 02:04:10 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 02:04:10 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 02:04:10 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 02:04:10 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 02:04:10 np0005539504 podman[224346]: 2025-11-29 07:04:10.468537806 +0000 UTC m=+0.058393068 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:04:10 np0005539504 podman[224347]: 2025-11-29 07:04:10.50162943 +0000 UTC m=+0.084719209 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:04:14 np0005539504 podman[224392]: 2025-11-29 07:04:14.705566059 +0000 UTC m=+0.050840835 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:04:14 np0005539504 podman[224393]: 2025-11-29 07:04:14.742357333 +0000 UTC m=+0.084851394 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:04:14 np0005539504 nova_compute[187152]: 2025-11-29 07:04:14.784 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539504 nova_compute[187152]: 2025-11-29 07:04:15.404 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:17.114 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:19 np0005539504 nova_compute[187152]: 2025-11-29 07:04:19.785 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:20 np0005539504 nova_compute[187152]: 2025-11-29 07:04:20.406 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:22 np0005539504 podman[224440]: 2025-11-29 07:04:22.722574983 +0000 UTC m=+0.062963052 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:04:22 np0005539504 nova_compute[187152]: 2025-11-29 07:04:22.859 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:22 np0005539504 nova_compute[187152]: 2025-11-29 07:04:22.860 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:22.919 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:22.919 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:22.919 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:23 np0005539504 nova_compute[187152]: 2025-11-29 07:04:23.074 187156 DEBUG nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.444 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.445 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.452 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.452 187156 INFO nova.compute.claims [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.788 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.800 187156 DEBUG nova.compute.provider_tree [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.819 187156 DEBUG nova.scheduler.client.report [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.840 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "1c73b1b4-2bab-4451-843e-0f70db66eb9b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.841 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "1c73b1b4-2bab-4451-843e-0f70db66eb9b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.843 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.844 187156 DEBUG nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.863 187156 DEBUG nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.920 187156 DEBUG nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.920 187156 DEBUG nova.network.neutron [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.949 187156 INFO nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.979 187156 DEBUG nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.987 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.987 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.995 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:04:24 np0005539504 nova_compute[187152]: 2025-11-29 07:04:24.996 187156 INFO nova.compute.claims [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:04:25 np0005539504 nova_compute[187152]: 2025-11-29 07:04:25.166 187156 DEBUG nova.policy [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:04:25 np0005539504 nova_compute[187152]: 2025-11-29 07:04:25.431 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.122 187156 DEBUG nova.compute.provider_tree [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.206 187156 DEBUG nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.207 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.208 187156 INFO nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Creating image(s)#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.208 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.209 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.209 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.223 187156 DEBUG nova.scheduler.client.report [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.227 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.249 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.250 187156 DEBUG nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.290 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.294 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.295 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.306 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.363 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.364 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.461 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk 1073741824" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.463 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.463 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.529 187156 DEBUG nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.530 187156 DEBUG nova.network.neutron [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.533 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.535 187156 DEBUG nova.virt.disk.api [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.536 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.602 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.603 187156 DEBUG nova.virt.disk.api [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.603 187156 DEBUG nova.objects.instance [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.714 187156 INFO nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.735 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.736 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Ensure instance console log exists: /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.736 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.736 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.737 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:04:26 np0005539504 nova_compute[187152]: 2025-11-29 07:04:26.745 187156 DEBUG nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.066 187156 DEBUG nova.network.neutron [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.066 187156 DEBUG nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.344 187156 DEBUG nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.345 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.346 187156 INFO nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Creating image(s)
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.347 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "/var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.347 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.348 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.366 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.433 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.434 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.435 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.447 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.474 187156 DEBUG nova.network.neutron [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Successfully created port: b7078e73-f0e3-441a-843e-8920e38aec30 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.502 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:27 np0005539504 nova_compute[187152]: 2025-11-29 07:04:27.503 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:27 np0005539504 podman[224484]: 2025-11-29 07:04:27.72298512 +0000 UTC m=+0.061128283 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.479 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk 1073741824" returned: 0 in 0.975s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.480 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.481 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.544 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.545 187156 DEBUG nova.virt.disk.api [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Checking if we can resize image /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.546 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.607 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.608 187156 DEBUG nova.virt.disk.api [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Cannot resize image /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.608 187156 DEBUG nova.objects.instance [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'migration_context' on Instance uuid 1c73b1b4-2bab-4451-843e-0f70db66eb9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.824 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.825 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Ensure instance console log exists: /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.826 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.827 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.827 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.831 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.842 187156 WARNING nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.850 187156 DEBUG nova.virt.libvirt.host [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.851 187156 DEBUG nova.virt.libvirt.host [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.861 187156 DEBUG nova.virt.libvirt.host [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.863 187156 DEBUG nova.virt.libvirt.host [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.865 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.865 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.866 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.867 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.867 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.868 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.868 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.869 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.869 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.870 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.871 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.871 187156 DEBUG nova.virt.hardware [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.878 187156 DEBUG nova.objects.instance [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c73b1b4-2bab-4451-843e-0f70db66eb9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:04:28 np0005539504 nova_compute[187152]: 2025-11-29 07:04:28.906 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <uuid>1c73b1b4-2bab-4451-843e-0f70db66eb9b</uuid>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <name>instance-00000043</name>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersOnMultiNodesTest-server-2000512745</nova:name>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:04:28</nova:creationTime>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:        <nova:user uuid="0c56214d54944034ac2500edac59a239">tempest-ServersOnMultiNodesTest-2086403841-project-member</nova:user>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:        <nova:project uuid="d09f64becda14f30b831bdf7371d586b">tempest-ServersOnMultiNodesTest-2086403841</nova:project>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <entry name="serial">1c73b1b4-2bab-4451-843e-0f70db66eb9b</entry>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <entry name="uuid">1c73b1b4-2bab-4451-843e-0f70db66eb9b</entry>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.config"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/console.log" append="off"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:04:28 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:04:28 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:04:28 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:04:28 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.151 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.151 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.152 187156 INFO nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Using config drive
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.308 187156 INFO nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Creating config drive at /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.config
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.313 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppr0mavzl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.437 187156 DEBUG oslo_concurrency.processutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppr0mavzl" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:04:29 np0005539504 systemd-machined[153423]: New machine qemu-34-instance-00000043.
Nov 29 02:04:29 np0005539504 systemd[1]: Started Virtual Machine qemu-34-instance-00000043.
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.792 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.919 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399869.9182646, 1c73b1b4-2bab-4451-843e-0f70db66eb9b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.920 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] VM Resumed (Lifecycle Event)
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.923 187156 DEBUG nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.924 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.929 187156 INFO nova.virt.libvirt.driver [-] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Instance spawned successfully.
Nov 29 02:04:29 np0005539504 nova_compute[187152]: 2025-11-29 07:04:29.929 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.277 187156 DEBUG nova.network.neutron [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Successfully updated port: b7078e73-f0e3-441a-843e-8920e38aec30 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.435 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.730 187156 DEBUG nova.compute.manager [req-63b1ebca-edcf-451f-bd11-be95b385efb9 req-14799bb5-04e3-4ad3-bbd8-626effcfb74d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-changed-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.731 187156 DEBUG nova.compute.manager [req-63b1ebca-edcf-451f-bd11-be95b385efb9 req-14799bb5-04e3-4ad3-bbd8-626effcfb74d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Refreshing instance network info cache due to event network-changed-b7078e73-f0e3-441a-843e-8920e38aec30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.731 187156 DEBUG oslo_concurrency.lockutils [req-63b1ebca-edcf-451f-bd11-be95b385efb9 req-14799bb5-04e3-4ad3-bbd8-626effcfb74d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.731 187156 DEBUG oslo_concurrency.lockutils [req-63b1ebca-edcf-451f-bd11-be95b385efb9 req-14799bb5-04e3-4ad3-bbd8-626effcfb74d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.732 187156 DEBUG nova.network.neutron [req-63b1ebca-edcf-451f-bd11-be95b385efb9 req-14799bb5-04e3-4ad3-bbd8-626effcfb74d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Refreshing network info cache for port b7078e73-f0e3-441a-843e-8920e38aec30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.753 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.759 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.760 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.761 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.762 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.763 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.763 187156 DEBUG nova.virt.libvirt.driver [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.770 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.835 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.966 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.967 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399869.9188535, 1c73b1b4-2bab-4451-843e-0f70db66eb9b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:04:30 np0005539504 nova_compute[187152]: 2025-11-29 07:04:30.967 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] VM Started (Lifecycle Event)
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.010 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.015 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.034 187156 INFO nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Took 3.69 seconds to spawn the instance on the hypervisor.
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.035 187156 DEBUG nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.037 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.119 187156 INFO nova.compute.manager [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Took 6.17 seconds to build instance.
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.138 187156 DEBUG oslo_concurrency.lockutils [None req-a6e42a2d-a291-4e8c-bec9-74f72f40f747 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "1c73b1b4-2bab-4451-843e-0f70db66eb9b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.255 187156 DEBUG nova.network.neutron [req-63b1ebca-edcf-451f-bd11-be95b385efb9 req-14799bb5-04e3-4ad3-bbd8-626effcfb74d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.583 187156 DEBUG nova.network.neutron [req-63b1ebca-edcf-451f-bd11-be95b385efb9 req-14799bb5-04e3-4ad3-bbd8-626effcfb74d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.605 187156 DEBUG oslo_concurrency.lockutils [req-63b1ebca-edcf-451f-bd11-be95b385efb9 req-14799bb5-04e3-4ad3-bbd8-626effcfb74d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.605 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.606 187156 DEBUG nova.network.neutron [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 02:04:31 np0005539504 nova_compute[187152]: 2025-11-29 07:04:31.758 187156 DEBUG nova.network.neutron [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.302 187156 DEBUG nova.network.neutron [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.616 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.617 187156 DEBUG nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance network_info: |[{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.620 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Start _get_guest_xml network_info=[{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.628 187156 WARNING nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.641 187156 DEBUG nova.virt.libvirt.host [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.643 187156 DEBUG nova.virt.libvirt.host [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.647 187156 DEBUG nova.virt.libvirt.host [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.648 187156 DEBUG nova.virt.libvirt.host [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.650 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.650 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.651 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.651 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.651 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.652 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.652 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.652 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.652 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.653 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.653 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.653 187156 DEBUG nova.virt.hardware [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.656 187156 DEBUG nova.virt.libvirt.vif [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:04:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.657 187156 DEBUG nova.network.os_vif_util [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.658 187156 DEBUG nova.network.os_vif_util [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.659 187156 DEBUG nova.objects.instance [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.724 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <uuid>9223f44a-297e-4db1-9f44-ee0694c4e258</uuid>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <name>instance-00000042</name>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestJSON-server-664171356</nova:name>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:04:33</nova:creationTime>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:        <nova:port uuid="b7078e73-f0e3-441a-843e-8920e38aec30">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <entry name="serial">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <entry name="uuid">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:1e:a3:23"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <target dev="tapb7078e73-f0"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/console.log" append="off"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:04:33 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:04:33 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:04:33 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:04:33 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.725 187156 DEBUG nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Preparing to wait for external event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.726 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.726 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.726 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.727 187156 DEBUG nova.virt.libvirt.vif [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:04:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.727 187156 DEBUG nova.network.os_vif_util [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.728 187156 DEBUG nova.network.os_vif_util [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.728 187156 DEBUG os_vif [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.729 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.729 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.730 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.739 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.739 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7078e73-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.740 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7078e73-f0, col_values=(('external_ids', {'iface-id': 'b7078e73-f0e3-441a-843e-8920e38aec30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:a3:23', 'vm-uuid': '9223f44a-297e-4db1-9f44-ee0694c4e258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.741 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:33 np0005539504 NetworkManager[55210]: <info>  [1764399873.7435] manager: (tapb7078e73-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.744 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.753 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:33 np0005539504 nova_compute[187152]: 2025-11-29 07:04:33.756 187156 INFO os_vif [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:04:34 np0005539504 nova_compute[187152]: 2025-11-29 07:04:34.791 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:35 np0005539504 nova_compute[187152]: 2025-11-29 07:04:35.873 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:35 np0005539504 nova_compute[187152]: 2025-11-29 07:04:35.873 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:35 np0005539504 nova_compute[187152]: 2025-11-29 07:04:35.873 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No VIF found with MAC fa:16:3e:1e:a3:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:04:35 np0005539504 nova_compute[187152]: 2025-11-29 07:04:35.874 187156 INFO nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Using config drive#033[00m
Nov 29 02:04:36 np0005539504 podman[224540]: 2025-11-29 07:04:36.755033538 +0000 UTC m=+0.088342779 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 02:04:38 np0005539504 nova_compute[187152]: 2025-11-29 07:04:38.280 187156 INFO nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Creating config drive at /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config#033[00m
Nov 29 02:04:38 np0005539504 nova_compute[187152]: 2025-11-29 07:04:38.287 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp70cy1dtm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:38 np0005539504 nova_compute[187152]: 2025-11-29 07:04:38.410 187156 DEBUG oslo_concurrency.processutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp70cy1dtm" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:38 np0005539504 kernel: tapb7078e73-f0: entered promiscuous mode
Nov 29 02:04:38 np0005539504 NetworkManager[55210]: <info>  [1764399878.4798] manager: (tapb7078e73-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Nov 29 02:04:38 np0005539504 nova_compute[187152]: 2025-11-29 07:04:38.481 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:04:38Z|00232|binding|INFO|Claiming lport b7078e73-f0e3-441a-843e-8920e38aec30 for this chassis.
Nov 29 02:04:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:04:38Z|00233|binding|INFO|b7078e73-f0e3-441a-843e-8920e38aec30: Claiming fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:04:38 np0005539504 systemd-udevd[224574]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:04:38 np0005539504 NetworkManager[55210]: <info>  [1764399878.5345] device (tapb7078e73-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:04:38 np0005539504 NetworkManager[55210]: <info>  [1764399878.5353] device (tapb7078e73-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:04:38 np0005539504 systemd-machined[153423]: New machine qemu-35-instance-00000042.
Nov 29 02:04:38 np0005539504 nova_compute[187152]: 2025-11-29 07:04:38.543 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:04:38Z|00234|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 ovn-installed in OVS
Nov 29 02:04:38 np0005539504 systemd[1]: Started Virtual Machine qemu-35-instance-00000042.
Nov 29 02:04:38 np0005539504 nova_compute[187152]: 2025-11-29 07:04:38.549 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:38 np0005539504 nova_compute[187152]: 2025-11-29 07:04:38.741 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:38 np0005539504 nova_compute[187152]: 2025-11-29 07:04:38.839 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399878.8385437, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:38 np0005539504 nova_compute[187152]: 2025-11-29 07:04:38.840 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Started (Lifecycle Event)#033[00m
Nov 29 02:04:39 np0005539504 nova_compute[187152]: 2025-11-29 07:04:39.794 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:40 np0005539504 ovn_controller[95182]: 2025-11-29T07:04:40Z|00235|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 up in Southbound
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.562 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.565 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.567 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.584 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3d96fdc2-a879-4603-9225-508678952daa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.586 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.590 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.590 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3745e704-4cf0-40ef-9b4a-78cb31ad0988]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.592 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdb2400-fefe-4ede-8f87-27fd53e9f4ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 nova_compute[187152]: 2025-11-29 07:04:40.598 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.606 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce739b9-8283-419b-a7a4-18a3713d629c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 nova_compute[187152]: 2025-11-29 07:04:40.610 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399878.8393006, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:40 np0005539504 nova_compute[187152]: 2025-11-29 07:04:40.611 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.627 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[46a5cedd-8eb7-4f94-ba29-d5a60c60a26e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 nova_compute[187152]: 2025-11-29 07:04:40.634 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:40 np0005539504 nova_compute[187152]: 2025-11-29 07:04:40.640 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:40 np0005539504 nova_compute[187152]: 2025-11-29 07:04:40.662 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.672 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[beedb0c4-d174-430d-ab70-4913528b8439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 NetworkManager[55210]: <info>  [1764399880.6797] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/112)
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.678 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b5153449-8ab3-481f-a17b-5452137ba6dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 podman[224595]: 2025-11-29 07:04:40.712241797 +0000 UTC m=+0.073000523 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:04:40 np0005539504 podman[224597]: 2025-11-29 07:04:40.719325459 +0000 UTC m=+0.079102429 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.723 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[6adedb4e-0673-4acf-ad77-4e9beba1ce08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.726 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ea9ef5-20e0-4caf-88ef-8a39a946cb55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 NetworkManager[55210]: <info>  [1764399880.7501] device (tap9226dea3-60): carrier: link connected
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.759 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a9596a-f3e8-4f16-8d88-f6751c864601]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.774 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[475a69bd-b816-454d-a3f2-1a08cd7a53b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534363, 'reachable_time': 18981, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224659, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.789 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7302e0a8-438f-494a-a2c0-d69ec718970c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534363, 'tstamp': 534363}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224660, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.803 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7907797b-1269-4230-82d4-db4b12dee4f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534363, 'reachable_time': 18981, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224661, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.831 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8f4d2547-e2fe-4775-83cf-854a16697cac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.893 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c9292414-e10f-4177-89ce-51bb70f86dc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.895 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.896 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.896 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:40 np0005539504 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.903 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:40 np0005539504 nova_compute[187152]: 2025-11-29 07:04:40.904 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:40 np0005539504 nova_compute[187152]: 2025-11-29 07:04:40.907 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.908 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.909 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3154a9cc-69f4-4582-8aa0-615c75f3f6b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.910 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:04:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:40.912 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:04:40 np0005539504 NetworkManager[55210]: <info>  [1764399880.9601] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Nov 29 02:04:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:04:41Z|00236|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.189 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.219 187156 DEBUG nova.compute.manager [req-132208d5-b642-4421-9b54-04125ac31e34 req-2be18daf-07ce-47ba-a217-c952596ee2c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.219 187156 DEBUG oslo_concurrency.lockutils [req-132208d5-b642-4421-9b54-04125ac31e34 req-2be18daf-07ce-47ba-a217-c952596ee2c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.220 187156 DEBUG oslo_concurrency.lockutils [req-132208d5-b642-4421-9b54-04125ac31e34 req-2be18daf-07ce-47ba-a217-c952596ee2c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.220 187156 DEBUG oslo_concurrency.lockutils [req-132208d5-b642-4421-9b54-04125ac31e34 req-2be18daf-07ce-47ba-a217-c952596ee2c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.220 187156 DEBUG nova.compute.manager [req-132208d5-b642-4421-9b54-04125ac31e34 req-2be18daf-07ce-47ba-a217-c952596ee2c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Processing event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.221 187156 DEBUG nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.226 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399881.2259412, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.227 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.229 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.233 187156 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance spawned successfully.#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.234 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.270 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.275 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.276 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.276 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.276 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.277 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.277 187156 DEBUG nova.virt.libvirt.driver [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.282 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:41 np0005539504 podman[224695]: 2025-11-29 07:04:41.308448315 +0000 UTC m=+0.025420077 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:04:41 np0005539504 podman[224695]: 2025-11-29 07:04:41.532592601 +0000 UTC m=+0.249564363 container create 726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:04:41 np0005539504 systemd[1]: Started libpod-conmon-726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1.scope.
Nov 29 02:04:41 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:04:41 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed3e84c827bf12763a0ef4000f9c3ea49e5320d0ab4626549ce23dfa12a6489f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:04:41 np0005539504 podman[224695]: 2025-11-29 07:04:41.662801968 +0000 UTC m=+0.379773750 container init 726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:04:41 np0005539504 podman[224695]: 2025-11-29 07:04:41.670483195 +0000 UTC m=+0.387454947 container start 726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:04:41 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[224711]: [NOTICE]   (224715) : New worker (224717) forked
Nov 29 02:04:41 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[224711]: [NOTICE]   (224715) : Loading success.
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.761 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.853 187156 INFO nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Took 15.65 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:04:41 np0005539504 nova_compute[187152]: 2025-11-29 07:04:41.853 187156 DEBUG nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:42 np0005539504 nova_compute[187152]: 2025-11-29 07:04:42.033 187156 INFO nova.compute.manager [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Took 18.04 seconds to build instance.#033[00m
Nov 29 02:04:42 np0005539504 nova_compute[187152]: 2025-11-29 07:04:42.065 187156 DEBUG oslo_concurrency.lockutils [None req-f4e5ff05-404e-455e-a533-6174818981dc e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539504 nova_compute[187152]: 2025-11-29 07:04:43.647 187156 DEBUG nova.compute.manager [req-abf30a79-4974-424d-afce-7fc55c1a2c57 req-a5b8a672-e43c-4631-af2c-ca16efad486d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:43 np0005539504 nova_compute[187152]: 2025-11-29 07:04:43.647 187156 DEBUG oslo_concurrency.lockutils [req-abf30a79-4974-424d-afce-7fc55c1a2c57 req-a5b8a672-e43c-4631-af2c-ca16efad486d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:43 np0005539504 nova_compute[187152]: 2025-11-29 07:04:43.647 187156 DEBUG oslo_concurrency.lockutils [req-abf30a79-4974-424d-afce-7fc55c1a2c57 req-a5b8a672-e43c-4631-af2c-ca16efad486d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:43 np0005539504 nova_compute[187152]: 2025-11-29 07:04:43.648 187156 DEBUG oslo_concurrency.lockutils [req-abf30a79-4974-424d-afce-7fc55c1a2c57 req-a5b8a672-e43c-4631-af2c-ca16efad486d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539504 nova_compute[187152]: 2025-11-29 07:04:43.648 187156 DEBUG nova.compute.manager [req-abf30a79-4974-424d-afce-7fc55c1a2c57 req-a5b8a672-e43c-4631-af2c-ca16efad486d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:04:43 np0005539504 nova_compute[187152]: 2025-11-29 07:04:43.648 187156 WARNING nova.compute.manager [req-abf30a79-4974-424d-afce-7fc55c1a2c57 req-a5b8a672-e43c-4631-af2c-ca16efad486d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:04:43 np0005539504 nova_compute[187152]: 2025-11-29 07:04:43.744 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:44 np0005539504 nova_compute[187152]: 2025-11-29 07:04:44.795 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:44 np0005539504 nova_compute[187152]: 2025-11-29 07:04:44.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:45 np0005539504 nova_compute[187152]: 2025-11-29 07:04:45.164 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:45 np0005539504 NetworkManager[55210]: <info>  [1764399885.1656] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Nov 29 02:04:45 np0005539504 NetworkManager[55210]: <info>  [1764399885.1665] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Nov 29 02:04:45 np0005539504 nova_compute[187152]: 2025-11-29 07:04:45.265 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:45 np0005539504 ovn_controller[95182]: 2025-11-29T07:04:45Z|00237|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:04:45 np0005539504 nova_compute[187152]: 2025-11-29 07:04:45.282 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:45 np0005539504 nova_compute[187152]: 2025-11-29 07:04:45.622 187156 DEBUG nova.compute.manager [req-242bb2d4-14b7-4f55-aec4-fe60ae1388c2 req-04a111cb-a463-4d2f-8977-dcbf9ca3da70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-changed-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:04:45 np0005539504 nova_compute[187152]: 2025-11-29 07:04:45.623 187156 DEBUG nova.compute.manager [req-242bb2d4-14b7-4f55-aec4-fe60ae1388c2 req-04a111cb-a463-4d2f-8977-dcbf9ca3da70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Refreshing instance network info cache due to event network-changed-b7078e73-f0e3-441a-843e-8920e38aec30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 02:04:45 np0005539504 nova_compute[187152]: 2025-11-29 07:04:45.623 187156 DEBUG oslo_concurrency.lockutils [req-242bb2d4-14b7-4f55-aec4-fe60ae1388c2 req-04a111cb-a463-4d2f-8977-dcbf9ca3da70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:04:45 np0005539504 nova_compute[187152]: 2025-11-29 07:04:45.624 187156 DEBUG oslo_concurrency.lockutils [req-242bb2d4-14b7-4f55-aec4-fe60ae1388c2 req-04a111cb-a463-4d2f-8977-dcbf9ca3da70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:04:45 np0005539504 nova_compute[187152]: 2025-11-29 07:04:45.624 187156 DEBUG nova.network.neutron [req-242bb2d4-14b7-4f55-aec4-fe60ae1388c2 req-04a111cb-a463-4d2f-8977-dcbf9ca3da70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Refreshing network info cache for port b7078e73-f0e3-441a-843e-8920e38aec30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 02:04:45 np0005539504 podman[224750]: 2025-11-29 07:04:45.798599685 +0000 UTC m=+0.099543530 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 02:04:45 np0005539504 podman[224749]: 2025-11-29 07:04:45.798711638 +0000 UTC m=+0.100195238 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:04:46 np0005539504 nova_compute[187152]: 2025-11-29 07:04:46.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:47.964 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000043', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd09f64becda14f30b831bdf7371d586b', 'user_id': '0c56214d54944034ac2500edac59a239', 'hostId': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:47.968 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'name': 'tempest-ServerActionsTestJSON-server-664171356', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000042', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6e6c366001df43fb91731faf7a9578fc', 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'hostId': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:47.968 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:47.999 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.001 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.044 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.045 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85ff309f-0731-4996-abf6-3d3d7e203082', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-vda', 'timestamp': '2025-11-29T07:04:47.968842', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b12a4722-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '9b02859c35c2fbbcbc87303f7cd650554c8de0e648ec828656e2019ff884274f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 
'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-sda', 'timestamp': '2025-11-29T07:04:47.968842', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b12a6220-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '5a19f3fed69e58f3fc8e68e836d892cf8f2f3f341f337846cd51a82d94feb21d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:04:47.968842', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1310116-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': '255316080ecce94aafe3be03bd7d32c54da9af801c01db5242a63b0e8301da4c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:04:47.968842', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1310f44-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': '5328c302360218f5a6231ab6af3bb8d2cb0fb40e958a081bab9cb8b664c7ca99'}]}, 'timestamp': '2025-11-29 07:04:48.045469', '_unique_id': '338ac0dae0c24b4f94c788aaae0c0178'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.049 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.049 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.050 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-2000512745>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-664171356>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-2000512745>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-664171356>]
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.050 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.050 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-2000512745>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-664171356>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-2000512745>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-664171356>]
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.061 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.061 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.071 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.072 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ecf72d6-2696-4d69-b547-b1f6074067b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-vda', 'timestamp': '2025-11-29T07:04:48.051017', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1339188-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.985711839, 'message_signature': 'e1ff3340ea69e186f23e0416b266cab6b20156c137e77c9f691b861a78b1cba7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 
'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-sda', 'timestamp': '2025-11-29T07:04:48.051017', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1339f16-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.985711839, 'message_signature': '5f3d74b4718a875fb1ad471f4276f841301e75f1dfc7773372fa45da5d13b0b6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:04:48.051017', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b135285e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.996816639, 'message_signature': 'db4b2742cd56a157576cc206e80ed7c388b80ab049025417d1473b0c0a4daf8a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:04:48.051017', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1353812-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.996816639, 'message_signature': '840ba4c2296afd41f163f37330936158d5500bdc46bef1761621d6a8e8176a5c'}]}, 'timestamp': '2025-11-29 07:04:48.072617', '_unique_id': 'e627f86cae8f4db395aedcc0f02bf1ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.079 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9223f44a-297e-4db1-9f44-ee0694c4e258 / tapb7078e73-f0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.079 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75e51d35-481a-4ef6-924a-656acaba408b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.074926', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b1364f18-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': '79507144a0d57b28348766cdfa796e7b7ec07943e6ef281784009ba14bb1058c'}]}, 'timestamp': '2025-11-29 07:04:48.079828', '_unique_id': '3ccbb9240c034f869a58f336d1f2ddc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.081 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.097 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.112 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.112 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 9223f44a-297e-4db1-9f44-ee0694c4e258: ceilometer.compute.pollsters.NoVolumeException
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '302e8c57-6a13-469e-bfae-cdfdc59ca2f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'timestamp': '2025-11-29T07:04:48.081980', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b1392a08-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.032433072, 'message_signature': '7567f0dad5364c8c2153665486edb189cc265f4d4b262ffbe4764ad7bdf8891d'}]}, 'timestamp': '2025-11-29 07:04:48.116503', '_unique_id': 'c336fed436244a729601c2d756fec42a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.119 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e766249-ec3e-4fe4-99b1-c6fff87c1082', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.119593', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b13c7212-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': 'b29e9ee24dcaf565fc8628137cbf3676fe0dec8428daaf19696f075eb6858e43'}]}, 'timestamp': '2025-11-29 07:04:48.120063', '_unique_id': 'b9719383b1534bea9b1319f6ac4ccb30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.122 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a468260e-79f3-4bcf-82af-825da54bfb07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.122245', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b13cd928-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': 'eeeffe510bf3f8b55a7dd9a5323460dd68ac65341ecfb9573958a89741949441'}]}, 'timestamp': '2025-11-29 07:04:48.122662', '_unique_id': '5d3f09d46a534e8792e9d37923360a41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.124 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.124 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-2000512745>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-664171356>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-2000512745>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-664171356>]
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.125 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.125 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.125 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.126 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a470e62-69d1-49df-90d1-07b1bb07f9fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-vda', 'timestamp': '2025-11-29T07:04:48.125225', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13d4d22-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.985711839, 'message_signature': 'e6c2ffa8834e61c6e72f64a0e975b3c2e63b0f759145eadff37230baf80b434b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-sda', 'timestamp': '2025-11-29T07:04:48.125225', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13d5a4c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.985711839, 'message_signature': 'bc9a51643e1c8217b3266e74ad04874d956a2471183c67269d1a88eb9ac3c985'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:04:48.125225', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13d6686-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.996816639, 'message_signature': '2a10f8dc3baa776ac6b7becfe736c744b8a02746b70222da3f27761de7860d5c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:04:48.125225', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13d7450-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.996816639, 'message_signature': 'ab7aa7dc714878e5c3b5a03afd73d7a9f90e9138a8ad111ac0f7566ba4e57181'}]}, 'timestamp': '2025-11-29 07:04:48.126611', '_unique_id': '1c96682c27bd45e18c10b920557064d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.129 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96acffce-a1b4-4140-bad5-f46712580205', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.128989', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b13dded6-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': '4a927edee3ae004bbde8016158c033853bdaeafee0694a0c62a529cef99e015b'}]}, 'timestamp': '2025-11-29 07:04:48.129358', '_unique_id': '0ac593f821b6478ba545efdc709d7171'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.131 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2965017-6f26-4d1e-948c-2a12cc4f4854', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.131523', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b13e41e6-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': 'f36ab2c1412e7f82f964bfc11940188c4de5e4cedfc13997d6c8882f662b3e15'}]}, 'timestamp': '2025-11-29 07:04:48.131893', '_unique_id': '7c70c5db02544e6e815a24ca312b7c3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.133 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bbdb2ed-3283-460e-8978-97b19ecf6ad0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.133943', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b13ea140-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': 'f3ae9948e0758b37ce9d5d2b071dd56ecbdfe3bd0ac2189e503ab0d469a6b537'}]}, 'timestamp': '2025-11-29 07:04:48.134338', '_unique_id': 'fcabba3fe4f44292a31a0a5c3ad5ab50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.136 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.136 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-2000512745>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-664171356>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-2000512745>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-664171356>]
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.137 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebb7b3ff-1496-4a7a-9ee6-98a566736daf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.137168', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b13f1f4e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': '415c0fdc49e5de4f5a2b3d426cb0fc3713636ddc14c1ccd3998735494bfa3e6f'}]}, 'timestamp': '2025-11-29 07:04:48.137564', '_unique_id': '503f6c6363ab46e1820ca53375e3382c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.139 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10802d67-f23b-441a-8d7b-d1e4b10e8851', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.139809', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b13f86aa-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': '5ed3dc04b13f4873e6abfc7f350bae8134c28878c41cae4e5ebfa46eb31ecf4c'}]}, 'timestamp': '2025-11-29 07:04:48.140219', '_unique_id': 'fdf5f06becd649cdbd9a4d452b0cbfdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.142 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bef77ab-180e-4660-a3f6-43e71748f923', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.142468', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b13fed5c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': '98bcf2f02cc7ab9a5707aec466d956b22c695bb5cb4215eb8ab7735a5ebd9d69'}]}, 'timestamp': '2025-11-29 07:04:48.142828', '_unique_id': 'cdb45fc3995f46b582143aa6a7b3f600'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.145 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.write.latency volume: 4068265502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.145 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.145 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.146 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9be722f7-87d6-4833-970b-6d17707d4b17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4068265502, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-vda', 'timestamp': '2025-11-29T07:04:48.144979', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1404f2c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '4d6b09108f74357c7b98a27a38ec1b812730469d13baa70c62b6bda1f2df044f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': 
None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-sda', 'timestamp': '2025-11-29T07:04:48.144979', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1405c92-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '5b3a6019b3b585545b3b55a491cbd9c32dfdd418763ecc394ba83b3ee33b2a2b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:04:48.144979', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14068ae-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': 'ab88ace9fa88e55fd7bcd792477ed152a612558fa9d35ba082b523ee8302dd41'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:04:48.144979', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1407d1c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': '57e6758365a700eeef20756f01ece82e2125e9074704f19bb4a9c835aa1ec0e2'}]}, 'timestamp': '2025-11-29 07:04:48.146526', '_unique_id': '80741ff822964a15aa310fe92170f634'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.148 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.read.bytes volume: 30493184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.149 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.149 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.149 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '889907d7-847b-46e4-a26e-68eaf609e1ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30493184, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-vda', 'timestamp': '2025-11-29T07:04:48.148733', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b140e1e4-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': 'bcc1632af3daff3a5900ba6d51e283cbaadd948c2a379ce6c8cb7da769f39846'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 
'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-sda', 'timestamp': '2025-11-29T07:04:48.148733', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b140ee1e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '442bdca2ee44cf1826a6d97e0697d41487aff9f2b189edcb435d0ca512dc00f3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:04:48.148733', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b140fb2a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': '7fea18e4506ae94878ee988fc915ba9e53b57f258c6474631af039cb95b57ec8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:04:48.148733', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1410728-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': '6f8441dbf24f4f8c69a3958185ecad5af9dad67555f3fc6cf6491d4bda4ae369'}]}, 'timestamp': '2025-11-29 07:04:48.150024', '_unique_id': '7529395c8cad4be6beb9cf5b508ecb98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.152 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.152 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.153 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.153 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c439adca-714b-4bc0-8c47-ac70f81cf602', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-vda', 'timestamp': '2025-11-29T07:04:48.152304', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1416e84-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.985711839, 'message_signature': '9b9a7badc3b2cfbbf6b5444374d31b1ebe5c7a108b2a23ca17baf4e421aeae44'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-sda', 'timestamp': '2025-11-29T07:04:48.152304', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1417bcc-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.985711839, 'message_signature': 'd386e520170493f6aa00fe8b00b88783d394774648c228804500f450c03ec4f3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:04:48.152304', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b141882e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.996816639, 'message_signature': '7d0447c84115062daf362bec91d2a611cd467a606d8bd5211b40341932355b89'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:04:48.152304', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1419562-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.996816639, 'message_signature': '5611fb72033f080b3989aab0f92d388a596e50788e3f24615b54eef04923c96f'}]}, 'timestamp': '2025-11-29 07:04:48.153669', '_unique_id': '995ad4378d9b4032ac11c2122da9de55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.156 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccfd45de-ac40-42a9-a430-5e3b576da1a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:04:48.155987', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'b141fd9a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.011093645, 'message_signature': '6a796fa32d7089a708f15615daddec0d86fe29637cf246584d19c198e6a5a157'}]}, 'timestamp': '2025-11-29 07:04:48.156353', '_unique_id': '48b5e3e3b3c9406b983e4352ef42f133'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.158 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/cpu volume: 12020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.158 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/cpu volume: 6570000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 nova_compute[187152]: 2025-11-29 07:04:48.156 187156 DEBUG nova.network.neutron [req-242bb2d4-14b7-4f55-aec4-fe60ae1388c2 req-04a111cb-a463-4d2f-8977-dcbf9ca3da70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updated VIF entry in instance network info cache for port b7078e73-f0e3-441a-843e-8920e38aec30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:04:48 np0005539504 nova_compute[187152]: 2025-11-29 07:04:48.157 187156 DEBUG nova.network.neutron [req-242bb2d4-14b7-4f55-aec4-fe60ae1388c2 req-04a111cb-a463-4d2f-8977-dcbf9ca3da70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '962bddb1-0c88-4c9c-9471-eddd77115aa6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12020000000, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'timestamp': '2025-11-29T07:04:48.158326', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b1425970-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.032433072, 'message_signature': '6351c828613caf0b7d574e34ea30657b5ac6884bee941f9dace463c9436fa8a6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6570000000, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 
'9223f44a-297e-4db1-9f44-ee0694c4e258', 'timestamp': '2025-11-29T07:04:48.158326', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b142664a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5351.047034266, 'message_signature': '89f8af74ec5aad3a939ee0bbadfdafde18f6a0a94c64dfccfbab3653d4ab90f4'}]}, 'timestamp': '2025-11-29 07:04:48.159013', '_unique_id': 'ab071450d1d34d81bcf247a0316c776e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.161 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.161 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.162 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.162 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd4e8d1b-6e67-4aea-9c19-5cd135cc727f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-vda', 'timestamp': '2025-11-29T07:04:48.161428', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b142d288-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '11f7c93654ebcc2179023de0ac8b8d07f1c0e837831f141dd62f454961d793ba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 
'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-sda', 'timestamp': '2025-11-29T07:04:48.161428', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b142df76-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': 'e077120f3864a04c104c5a1f0b794b2a75c5b952242894679d23585c3232d7a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:04:48.161428', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1430118-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': '195c40b5f6d8b9e65f40e25e6ea98dc28d5617ea495b93b5baed2dd4515383f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:04:48.161428', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1430df2-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': '009e08a32c8254e503d868336ad3a089c4a97a878df5895b35756d51a176d95b'}]}, 'timestamp': '2025-11-29 07:04:48.163316', '_unique_id': 'a610f2daddc6402fa88075002a19b7f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.166 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.read.latency volume: 459936346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.166 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.read.latency volume: 20543713 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.166 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.latency volume: 310029942 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.167 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.latency volume: 597235 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b289890b-f972-46bc-9668-272e4c5fc722', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 459936346, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-vda', 'timestamp': '2025-11-29T07:04:48.166256', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1439074-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '965fbd7c16f229399c9e0035886c78f85ca1606f25660588333349c6caccb7c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20543713, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': 
None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-sda', 'timestamp': '2025-11-29T07:04:48.166256', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1439de4-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '820c38c292ce3c0febd00655234fa636c593d3feede1abb03542c7f333af2f84'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 310029942, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:04:48.166256', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b143a992-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': 'ddee4520746e381bb3d94af41c10bd1a90d768ff0cde83904007f8c21e4cd6d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 597235, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:04:48.166256', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b143b6c6-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': 'c24a5eecfa4c8bac74ea1cb971053b57f964a28d95c7f986b6a2c16a22169810'}]}, 'timestamp': '2025-11-29 07:04:48.167632', '_unique_id': '0968e4c4b4484727b69a1e677a182cf0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.170 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.write.requests volume: 295 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.171 12 DEBUG ceilometer.compute.pollsters [-] 1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.171 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.171 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b21d6132-76de-4f59-93d6-7dd905120d10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 295, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-vda', 'timestamp': '2025-11-29T07:04:48.170803', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14441ae-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '372bca8e9af1ca0e0cbe732d7e43a189550b318b3687a651b588d0bf29329a17'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 
'project_name': None, 'resource_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b-sda', 'timestamp': '2025-11-29T07:04:48.170803', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-2000512745', 'name': 'instance-00000043', 'instance_id': '1c73b1b4-2bab-4451-843e-0f70db66eb9b', 'instance_type': 'm1.nano', 'host': '3b38fbf15db00754a9709a75e822598839c59e5dd9fadc306bcff5aa', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14450f4-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.90353578, 'message_signature': '4144943c874ef809edbeebcee5eb05fc4bef7fc6641b72a3d000f860a510bdd8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:04:48.170803', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1445d6a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': '6dde0ed37ac1087631ba8e6771a018111a72753cd2667917a0db9f44a80ff6c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:04:48.170803', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1446a30-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5350.936333596, 'message_signature': '4508aeefa41a58f50bb801a495a0997c15206d014e0cfb2a71e988fa291f4874'}]}, 'timestamp': '2025-11-29 07:04:48.172227', '_unique_id': 'd63716fc26aa4ae3bd914c5c4e8c7a60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:04:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539504 nova_compute[187152]: 2025-11-29 07:04:48.276 187156 DEBUG oslo_concurrency.lockutils [req-242bb2d4-14b7-4f55-aec4-fe60ae1388c2 req-04a111cb-a463-4d2f-8977-dcbf9ca3da70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:04:48 np0005539504 nova_compute[187152]: 2025-11-29 07:04:48.747 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:48 np0005539504 nova_compute[187152]: 2025-11-29 07:04:48.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:49 np0005539504 nova_compute[187152]: 2025-11-29 07:04:49.797 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:49 np0005539504 nova_compute[187152]: 2025-11-29 07:04:49.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:49 np0005539504 nova_compute[187152]: 2025-11-29 07:04:49.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:52 np0005539504 nova_compute[187152]: 2025-11-29 07:04:52.629 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:52.628 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:04:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:04:52.631 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:04:52 np0005539504 nova_compute[187152]: 2025-11-29 07:04:52.713 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "f4b6bd8b-65a8-48e2-85d9-eb1526069b38" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:52 np0005539504 nova_compute[187152]: 2025-11-29 07:04:52.715 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "f4b6bd8b-65a8-48e2-85d9-eb1526069b38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:52 np0005539504 nova_compute[187152]: 2025-11-29 07:04:52.760 187156 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:04:52 np0005539504 nova_compute[187152]: 2025-11-29 07:04:52.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:52 np0005539504 nova_compute[187152]: 2025-11-29 07:04:52.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:04:52 np0005539504 nova_compute[187152]: 2025-11-29 07:04:52.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.621 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.622 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.634 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.635 187156 INFO nova.compute.claims [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:04:53 np0005539504 podman[224798]: 2025-11-29 07:04:53.723754358 +0000 UTC m=+0.068933294 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.752 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.883 187156 DEBUG nova.compute.provider_tree [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.914 187156 DEBUG nova.scheduler.client.report [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.951 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.964 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "dbeb75d8-a2d5-4f46-8b67-a79ea432d2c8" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.964 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "dbeb75d8-a2d5-4f46-8b67-a79ea432d2c8" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:53 np0005539504 nova_compute[187152]: 2025-11-29 07:04:53.986 187156 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] No node specified, defaulting to compute-1.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.026 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "dbeb75d8-a2d5-4f46-8b67-a79ea432d2c8" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.027 187156 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.117 187156 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.118 187156 DEBUG nova.network.neutron [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.156 187156 INFO nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.184 187156 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.337 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.338 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.338 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.338 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.389 187156 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.390 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.391 187156 INFO nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Creating image(s)#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.392 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "/var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.392 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.393 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.408 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.496 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.497 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.498 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.515 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.587 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.588 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.630 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.632 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.632 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.700 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.701 187156 DEBUG nova.virt.disk.api [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Checking if we can resize image /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.701 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.731 187156 DEBUG nova.network.neutron [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.732 187156 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.769 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.770 187156 DEBUG nova.virt.disk.api [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Cannot resize image /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.770 187156 DEBUG nova.objects.instance [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'migration_context' on Instance uuid f4b6bd8b-65a8-48e2-85d9-eb1526069b38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.789 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.789 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Ensure instance console log exists: /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.790 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.790 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.790 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.791 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.797 187156 WARNING nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.799 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.803 187156 DEBUG nova.virt.libvirt.host [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.803 187156 DEBUG nova.virt.libvirt.host [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.806 187156 DEBUG nova.virt.libvirt.host [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.807 187156 DEBUG nova.virt.libvirt.host [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.808 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.808 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.808 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.809 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.809 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.809 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.809 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.810 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.810 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.810 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.810 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.810 187156 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.814 187156 DEBUG nova.objects.instance [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'pci_devices' on Instance uuid f4b6bd8b-65a8-48e2-85d9-eb1526069b38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.826 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <uuid>f4b6bd8b-65a8-48e2-85d9-eb1526069b38</uuid>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <name>instance-00000048</name>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersOnMultiNodesTest-server-710160406-2</nova:name>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:04:54</nova:creationTime>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:        <nova:user uuid="0c56214d54944034ac2500edac59a239">tempest-ServersOnMultiNodesTest-2086403841-project-member</nova:user>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:        <nova:project uuid="d09f64becda14f30b831bdf7371d586b">tempest-ServersOnMultiNodesTest-2086403841</nova:project>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <entry name="serial">f4b6bd8b-65a8-48e2-85d9-eb1526069b38</entry>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <entry name="uuid">f4b6bd8b-65a8-48e2-85d9-eb1526069b38</entry>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk.config"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/console.log" append="off"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:04:54 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:04:54 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:04:54 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:04:54 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.886 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.886 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:54 np0005539504 nova_compute[187152]: 2025-11-29 07:04:54.887 187156 INFO nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Using config drive#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.054 187156 INFO nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Creating config drive at /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk.config#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.070 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp88l_yvgl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.206 187156 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp88l_yvgl" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:55 np0005539504 systemd-machined[153423]: New machine qemu-36-instance-00000048.
Nov 29 02:04:55 np0005539504 systemd[1]: Started Virtual Machine qemu-36-instance-00000048.
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.741 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399895.7405095, f4b6bd8b-65a8-48e2-85d9-eb1526069b38 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.742 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.747 187156 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.747 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.754 187156 INFO nova.virt.libvirt.driver [-] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Instance spawned successfully.#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.755 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.794 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.802 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.808 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.809 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.810 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.810 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.811 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.812 187156 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.860 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.860 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399895.7457745, f4b6bd8b-65a8-48e2-85d9-eb1526069b38 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.860 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] VM Started (Lifecycle Event)#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.949 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:55 np0005539504 nova_compute[187152]: 2025-11-29 07:04:55.953 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:56 np0005539504 ovn_controller[95182]: 2025-11-29T07:04:56Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:04:56 np0005539504 ovn_controller[95182]: 2025-11-29T07:04:56Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.195 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.344 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.372 187156 INFO nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Took 1.98 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.372 187156 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.443 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.444 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.444 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.444 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.444 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.710 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.713 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.713 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:56 np0005539504 nova_compute[187152]: 2025-11-29 07:04:56.713 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.183 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.211 187156 INFO nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Took 3.65 seconds to build instance.#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.259 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.260 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.329 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.336 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.396 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.398 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.420 187156 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "f4b6bd8b-65a8-48e2-85d9-eb1526069b38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.458 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.464 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.521 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.522 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.575 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.725 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.727 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5351MB free_disk=73.14287948608398GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.727 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.727 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.805 187156 DEBUG nova.compute.manager [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-unplugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.806 187156 DEBUG oslo_concurrency.lockutils [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.806 187156 DEBUG oslo_concurrency.lockutils [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.806 187156 DEBUG oslo_concurrency.lockutils [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.807 187156 DEBUG nova.compute.manager [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-unplugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:04:57 np0005539504 nova_compute[187152]: 2025-11-29 07:04:57.807 187156 WARNING nova.compute.manager [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-unplugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:04:57 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 02:04:57 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 02:04:57 np0005539504 systemd-logind[783]: New session 55 of user nova.
Nov 29 02:04:57 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 02:04:57 np0005539504 podman[224897]: 2025-11-29 07:04:57.97044724 +0000 UTC m=+0.088157793 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 02:04:57 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 02:04:58 np0005539504 systemd[224919]: Queued start job for default target Main User Target.
Nov 29 02:04:58 np0005539504 systemd[224919]: Created slice User Application Slice.
Nov 29 02:04:58 np0005539504 systemd[224919]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:04:58 np0005539504 systemd[224919]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 02:04:58 np0005539504 systemd[224919]: Reached target Paths.
Nov 29 02:04:58 np0005539504 systemd[224919]: Reached target Timers.
Nov 29 02:04:58 np0005539504 systemd[224919]: Starting D-Bus User Message Bus Socket...
Nov 29 02:04:58 np0005539504 systemd[224919]: Starting Create User's Volatile Files and Directories...
Nov 29 02:04:58 np0005539504 systemd[224919]: Finished Create User's Volatile Files and Directories.
Nov 29 02:04:58 np0005539504 systemd[224919]: Listening on D-Bus User Message Bus Socket.
Nov 29 02:04:58 np0005539504 systemd[224919]: Reached target Sockets.
Nov 29 02:04:58 np0005539504 systemd[224919]: Reached target Basic System.
Nov 29 02:04:58 np0005539504 systemd[224919]: Reached target Main User Target.
Nov 29 02:04:58 np0005539504 systemd[224919]: Startup finished in 144ms.
Nov 29 02:04:58 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 02:04:58 np0005539504 systemd[1]: Started Session 55 of User nova.
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.517 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Migration for instance 690daf8f-6151-4de9-85f6-b8a9fe51ea02 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 02:04:58 np0005539504 systemd[1]: session-55.scope: Deactivated successfully.
Nov 29 02:04:58 np0005539504 systemd-logind[783]: Session 55 logged out. Waiting for processes to exit.
Nov 29 02:04:58 np0005539504 systemd-logind[783]: Removed session 55.
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.757 187156 INFO nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating resource usage from migration b2f30ee9-093d-4a50-9511-730851938837#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.757 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Starting to track incoming migration b2f30ee9-093d-4a50-9511-730851938837 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.761 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:58 np0005539504 systemd-logind[783]: New session 57 of user nova.
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.810 187156 WARNING nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 690daf8f-6151-4de9-85f6-b8a9fe51ea02 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.810 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 9223f44a-297e-4db1-9f44-ee0694c4e258 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.810 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 1c73b1b4-2bab-4451-843e-0f70db66eb9b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.810 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance f4b6bd8b-65a8-48e2-85d9-eb1526069b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.810 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.811 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=1088MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:04:58 np0005539504 systemd[1]: Started Session 57 of User nova.
Nov 29 02:04:58 np0005539504 systemd[1]: session-57.scope: Deactivated successfully.
Nov 29 02:04:58 np0005539504 systemd-logind[783]: Session 57 logged out. Waiting for processes to exit.
Nov 29 02:04:58 np0005539504 systemd-logind[783]: Removed session 57.
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.942 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.955 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.994 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:04:58 np0005539504 nova_compute[187152]: 2025-11-29 07:04:58.994 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:59 np0005539504 systemd-logind[783]: New session 58 of user nova.
Nov 29 02:04:59 np0005539504 systemd[1]: Started Session 58 of User nova.
Nov 29 02:04:59 np0005539504 systemd-logind[783]: Session 58 logged out. Waiting for processes to exit.
Nov 29 02:04:59 np0005539504 systemd[1]: session-58.scope: Deactivated successfully.
Nov 29 02:04:59 np0005539504 systemd-logind[783]: Removed session 58.
Nov 29 02:04:59 np0005539504 nova_compute[187152]: 2025-11-29 07:04:59.487 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:59 np0005539504 nova_compute[187152]: 2025-11-29 07:04:59.801 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:59 np0005539504 nova_compute[187152]: 2025-11-29 07:04:59.832 187156 INFO nova.network.neutron [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating port 6a0ff3c3-e368-4504-9884-40716725c901 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:04:59 np0005539504 nova_compute[187152]: 2025-11-29 07:04:59.939 187156 DEBUG nova.compute.manager [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:59 np0005539504 nova_compute[187152]: 2025-11-29 07:04:59.940 187156 DEBUG oslo_concurrency.lockutils [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:59 np0005539504 nova_compute[187152]: 2025-11-29 07:04:59.940 187156 DEBUG oslo_concurrency.lockutils [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:59 np0005539504 nova_compute[187152]: 2025-11-29 07:04:59.941 187156 DEBUG oslo_concurrency.lockutils [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:59 np0005539504 nova_compute[187152]: 2025-11-29 07:04:59.941 187156 DEBUG nova.compute.manager [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:04:59 np0005539504 nova_compute[187152]: 2025-11-29 07:04:59.941 187156 WARNING nova.compute.manager [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:05:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:00.635 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.011 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.012 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.013 187156 DEBUG nova.network.neutron [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.062 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "f4b6bd8b-65a8-48e2-85d9-eb1526069b38" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.063 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "f4b6bd8b-65a8-48e2-85d9-eb1526069b38" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.063 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "f4b6bd8b-65a8-48e2-85d9-eb1526069b38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.064 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "f4b6bd8b-65a8-48e2-85d9-eb1526069b38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.064 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "f4b6bd8b-65a8-48e2-85d9-eb1526069b38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.076 187156 INFO nova.compute.manager [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Terminating instance#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.084 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "refresh_cache-f4b6bd8b-65a8-48e2-85d9-eb1526069b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.085 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquired lock "refresh_cache-f4b6bd8b-65a8-48e2-85d9-eb1526069b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.085 187156 DEBUG nova.network.neutron [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.248 187156 DEBUG nova.network.neutron [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.620 187156 DEBUG nova.network.neutron [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.654 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Releasing lock "refresh_cache-f4b6bd8b-65a8-48e2-85d9-eb1526069b38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.655 187156 DEBUG nova.compute.manager [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:05:01 np0005539504 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000048.scope: Deactivated successfully.
Nov 29 02:05:01 np0005539504 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000048.scope: Consumed 6.287s CPU time.
Nov 29 02:05:01 np0005539504 systemd-machined[153423]: Machine qemu-36-instance-00000048 terminated.
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.926 187156 INFO nova.virt.libvirt.driver [-] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Instance destroyed successfully.#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.927 187156 DEBUG nova.objects.instance [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'resources' on Instance uuid f4b6bd8b-65a8-48e2-85d9-eb1526069b38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.947 187156 INFO nova.virt.libvirt.driver [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Deleting instance files /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38_del#033[00m
Nov 29 02:05:01 np0005539504 nova_compute[187152]: 2025-11-29 07:05:01.948 187156 INFO nova.virt.libvirt.driver [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Deletion of /var/lib/nova/instances/f4b6bd8b-65a8-48e2-85d9-eb1526069b38_del complete#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.082 187156 INFO nova.compute.manager [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.082 187156 DEBUG oslo.service.loopingcall [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.083 187156 DEBUG nova.compute.manager [-] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.083 187156 DEBUG nova.network.neutron [-] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.120 187156 DEBUG nova.compute.manager [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-changed-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.121 187156 DEBUG nova.compute.manager [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Refreshing instance network info cache due to event network-changed-6a0ff3c3-e368-4504-9884-40716725c901. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.121 187156 DEBUG oslo_concurrency.lockutils [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.239 187156 DEBUG nova.network.neutron [-] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.259 187156 DEBUG nova.network.neutron [-] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.277 187156 INFO nova.compute.manager [-] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Took 0.19 seconds to deallocate network for instance.#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.387 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.388 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.491 187156 DEBUG nova.network.neutron [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.511 187156 DEBUG nova.compute.provider_tree [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.563 187156 DEBUG nova.scheduler.client.report [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.569 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.573 187156 DEBUG oslo_concurrency.lockutils [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.574 187156 DEBUG nova.network.neutron [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Refreshing network info cache for port 6a0ff3c3-e368-4504-9884-40716725c901 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.614 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.684 187156 INFO nova.scheduler.client.report [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Deleted allocations for instance f4b6bd8b-65a8-48e2-85d9-eb1526069b38#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.760 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.761 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.762 187156 INFO nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Creating image(s)#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.762 187156 DEBUG nova.objects.instance [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.791 187156 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.835 187156 DEBUG oslo_concurrency.lockutils [None req-721f8dc2-e048-454d-8bbb-1da5a4ab367e 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "f4b6bd8b-65a8-48e2-85d9-eb1526069b38" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.772s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.857 187156 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.857 187156 DEBUG nova.virt.disk.api [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Checking if we can resize image /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.858 187156 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.921 187156 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.922 187156 DEBUG nova.virt.disk.api [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Cannot resize image /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.942 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.943 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Ensure instance console log exists: /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.943 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.944 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.944 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.947 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Start _get_guest_xml network_info=[{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1503104692-network", "vif_mac": "fa:16:3e:15:c1:31"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.952 187156 WARNING nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.956 187156 DEBUG nova.virt.libvirt.host [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.957 187156 DEBUG nova.virt.libvirt.host [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.960 187156 DEBUG nova.virt.libvirt.host [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.960 187156 DEBUG nova.virt.libvirt.host [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.962 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.962 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.962 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.962 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.962 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.963 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.963 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.963 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.963 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.963 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.963 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.964 187156 DEBUG nova.virt.hardware [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
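The topology search logged above (1 vCPU, per-axis limits of 65536, one resulting `VirtCPUTopology(cores=1,sockets=1,threads=1)`) can be sketched as an enumeration of factorizations of the vCPU count. This is a simplified reconstruction of what `nova.virt.hardware._get_possible_cpu_topologies` does, not the actual implementation; the function name and limits here are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VirtCPUTopology:
    sockets: int
    cores: int
    threads: int

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate (sockets, cores, threads) combinations whose product
    equals the vCPU count, within the given per-axis limits -- a
    simplified sketch of nova's topology search, not its real code."""
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue
        for c in range(1, min(vcpus // s, max_cores) + 1):
            if (vcpus // s) % c:
                continue
            t = vcpus // (s * c)
            if t <= max_threads:
                topos.append(VirtCPUTopology(s, c, t))
    return topos
```

For the 1-vCPU flavor in this log the only candidate is `sockets=1, cores=1, threads=1`, matching the "Got 1 possible topologies" line.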
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.964 187156 DEBUG nova.objects.instance [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:02 np0005539504 nova_compute[187152]: 2025-11-29 07:05:02.987 187156 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.045 187156 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
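The `qemu-img info ... --output=json` probe above returns a JSON document that the driver parses to learn the backing format of `disk.config`. A minimal sketch of consuming that output follows; the sample values are illustrative, not taken from this instance.

```python
import json

# Sample output in the shape produced by `qemu-img info --output=json`.
# The key names (virtual-size, filename, format, actual-size) are the
# ones qemu-img emits; the values here are made up for illustration.
sample = '''{
  "virtual-size": 1073741824,
  "filename": "disk.config",
  "format": "raw",
  "actual-size": 475136
}'''

info = json.loads(sample)
fmt = info["format"]                          # e.g. "raw" or "qcow2"
virtual_mb = info["virtual-size"] // (1024 * 1024)
```

The resolved format is what `resolve_driver_format` then persists into the `disk.info` file under the lock seen in the following lines.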
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.046 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.046 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.047 187156 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.048 187156 DEBUG nova.virt.libvirt.vif [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-660939301',display_name='tempest-DeleteServersTestJSON-server-660939301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-660939301',id=63,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-0d65al9n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_v
ideo_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:04:59Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=690daf8f-6151-4de9-85f6-b8a9fe51ea02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1503104692-network", "vif_mac": "fa:16:3e:15:c1:31"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.048 187156 DEBUG nova.network.os_vif_util [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1503104692-network", "vif_mac": "fa:16:3e:15:c1:31"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.049 187156 DEBUG nova.network.os_vif_util [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.051 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <uuid>690daf8f-6151-4de9-85f6-b8a9fe51ea02</uuid>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <name>instance-0000003f</name>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <memory>196608</memory>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <nova:name>tempest-DeleteServersTestJSON-server-660939301</nova:name>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:05:02</nova:creationTime>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.micro">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:        <nova:memory>192</nova:memory>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:        <nova:user uuid="4ecd161098b5422084003b39f0504a8f">tempest-DeleteServersTestJSON-1973671383-project-member</nova:user>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:        <nova:project uuid="98df116965b74e4a9985049062e65162">tempest-DeleteServersTestJSON-1973671383</nova:project>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:        <nova:port uuid="6a0ff3c3-e368-4504-9884-40716725c901">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <entry name="serial">690daf8f-6151-4de9-85f6-b8a9fe51ea02</entry>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <entry name="uuid">690daf8f-6151-4de9-85f6-b8a9fe51ea02</entry>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:15:c1:31"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <target dev="tap6a0ff3c3-e3"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/console.log" append="off"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:05:03 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:05:03 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:05:03 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:05:03 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
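The guest XML dumped above embeds Nova metadata under the `http://openstack.org/xmlns/libvirt/nova/1.1` namespace. Reading values back out of such a document requires passing that namespace mapping to the parser; the snippet below does so against a trimmed copy of the logged XML using only the standard library.

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the guest XML logged above -- just enough to show
# how the nova metadata namespace is addressed when parsing.
XML = """<domain type="kvm">
  <uuid>690daf8f-6151-4de9-85f6-b8a9fe51ea02</uuid>
  <name>instance-0000003f</name>
  <memory>196608</memory>
  <vcpu>1</vcpu>
  <metadata>
    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
      <nova:name>tempest-DeleteServersTestJSON-server-660939301</nova:name>
    </nova:instance>
  </metadata>
</domain>"""

NOVA_NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

root = ET.fromstring(XML)
uuid = root.findtext("uuid")
memory_kib = int(root.findtext("memory"))          # libvirt memory is in KiB
display_name = root.find("metadata/nova:instance/nova:name", NOVA_NS).text
```

Note the `<memory>` value of 196608 KiB corresponds to the flavor's 192 MiB.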
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.052 187156 DEBUG nova.virt.libvirt.vif [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-660939301',display_name='tempest-DeleteServersTestJSON-server-660939301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-660939301',id=63,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-0d65al9n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_v
ideo_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:04:59Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=690daf8f-6151-4de9-85f6-b8a9fe51ea02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1503104692-network", "vif_mac": "fa:16:3e:15:c1:31"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.053 187156 DEBUG nova.network.os_vif_util [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1503104692-network", "vif_mac": "fa:16:3e:15:c1:31"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.053 187156 DEBUG nova.network.os_vif_util [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.053 187156 DEBUG os_vif [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.054 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.054 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.055 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.058 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.058 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a0ff3c3-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.058 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6a0ff3c3-e3, col_values=(('external_ids', {'iface-id': '6a0ff3c3-e368-4504-9884-40716725c901', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:c1:31', 'vm-uuid': '690daf8f-6151-4de9-85f6-b8a9fe51ea02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:03 np0005539504 NetworkManager[55210]: <info>  [1764399903.0617] manager: (tap6a0ff3c3-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.061 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.067 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.068 187156 INFO os_vif [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3')#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.212 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.212 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.213 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No VIF found with MAC fa:16:3e:15:c1:31, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.213 187156 INFO nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Using config drive#033[00m
Nov 29 02:05:03 np0005539504 kernel: tap6a0ff3c3-e3: entered promiscuous mode
Nov 29 02:05:03 np0005539504 NetworkManager[55210]: <info>  [1764399903.2699] manager: (tap6a0ff3c3-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Nov 29 02:05:03 np0005539504 systemd-udevd[224947]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:05:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:03Z|00238|binding|INFO|Claiming lport 6a0ff3c3-e368-4504-9884-40716725c901 for this chassis.
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.270 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:03Z|00239|binding|INFO|6a0ff3c3-e368-4504-9884-40716725c901: Claiming fa:16:3e:15:c1:31 10.100.0.14
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.284 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:03Z|00240|binding|INFO|Setting lport 6a0ff3c3-e368-4504-9884-40716725c901 ovn-installed in OVS
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.286 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:03 np0005539504 NetworkManager[55210]: <info>  [1764399903.2918] device (tap6a0ff3c3-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:05:03 np0005539504 NetworkManager[55210]: <info>  [1764399903.2927] device (tap6a0ff3c3-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:05:03 np0005539504 systemd-machined[153423]: New machine qemu-37-instance-0000003f.
Nov 29 02:05:03 np0005539504 systemd[1]: Started Virtual Machine qemu-37-instance-0000003f.
Nov 29 02:05:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:03Z|00241|binding|INFO|Setting lport 6a0ff3c3-e368-4504-9884-40716725c901 up in Southbound
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.558 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:c1:31 10.100.0.14'], port_security=['fa:16:3e:15:c1:31 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '6', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6a0ff3c3-e368-4504-9884-40716725c901) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.560 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6a0ff3c3-e368-4504-9884-40716725c901 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd bound to our chassis#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.562 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.572 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[05b33f44-85dc-4f9d-9b67-6384973e1199]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.573 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd9eb57e-b1 in ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.576 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd9eb57e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.576 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[500e306b-6142-42d5-8c68-e8b174a5c545]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.577 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4954dcb5-d9c8-484b-8e5b-43beab880707]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.589 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[42eb9d88-5de4-4b2d-b644-dfc5d1f4a3a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.611 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[488336c4-9337-4ec2-a34a-9fd443012ee2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.641 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9245a1-20fd-499f-b6c1-b5d0624865ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.647 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2e855a-41dc-438e-8b60-81419c97037b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 NetworkManager[55210]: <info>  [1764399903.6485] manager: (tapfd9eb57e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/118)
Nov 29 02:05:03 np0005539504 systemd-udevd[224981]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.678 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[115932c8-67c9-44ae-a96d-d72e99a946f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.681 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd5f174-0077-4c5e-b6ab-3ec2853709d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 NetworkManager[55210]: <info>  [1764399903.7022] device (tapfd9eb57e-b0): carrier: link connected
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.707 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[efc71b92-0b53-4826-b3ed-3177d02c4eac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.725 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc7a38a-328f-4e6f-bf23-26d5701316c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536658, 'reachable_time': 31745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225014, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.748 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6513e9-fb87-419f-a7cf-e115faccd21a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:80ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536658, 'tstamp': 536658}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225015, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.765 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a2a231fe-8eb2-4233-bcff-4ff5c3ec62ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536658, 'reachable_time': 31745, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225016, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.797 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8cae91b1-cde4-4f21-8baa-59bbb1e3f8c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.871 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[96f57238-77ba-4a5d-ada8-8ca59e86271e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.873 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.873 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.873 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd9eb57e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:03 np0005539504 NetworkManager[55210]: <info>  [1764399903.8760] manager: (tapfd9eb57e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.875 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:03 np0005539504 kernel: tapfd9eb57e-b0: entered promiscuous mode
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.878 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd9eb57e-b0, col_values=(('external_ids', {'iface-id': 'e7b4cb4f-cb6d-4f0e-8c8d-34c743671595'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.879 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.881 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:03Z|00242|binding|INFO|Releasing lport e7b4cb4f-cb6d-4f0e-8c8d-34c743671595 from this chassis (sb_readonly=0)
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.882 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.883 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4da1326e-dfab-45f7-b364-110dada283ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.884 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:05:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:03.884 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'env', 'PROCESS_TAG=haproxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:05:03 np0005539504 nova_compute[187152]: 2025-11-29 07:05:03.895 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.174 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399904.1739755, 690daf8f-6151-4de9-85f6-b8a9fe51ea02 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.175 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.179 187156 DEBUG nova.compute.manager [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.183 187156 INFO nova.virt.libvirt.driver [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance running successfully.#033[00m
Nov 29 02:05:04 np0005539504 virtqemud[186569]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.186 187156 DEBUG nova.virt.libvirt.guest [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.186 187156 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.331 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.332 187156 DEBUG nova.network.neutron [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updated VIF entry in instance network info cache for port 6a0ff3c3-e368-4504-9884-40716725c901. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.333 187156 DEBUG nova.network.neutron [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.338 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:05:04 np0005539504 podman[225053]: 2025-11-29 07:05:04.276340975 +0000 UTC m=+0.023400873 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:05:04 np0005539504 podman[225053]: 2025-11-29 07:05:04.473401349 +0000 UTC m=+0.220461227 container create 642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.503 187156 DEBUG oslo_concurrency.lockutils [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.504 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.505 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399904.1780016, 690daf8f-6151-4de9-85f6-b8a9fe51ea02 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.505 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] VM Started (Lifecycle Event)#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.612 187156 DEBUG nova.compute.manager [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.613 187156 DEBUG oslo_concurrency.lockutils [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.613 187156 DEBUG oslo_concurrency.lockutils [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.613 187156 DEBUG oslo_concurrency.lockutils [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.614 187156 DEBUG nova.compute.manager [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.614 187156 WARNING nova.compute.manager [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state active and task_state resize_finish.#033[00m
Nov 29 02:05:04 np0005539504 systemd[1]: Started libpod-conmon-642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31.scope.
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.637 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.642 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:05:04 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:05:04 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67d49ebd02f26b67d16487cb47a60427211ed78f9160a81ca14b45753930afe1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:05:04 np0005539504 podman[225053]: 2025-11-29 07:05:04.681151802 +0000 UTC m=+0.428211700 container init 642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:05:04 np0005539504 podman[225053]: 2025-11-29 07:05:04.688526261 +0000 UTC m=+0.435586139 container start 642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:05:04 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[225068]: [NOTICE]   (225072) : New worker (225074) forked
Nov 29 02:05:04 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[225068]: [NOTICE]   (225072) : Loading success.
Nov 29 02:05:04 np0005539504 nova_compute[187152]: 2025-11-29 07:05:04.803 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:06 np0005539504 nova_compute[187152]: 2025-11-29 07:05:06.724 187156 DEBUG nova.compute.manager [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:06 np0005539504 nova_compute[187152]: 2025-11-29 07:05:06.726 187156 DEBUG oslo_concurrency.lockutils [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:06 np0005539504 nova_compute[187152]: 2025-11-29 07:05:06.726 187156 DEBUG oslo_concurrency.lockutils [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:06 np0005539504 nova_compute[187152]: 2025-11-29 07:05:06.726 187156 DEBUG oslo_concurrency.lockutils [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:06 np0005539504 nova_compute[187152]: 2025-11-29 07:05:06.727 187156 DEBUG nova.compute.manager [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:06 np0005539504 nova_compute[187152]: 2025-11-29 07:05:06.727 187156 WARNING nova.compute.manager [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state resized and task_state deleting.#033[00m
Nov 29 02:05:07 np0005539504 podman[225084]: 2025-11-29 07:05:07.726544299 +0000 UTC m=+0.064460433 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:05:08 np0005539504 nova_compute[187152]: 2025-11-29 07:05:08.073 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:09 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 02:05:09 np0005539504 systemd[224919]: Activating special unit Exit the Session...
Nov 29 02:05:09 np0005539504 systemd[224919]: Stopped target Main User Target.
Nov 29 02:05:09 np0005539504 systemd[224919]: Stopped target Basic System.
Nov 29 02:05:09 np0005539504 systemd[224919]: Stopped target Paths.
Nov 29 02:05:09 np0005539504 systemd[224919]: Stopped target Sockets.
Nov 29 02:05:09 np0005539504 systemd[224919]: Stopped target Timers.
Nov 29 02:05:09 np0005539504 systemd[224919]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:05:09 np0005539504 systemd[224919]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 02:05:09 np0005539504 systemd[224919]: Closed D-Bus User Message Bus Socket.
Nov 29 02:05:09 np0005539504 systemd[224919]: Stopped Create User's Volatile Files and Directories.
Nov 29 02:05:09 np0005539504 systemd[224919]: Removed slice User Application Slice.
Nov 29 02:05:09 np0005539504 systemd[224919]: Reached target Shutdown.
Nov 29 02:05:09 np0005539504 systemd[224919]: Finished Exit the Session.
Nov 29 02:05:09 np0005539504 systemd[224919]: Reached target Exit the Session.
Nov 29 02:05:09 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 02:05:09 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 02:05:09 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 02:05:09 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 02:05:09 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 02:05:09 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 02:05:09 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 02:05:09 np0005539504 nova_compute[187152]: 2025-11-29 07:05:09.806 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:10 np0005539504 nova_compute[187152]: 2025-11-29 07:05:10.860 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "1c73b1b4-2bab-4451-843e-0f70db66eb9b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:10 np0005539504 nova_compute[187152]: 2025-11-29 07:05:10.861 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "1c73b1b4-2bab-4451-843e-0f70db66eb9b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:10 np0005539504 nova_compute[187152]: 2025-11-29 07:05:10.862 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "1c73b1b4-2bab-4451-843e-0f70db66eb9b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:10 np0005539504 nova_compute[187152]: 2025-11-29 07:05:10.863 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "1c73b1b4-2bab-4451-843e-0f70db66eb9b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:10 np0005539504 nova_compute[187152]: 2025-11-29 07:05:10.863 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "1c73b1b4-2bab-4451-843e-0f70db66eb9b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:10 np0005539504 nova_compute[187152]: 2025-11-29 07:05:10.878 187156 INFO nova.compute.manager [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Terminating instance#033[00m
Nov 29 02:05:10 np0005539504 nova_compute[187152]: 2025-11-29 07:05:10.897 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "refresh_cache-1c73b1b4-2bab-4451-843e-0f70db66eb9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:10 np0005539504 nova_compute[187152]: 2025-11-29 07:05:10.898 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquired lock "refresh_cache-1c73b1b4-2bab-4451-843e-0f70db66eb9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:10 np0005539504 nova_compute[187152]: 2025-11-29 07:05:10.898 187156 DEBUG nova.network.neutron [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:05:11 np0005539504 nova_compute[187152]: 2025-11-29 07:05:11.074 187156 DEBUG nova.network.neutron [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:05:11 np0005539504 nova_compute[187152]: 2025-11-29 07:05:11.263 187156 DEBUG nova.network.neutron [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:11 np0005539504 nova_compute[187152]: 2025-11-29 07:05:11.281 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Releasing lock "refresh_cache-1c73b1b4-2bab-4451-843e-0f70db66eb9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:11 np0005539504 nova_compute[187152]: 2025-11-29 07:05:11.282 187156 DEBUG nova.compute.manager [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:05:11 np0005539504 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000043.scope: Deactivated successfully.
Nov 29 02:05:11 np0005539504 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000043.scope: Consumed 14.347s CPU time.
Nov 29 02:05:11 np0005539504 systemd-machined[153423]: Machine qemu-34-instance-00000043 terminated.
Nov 29 02:05:11 np0005539504 podman[225105]: 2025-11-29 07:05:11.421656079 +0000 UTC m=+0.094618287 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:05:11 np0005539504 podman[225106]: 2025-11-29 07:05:11.433044207 +0000 UTC m=+0.082303535 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:05:11 np0005539504 nova_compute[187152]: 2025-11-29 07:05:11.540 187156 INFO nova.virt.libvirt.driver [-] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Instance destroyed successfully.#033[00m
Nov 29 02:05:11 np0005539504 nova_compute[187152]: 2025-11-29 07:05:11.540 187156 DEBUG nova.objects.instance [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'resources' on Instance uuid 1c73b1b4-2bab-4451-843e-0f70db66eb9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:13 np0005539504 nova_compute[187152]: 2025-11-29 07:05:13.076 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:14 np0005539504 nova_compute[187152]: 2025-11-29 07:05:14.044 187156 INFO nova.virt.libvirt.driver [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Deleting instance files /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b_del#033[00m
Nov 29 02:05:14 np0005539504 nova_compute[187152]: 2025-11-29 07:05:14.046 187156 INFO nova.virt.libvirt.driver [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Deletion of /var/lib/nova/instances/1c73b1b4-2bab-4451-843e-0f70db66eb9b_del complete#033[00m
Nov 29 02:05:14 np0005539504 nova_compute[187152]: 2025-11-29 07:05:14.807 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:15 np0005539504 nova_compute[187152]: 2025-11-29 07:05:15.001 187156 INFO nova.compute.manager [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Took 3.72 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:05:15 np0005539504 nova_compute[187152]: 2025-11-29 07:05:15.002 187156 DEBUG oslo.service.loopingcall [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:05:15 np0005539504 nova_compute[187152]: 2025-11-29 07:05:15.002 187156 DEBUG nova.compute.manager [-] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:05:15 np0005539504 nova_compute[187152]: 2025-11-29 07:05:15.002 187156 DEBUG nova.network.neutron [-] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:05:15 np0005539504 nova_compute[187152]: 2025-11-29 07:05:15.147 187156 DEBUG nova.network.neutron [-] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:05:15 np0005539504 nova_compute[187152]: 2025-11-29 07:05:15.369 187156 DEBUG nova.network.neutron [-] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:15 np0005539504 nova_compute[187152]: 2025-11-29 07:05:15.476 187156 INFO nova.compute.manager [-] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Took 0.47 seconds to deallocate network for instance.#033[00m
Nov 29 02:05:16 np0005539504 nova_compute[187152]: 2025-11-29 07:05:16.297 187156 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:16 np0005539504 nova_compute[187152]: 2025-11-29 07:05:16.298 187156 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:16 np0005539504 nova_compute[187152]: 2025-11-29 07:05:16.298 187156 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:16 np0005539504 nova_compute[187152]: 2025-11-29 07:05:16.299 187156 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:16 np0005539504 nova_compute[187152]: 2025-11-29 07:05:16.299 187156 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:16 np0005539504 podman[225173]: 2025-11-29 07:05:16.722973764 +0000 UTC m=+0.060733651 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:05:16 np0005539504 podman[225174]: 2025-11-29 07:05:16.757814466 +0000 UTC m=+0.090690001 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 02:05:16 np0005539504 nova_compute[187152]: 2025-11-29 07:05:16.925 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399901.9236279, f4b6bd8b-65a8-48e2-85d9-eb1526069b38 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:16 np0005539504 nova_compute[187152]: 2025-11-29 07:05:16.926 187156 INFO nova.compute.manager [-] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.043 187156 DEBUG nova.compute.manager [None req-a4d921cc-f265-4cfb-9630-ec1dd255b84f - - - - - -] [instance: f4b6bd8b-65a8-48e2-85d9-eb1526069b38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.182 187156 INFO nova.compute.manager [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Terminating instance#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.203 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.204 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.289 187156 DEBUG nova.compute.provider_tree [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.312 187156 DEBUG nova.compute.manager [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.319 187156 DEBUG nova.scheduler.client.report [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:17 np0005539504 kernel: tap6a0ff3c3-e3 (unregistering): left promiscuous mode
Nov 29 02:05:17 np0005539504 NetworkManager[55210]: <info>  [1764399917.3362] device (tap6a0ff3c3-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.370 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:17Z|00243|binding|INFO|Releasing lport 6a0ff3c3-e368-4504-9884-40716725c901 from this chassis (sb_readonly=0)
Nov 29 02:05:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:17Z|00244|binding|INFO|Setting lport 6a0ff3c3-e368-4504-9884-40716725c901 down in Southbound
Nov 29 02:05:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:17Z|00245|binding|INFO|Removing iface tap6a0ff3c3-e3 ovn-installed in OVS
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.372 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.384 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:17 np0005539504 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Nov 29 02:05:17 np0005539504 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000003f.scope: Consumed 13.089s CPU time.
Nov 29 02:05:17 np0005539504 systemd-machined[153423]: Machine qemu-37-instance-0000003f terminated.
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.597 187156 INFO nova.virt.libvirt.driver [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance destroyed successfully.#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.598 187156 DEBUG nova.objects.instance [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'resources' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:17 np0005539504 nova_compute[187152]: 2025-11-29 07:05:17.959 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.755s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.020 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:c1:31 10.100.0.14'], port_security=['fa:16:3e:15:c1:31 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '8', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6a0ff3c3-e368-4504-9884-40716725c901) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.021 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6a0ff3c3-e368-4504-9884-40716725c901 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd unbound from our chassis#033[00m
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.022 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.025 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[083ef83e-1b9d-4468-b017-2e9ffae9361d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.026 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace which is not needed anymore#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.041 187156 INFO nova.scheduler.client.report [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Deleted allocations for instance 1c73b1b4-2bab-4451-843e-0f70db66eb9b#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.070 187156 DEBUG nova.virt.libvirt.vif [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-660939301',display_name='tempest-DeleteServersTestJSON-server-660939301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-660939301',id=63,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:05:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-0d65al9n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:05:04Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=690daf8f-6151-4de9-85f6-b8a9fe51ea02,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.070 187156 DEBUG nova.network.os_vif_util [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.071 187156 DEBUG nova.network.os_vif_util [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.071 187156 DEBUG os_vif [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.074 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.074 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a0ff3c3-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.078 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.084 187156 INFO os_vif [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3')#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.085 187156 INFO nova.virt.libvirt.driver [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Deleting instance files /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_del#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.092 187156 INFO nova.virt.libvirt.driver [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Deletion of /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_del complete#033[00m
Nov 29 02:05:18 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[225068]: [NOTICE]   (225072) : haproxy version is 2.8.14-c23fe91
Nov 29 02:05:18 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[225068]: [NOTICE]   (225072) : path to executable is /usr/sbin/haproxy
Nov 29 02:05:18 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[225068]: [WARNING]  (225072) : Exiting Master process...
Nov 29 02:05:18 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[225068]: [WARNING]  (225072) : Exiting Master process...
Nov 29 02:05:18 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[225068]: [ALERT]    (225072) : Current worker (225074) exited with code 143 (Terminated)
Nov 29 02:05:18 np0005539504 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[225068]: [WARNING]  (225072) : All workers exited. Exiting... (0)
Nov 29 02:05:18 np0005539504 systemd[1]: libpod-642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31.scope: Deactivated successfully.
Nov 29 02:05:18 np0005539504 podman[225262]: 2025-11-29 07:05:18.171899771 +0000 UTC m=+0.048118991 container died 642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.201 187156 DEBUG oslo_concurrency.lockutils [None req-00cc7d83-24e0-4742-b3b7-e7eed03cb9c8 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "1c73b1b4-2bab-4451-843e-0f70db66eb9b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:18 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31-userdata-shm.mount: Deactivated successfully.
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.206 187156 INFO nova.compute.manager [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Took 0.89 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.207 187156 DEBUG oslo.service.loopingcall [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:05:18 np0005539504 systemd[1]: var-lib-containers-storage-overlay-67d49ebd02f26b67d16487cb47a60427211ed78f9160a81ca14b45753930afe1-merged.mount: Deactivated successfully.
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.207 187156 DEBUG nova.compute.manager [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.207 187156 DEBUG nova.network.neutron [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:05:18 np0005539504 podman[225262]: 2025-11-29 07:05:18.226138666 +0000 UTC m=+0.102357876 container cleanup 642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:05:18 np0005539504 systemd[1]: libpod-conmon-642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31.scope: Deactivated successfully.
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.721 187156 DEBUG nova.compute.manager [req-103aeac0-aa15-4300-8fcb-a8b60b57c600 req-084d56cb-f8af-4635-9777-b7e743241738 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-unplugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.721 187156 DEBUG oslo_concurrency.lockutils [req-103aeac0-aa15-4300-8fcb-a8b60b57c600 req-084d56cb-f8af-4635-9777-b7e743241738 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.721 187156 DEBUG oslo_concurrency.lockutils [req-103aeac0-aa15-4300-8fcb-a8b60b57c600 req-084d56cb-f8af-4635-9777-b7e743241738 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.722 187156 DEBUG oslo_concurrency.lockutils [req-103aeac0-aa15-4300-8fcb-a8b60b57c600 req-084d56cb-f8af-4635-9777-b7e743241738 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.722 187156 DEBUG nova.compute.manager [req-103aeac0-aa15-4300-8fcb-a8b60b57c600 req-084d56cb-f8af-4635-9777-b7e743241738 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-unplugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.722 187156 WARNING nova.compute.manager [req-103aeac0-aa15-4300-8fcb-a8b60b57c600 req-084d56cb-f8af-4635-9777-b7e743241738 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-unplugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state active and task_state None.
Nov 29 02:05:18 np0005539504 podman[225293]: 2025-11-29 07:05:18.755820886 +0000 UTC m=+0.511009697 container remove 642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.763 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2d89c4-5558-45d6-a69c-d4ef81439f33]: (4, ('Sat Nov 29 07:05:18 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31)\n642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31\nSat Nov 29 07:05:18 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31)\n642a62e692318d7466907bbda44e7eec18f879d647f076635f2ba31e4fe08d31\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.765 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a84f5018-e9ae-4958-9d59-22877e280cd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.766 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:05:18 np0005539504 kernel: tapfd9eb57e-b0: left promiscuous mode
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.768 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:18 np0005539504 nova_compute[187152]: 2025-11-29 07:05:18.780 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.783 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[24216f99-3530-46b6-a255-5edba8d80ec2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.797 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2659b6b8-d93d-450a-8567-ffa926f1a75d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.799 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f17071-1c15-49c0-be0b-2750a84198fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.812 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8e0dd1-4659-459f-808b-c6ea943e6d5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536651, 'reachable_time': 22193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225309, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:05:18 np0005539504 systemd[1]: run-netns-ovnmeta\x2dfd9eb57e\x2db1f8\x2d4bae\x2da60f\x2d8e40613556cd.mount: Deactivated successfully.
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.817 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 02:05:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:18.817 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[51acf612-3c71-4033-bb41-7528464be0c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.028 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.028 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.059 187156 DEBUG nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.180 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.181 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.190 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.191 187156 INFO nova.compute.claims [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Claim successful on node compute-1.ctlplane.example.com
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.340 187156 DEBUG nova.compute.provider_tree [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.356 187156 DEBUG nova.scheduler.client.report [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.386 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.387 187156 DEBUG nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.407 187156 DEBUG nova.network.neutron [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.620 187156 DEBUG nova.compute.manager [req-79255c9b-a84d-4a90-86f5-9b76c92b205c req-ec585853-3f90-4e6d-a69e-604a4c5658b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-deleted-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.621 187156 INFO nova.compute.manager [req-79255c9b-a84d-4a90-86f5-9b76c92b205c req-ec585853-3f90-4e6d-a69e-604a4c5658b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Neutron deleted interface 6a0ff3c3-e368-4504-9884-40716725c901; detaching it from the instance and deleting it from the info cache
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.621 187156 DEBUG nova.network.neutron [req-79255c9b-a84d-4a90-86f5-9b76c92b205c req-ec585853-3f90-4e6d-a69e-604a4c5658b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.634 187156 INFO nova.compute.manager [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Took 1.43 seconds to deallocate network for instance.
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.644 187156 DEBUG nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.644 187156 DEBUG nova.network.neutron [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.650 187156 DEBUG nova.compute.manager [req-79255c9b-a84d-4a90-86f5-9b76c92b205c req-ec585853-3f90-4e6d-a69e-604a4c5658b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Detach interface failed, port_id=6a0ff3c3-e368-4504-9884-40716725c901, reason: Instance 690daf8f-6151-4de9-85f6-b8a9fe51ea02 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.737 187156 INFO nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.766 187156 DEBUG nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.804 187156 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.805 187156 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.809 187156 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:19 np0005539504 nova_compute[187152]: 2025-11-29 07:05:19.812 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.200 187156 DEBUG nova.policy [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.206 187156 INFO nova.scheduler.client.report [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Deleted allocations for instance 690daf8f-6151-4de9-85f6-b8a9fe51ea02
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.296 187156 DEBUG nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.298 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.299 187156 INFO nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Creating image(s)
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.299 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "/var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.299 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.300 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.318 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.355 187156 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.383 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.384 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.384 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.396 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.456 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.458 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.500 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.501 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.501 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.568 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.569 187156 DEBUG nova.virt.disk.api [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Checking if we can resize image /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.570 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.634 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.636 187156 DEBUG nova.virt.disk.api [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Cannot resize image /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.636 187156 DEBUG nova.objects.instance [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'migration_context' on Instance uuid 8bf08ec4-e207-48d0-a8cc-45ffed50be77 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.667 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.667 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Ensure instance console log exists: /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.668 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.668 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:20 np0005539504 nova_compute[187152]: 2025-11-29 07:05:20.669 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.111 187156 DEBUG oslo_concurrency.lockutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.112 187156 DEBUG oslo_concurrency.lockutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.113 187156 INFO nova.compute.manager [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Rebooting instance#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.197 187156 DEBUG nova.compute.manager [req-be7b0e9c-1dfc-4266-9a6f-01cf2f442180 req-86b685b6-80bf-49ea-ad6e-c072376089f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.198 187156 DEBUG oslo_concurrency.lockutils [req-be7b0e9c-1dfc-4266-9a6f-01cf2f442180 req-86b685b6-80bf-49ea-ad6e-c072376089f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.198 187156 DEBUG oslo_concurrency.lockutils [req-be7b0e9c-1dfc-4266-9a6f-01cf2f442180 req-86b685b6-80bf-49ea-ad6e-c072376089f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.198 187156 DEBUG oslo_concurrency.lockutils [req-be7b0e9c-1dfc-4266-9a6f-01cf2f442180 req-86b685b6-80bf-49ea-ad6e-c072376089f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.199 187156 DEBUG nova.compute.manager [req-be7b0e9c-1dfc-4266-9a6f-01cf2f442180 req-86b685b6-80bf-49ea-ad6e-c072376089f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.199 187156 WARNING nova.compute.manager [req-be7b0e9c-1dfc-4266-9a6f-01cf2f442180 req-86b685b6-80bf-49ea-ad6e-c072376089f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.257 187156 DEBUG oslo_concurrency.lockutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.258 187156 DEBUG oslo_concurrency.lockutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:21 np0005539504 nova_compute[187152]: 2025-11-29 07:05:21.258 187156 DEBUG nova.network.neutron [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:05:22 np0005539504 nova_compute[187152]: 2025-11-29 07:05:22.461 187156 DEBUG nova.network.neutron [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Successfully created port: 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:05:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:22.920 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:22.921 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:22.922 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:23 np0005539504 nova_compute[187152]: 2025-11-29 07:05:23.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:24 np0005539504 nova_compute[187152]: 2025-11-29 07:05:24.559 187156 DEBUG nova.network.neutron [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:24 np0005539504 nova_compute[187152]: 2025-11-29 07:05:24.578 187156 DEBUG oslo_concurrency.lockutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:24 np0005539504 nova_compute[187152]: 2025-11-29 07:05:24.600 187156 DEBUG nova.compute.manager [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:24 np0005539504 podman[225326]: 2025-11-29 07:05:24.744736647 +0000 UTC m=+0.073868986 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:05:24 np0005539504 kernel: tapb7078e73-f0 (unregistering): left promiscuous mode
Nov 29 02:05:24 np0005539504 NetworkManager[55210]: <info>  [1764399924.7925] device (tapb7078e73-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:05:24 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:24Z|00246|binding|INFO|Releasing lport b7078e73-f0e3-441a-843e-8920e38aec30 from this chassis (sb_readonly=0)
Nov 29 02:05:24 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:24Z|00247|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 down in Southbound
Nov 29 02:05:24 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:24Z|00248|binding|INFO|Removing iface tapb7078e73-f0 ovn-installed in OVS
Nov 29 02:05:24 np0005539504 nova_compute[187152]: 2025-11-29 07:05:24.797 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:24.811 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:24.812 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:05:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:24.814 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:05:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:24.815 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5c68364a-0a91-4ab6-8123-0fd8e58f43fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:24.815 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:05:24 np0005539504 nova_compute[187152]: 2025-11-29 07:05:24.826 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:24 np0005539504 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000042.scope: Deactivated successfully.
Nov 29 02:05:24 np0005539504 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000042.scope: Consumed 15.139s CPU time.
Nov 29 02:05:24 np0005539504 systemd-machined[153423]: Machine qemu-35-instance-00000042 terminated.
Nov 29 02:05:24 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[224711]: [NOTICE]   (224715) : haproxy version is 2.8.14-c23fe91
Nov 29 02:05:24 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[224711]: [NOTICE]   (224715) : path to executable is /usr/sbin/haproxy
Nov 29 02:05:24 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[224711]: [WARNING]  (224715) : Exiting Master process...
Nov 29 02:05:24 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[224711]: [WARNING]  (224715) : Exiting Master process...
Nov 29 02:05:24 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[224711]: [ALERT]    (224715) : Current worker (224717) exited with code 143 (Terminated)
Nov 29 02:05:24 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[224711]: [WARNING]  (224715) : All workers exited. Exiting... (0)
Nov 29 02:05:24 np0005539504 systemd[1]: libpod-726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1.scope: Deactivated successfully.
Nov 29 02:05:24 np0005539504 podman[225372]: 2025-11-29 07:05:24.953854767 +0000 UTC m=+0.051430250 container died 726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:05:24 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1-userdata-shm.mount: Deactivated successfully.
Nov 29 02:05:24 np0005539504 systemd[1]: var-lib-containers-storage-overlay-ed3e84c827bf12763a0ef4000f9c3ea49e5320d0ab4626549ce23dfa12a6489f-merged.mount: Deactivated successfully.
Nov 29 02:05:24 np0005539504 podman[225372]: 2025-11-29 07:05:24.990711903 +0000 UTC m=+0.088287346 container cleanup 726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 02:05:25 np0005539504 systemd[1]: libpod-conmon-726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1.scope: Deactivated successfully.
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.020 187156 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance destroyed successfully.#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.020 187156 DEBUG nova.objects.instance [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.031 187156 DEBUG nova.virt.libvirt.vif [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:05:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.032 187156 DEBUG nova.network.os_vif_util [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.033 187156 DEBUG nova.network.os_vif_util [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.033 187156 DEBUG os_vif [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.035 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.035 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7078e73-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.037 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.039 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.043 187156 INFO os_vif [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.052 187156 DEBUG nova.virt.libvirt.driver [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Start _get_guest_xml network_info=[{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:05:25 np0005539504 podman[225414]: 2025-11-29 07:05:25.055953756 +0000 UTC m=+0.040704161 container remove 726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.056 187156 WARNING nova.virt.libvirt.driver [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.060 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2a11ca08-436c-4c27-9df8-5cd508eaeed3]: (4, ('Sat Nov 29 07:05:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1)\n726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1\nSat Nov 29 07:05:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1)\n726dee00560d1480ad623020266f4610110f946881293b5b291bf3afc47641b1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.061 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[17094b64-da37-4843-ad70-05d8c4006bd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.062 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.063 187156 DEBUG nova.virt.libvirt.host [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.064 187156 DEBUG nova.virt.libvirt.host [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.064 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.070 187156 DEBUG nova.virt.libvirt.host [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.070 187156 DEBUG nova.virt.libvirt.host [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.072 187156 DEBUG nova.virt.libvirt.driver [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.072 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.072 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.072 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.073 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.073 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.073 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.073 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.074 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.074 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.074 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.074 187156 DEBUG nova.virt.hardware [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.075 187156 DEBUG nova.objects.instance [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.081 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.084 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac02681-ad99-46ab-913a-0a4d291d6bf0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.091 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.096 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6ccea42d-dc29-4b4f-9c00-e039981d3d5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.098 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3b8bf8-cc4b-4545-9748-0105dbfb35b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.113 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bb39580d-9b4b-49c5-ad76-84ae218517f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534354, 'reachable_time': 25283, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225433, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.115 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.116 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1f5b1d-8ca6-4955-83e0-f36999ea090d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.146 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.147 187156 DEBUG oslo_concurrency.lockutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.147 187156 DEBUG oslo_concurrency.lockutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.148 187156 DEBUG oslo_concurrency.lockutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.149 187156 DEBUG nova.virt.libvirt.vif [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:05:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.150 187156 DEBUG nova.network.os_vif_util [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.150 187156 DEBUG nova.network.os_vif_util [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.151 187156 DEBUG nova.objects.instance [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.164 187156 DEBUG nova.virt.libvirt.driver [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <uuid>9223f44a-297e-4db1-9f44-ee0694c4e258</uuid>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <name>instance-00000042</name>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestJSON-server-664171356</nova:name>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:05:25</nova:creationTime>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:        <nova:port uuid="b7078e73-f0e3-441a-843e-8920e38aec30">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <entry name="serial">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <entry name="uuid">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:1e:a3:23"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <target dev="tapb7078e73-f0"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/console.log" append="off"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:05:25 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:05:25 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:05:25 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:05:25 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.164 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.234 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.235 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.305 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.307 187156 DEBUG nova.objects.instance [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.327 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.353 187156 DEBUG nova.network.neutron [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Successfully updated port: 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.373 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "refresh_cache-8bf08ec4-e207-48d0-a8cc-45ffed50be77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.373 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquired lock "refresh_cache-8bf08ec4-e207-48d0-a8cc-45ffed50be77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.373 187156 DEBUG nova.network.neutron [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.403 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.404 187156 DEBUG nova.virt.disk.api [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.404 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.457 187156 DEBUG oslo_concurrency.processutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.458 187156 DEBUG nova.virt.disk.api [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.459 187156 DEBUG nova.objects.instance [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.476 187156 DEBUG nova.virt.libvirt.vif [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.476 187156 DEBUG nova.network.os_vif_util [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.477 187156 DEBUG nova.network.os_vif_util [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.478 187156 DEBUG os_vif [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.478 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.479 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.479 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.482 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.482 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7078e73-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.483 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7078e73-f0, col_values=(('external_ids', {'iface-id': 'b7078e73-f0e3-441a-843e-8920e38aec30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:a3:23', 'vm-uuid': '9223f44a-297e-4db1-9f44-ee0694c4e258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:25 np0005539504 NetworkManager[55210]: <info>  [1764399925.4861] manager: (tapb7078e73-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.486 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.491 187156 DEBUG nova.compute.manager [req-cdb6311f-efba-48de-b855-d2c533ba14b0 req-c3a5fe9e-d0ab-4496-8f12-5f77245a1e1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Received event network-changed-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.491 187156 DEBUG nova.compute.manager [req-cdb6311f-efba-48de-b855-d2c533ba14b0 req-c3a5fe9e-d0ab-4496-8f12-5f77245a1e1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Refreshing instance network info cache due to event network-changed-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.491 187156 DEBUG oslo_concurrency.lockutils [req-cdb6311f-efba-48de-b855-d2c533ba14b0 req-c3a5fe9e-d0ab-4496-8f12-5f77245a1e1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-8bf08ec4-e207-48d0-a8cc-45ffed50be77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.492 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.493 187156 INFO os_vif [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.510 187156 DEBUG nova.compute.manager [req-844970f7-099b-4c43-ab17-b8d7f446c316 req-579d1f4e-976f-4aff-8cf2-e51820af6ad2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.510 187156 DEBUG oslo_concurrency.lockutils [req-844970f7-099b-4c43-ab17-b8d7f446c316 req-579d1f4e-976f-4aff-8cf2-e51820af6ad2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.511 187156 DEBUG oslo_concurrency.lockutils [req-844970f7-099b-4c43-ab17-b8d7f446c316 req-579d1f4e-976f-4aff-8cf2-e51820af6ad2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.511 187156 DEBUG oslo_concurrency.lockutils [req-844970f7-099b-4c43-ab17-b8d7f446c316 req-579d1f4e-976f-4aff-8cf2-e51820af6ad2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.511 187156 DEBUG nova.compute.manager [req-844970f7-099b-4c43-ab17-b8d7f446c316 req-579d1f4e-976f-4aff-8cf2-e51820af6ad2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.511 187156 WARNING nova.compute.manager [req-844970f7-099b-4c43-ab17-b8d7f446c316 req-579d1f4e-976f-4aff-8cf2-e51820af6ad2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Nov 29 02:05:25 np0005539504 kernel: tapb7078e73-f0: entered promiscuous mode
Nov 29 02:05:25 np0005539504 NetworkManager[55210]: <info>  [1764399925.5777] manager: (tapb7078e73-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/121)
Nov 29 02:05:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:25Z|00249|binding|INFO|Claiming lport b7078e73-f0e3-441a-843e-8920e38aec30 for this chassis.
Nov 29 02:05:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:25Z|00250|binding|INFO|b7078e73-f0e3-441a-843e-8920e38aec30: Claiming fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.579 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 systemd-udevd[225352]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.586 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.587 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.588 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.595 187156 DEBUG nova.network.neutron [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:05:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:25Z|00251|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 ovn-installed in OVS
Nov 29 02:05:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:25Z|00252|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 up in Southbound
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.599 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 NetworkManager[55210]: <info>  [1764399925.6015] device (tapb7078e73-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.601 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[456b8c9d-c201-4def-92e8-212c55b25364]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 NetworkManager[55210]: <info>  [1764399925.6035] device (tapb7078e73-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.603 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.605 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.605 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[56300707-a488-495a-97d5-65debdc925b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.606 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e33a7a-71fb-4434-a0e7-f5bb820a22dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.621 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf49803-4b79-434b-ad86-0ac5292fac6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 systemd-machined[153423]: New machine qemu-38-instance-00000042.
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.639 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c91661d6-43b3-4d7e-a566-78defee3d307]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 systemd[1]: Started Virtual Machine qemu-38-instance-00000042.
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.677 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0f1f2387-63c4-4d3a-9e10-8bba53cf3a43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.682 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c474f662-27e4-4df0-863d-6f17b32cf2f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 NetworkManager[55210]: <info>  [1764399925.6835] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/122)
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.715 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[66620d44-7bb7-4247-b041-8f2089d89909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.718 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[19566846-1bf7-4c49-9647-c70ab5a9280c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 NetworkManager[55210]: <info>  [1764399925.7452] device (tap9226dea3-60): carrier: link connected
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.752 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[417422bf-92d6-4e2a-bb48-f218f116634b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.773 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[46ad05ec-b06d-478a-b887-12a51b905b39]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538862, 'reachable_time': 19944, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225495, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.791 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[99c4ef9e-784a-4be2-ab18-3cc407d738cf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 538862, 'tstamp': 538862}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225496, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.813 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fd237198-0b66-4db9-811a-90bf52dfb469]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538862, 'reachable_time': 19944, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225497, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.845 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f06e144c-2e26-4667-a5d2-70c99fc3bccb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.903 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b5a237-102d-455c-beba-e5482f8bdd91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.905 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.905 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.905 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.907 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 NetworkManager[55210]: <info>  [1764399925.9081] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Nov 29 02:05:25 np0005539504 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.910 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.912 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.913 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:25Z|00253|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.914 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.916 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.917 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd4d368-6cea-4bf9-a2e5-0427306e089f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.917 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:05:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:25.918 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:05:25 np0005539504 nova_compute[187152]: 2025-11-29 07:05:25.925 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.262 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 9223f44a-297e-4db1-9f44-ee0694c4e258 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.263 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399926.2618608, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.263 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.266 187156 DEBUG nova.compute.manager [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.270 187156 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance rebooted successfully.#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.271 187156 DEBUG nova.compute.manager [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.286 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.291 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:05:26 np0005539504 podman[225536]: 2025-11-29 07:05:26.324920719 +0000 UTC m=+0.050734432 container create c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.331 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.333 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399926.263314, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.333 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Started (Lifecycle Event)#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.361 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:26 np0005539504 systemd[1]: Started libpod-conmon-c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850.scope.
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.371 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.382 187156 DEBUG oslo_concurrency.lockutils [None req-40388363-64ce-4bbd-b2e3-6b9d8fed2837 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 5.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:26 np0005539504 podman[225536]: 2025-11-29 07:05:26.29572405 +0000 UTC m=+0.021537743 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:05:26 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:05:26 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9af01e05100a6f33db87fd5355e439aafcacc0cad2456bc557328ce46250aebb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:05:26 np0005539504 podman[225536]: 2025-11-29 07:05:26.419027071 +0000 UTC m=+0.144840794 container init c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:05:26 np0005539504 podman[225536]: 2025-11-29 07:05:26.425292211 +0000 UTC m=+0.151105904 container start c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 29 02:05:26 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[225551]: [NOTICE]   (225555) : New worker (225557) forked
Nov 29 02:05:26 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[225551]: [NOTICE]   (225555) : Loading success.
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.538 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399911.5365844, 1c73b1b4-2bab-4451-843e-0f70db66eb9b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.538 187156 INFO nova.compute.manager [-] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:05:26 np0005539504 nova_compute[187152]: 2025-11-29 07:05:26.860 187156 DEBUG nova.compute.manager [None req-33d677fa-b175-4650-a423-c38aae65b77c - - - - - -] [instance: 1c73b1b4-2bab-4451-843e-0f70db66eb9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:27 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:27Z|00254|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.594 187156 DEBUG nova.network.neutron [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Updating instance_info_cache with network_info: [{"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.618 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.633 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Releasing lock "refresh_cache-8bf08ec4-e207-48d0-a8cc-45ffed50be77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.634 187156 DEBUG nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Instance network_info: |[{"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.634 187156 DEBUG oslo_concurrency.lockutils [req-cdb6311f-efba-48de-b855-d2c533ba14b0 req-c3a5fe9e-d0ab-4496-8f12-5f77245a1e1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-8bf08ec4-e207-48d0-a8cc-45ffed50be77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.634 187156 DEBUG nova.network.neutron [req-cdb6311f-efba-48de-b855-d2c533ba14b0 req-c3a5fe9e-d0ab-4496-8f12-5f77245a1e1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Refreshing network info cache for port 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.638 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Start _get_guest_xml network_info=[{"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.644 187156 WARNING nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.653 187156 DEBUG nova.virt.libvirt.host [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.655 187156 DEBUG nova.virt.libvirt.host [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.662 187156 DEBUG nova.virt.libvirt.host [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.663 187156 DEBUG nova.virt.libvirt.host [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.664 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.664 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.665 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.665 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.665 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.665 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.666 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.666 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.666 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.666 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.666 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.667 187156 DEBUG nova.virt.hardware [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.670 187156 DEBUG nova.virt.libvirt.vif [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:05:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1328477640',display_name='tempest-ServersTestJSON-server-1328477640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1328477640',id=74,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-a1w58mp9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:19Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=8bf08ec4-e207-48d0-a8cc-45ffed50be77,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.670 187156 DEBUG nova.network.os_vif_util [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.671 187156 DEBUG nova.network.os_vif_util [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:6c:f2,bridge_name='br-int',has_traffic_filtering=True,id=31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31209c15-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.672 187156 DEBUG nova.objects.instance [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'pci_devices' on Instance uuid 8bf08ec4-e207-48d0-a8cc-45ffed50be77 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.690 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <uuid>8bf08ec4-e207-48d0-a8cc-45ffed50be77</uuid>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <name>instance-0000004a</name>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersTestJSON-server-1328477640</nova:name>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:05:27</nova:creationTime>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:        <nova:user uuid="f2f86d3bd4814a09966b869dd539a6c9">tempest-ServersTestJSON-373958708-project-member</nova:user>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:        <nova:project uuid="1dba9539037a4e9dbf33cba140fe21fe">tempest-ServersTestJSON-373958708</nova:project>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:        <nova:port uuid="31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <entry name="serial">8bf08ec4-e207-48d0-a8cc-45ffed50be77</entry>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <entry name="uuid">8bf08ec4-e207-48d0-a8cc-45ffed50be77</entry>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk.config"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:1c:6c:f2"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <target dev="tap31209c15-74"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/console.log" append="off"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:05:27 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:05:27 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:05:27 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:05:27 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.691 187156 DEBUG nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Preparing to wait for external event network-vif-plugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.692 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.692 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.692 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.693 187156 DEBUG nova.virt.libvirt.vif [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:05:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1328477640',display_name='tempest-ServersTestJSON-server-1328477640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1328477640',id=74,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-a1w58mp9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:19Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=8bf08ec4-e207-48d0-a8cc-45ffed50be77,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.693 187156 DEBUG nova.network.os_vif_util [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.694 187156 DEBUG nova.network.os_vif_util [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:6c:f2,bridge_name='br-int',has_traffic_filtering=True,id=31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31209c15-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.694 187156 DEBUG os_vif [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:6c:f2,bridge_name='br-int',has_traffic_filtering=True,id=31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31209c15-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.695 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.695 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.695 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.699 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.699 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31209c15-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.700 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31209c15-74, col_values=(('external_ids', {'iface-id': '31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:6c:f2', 'vm-uuid': '8bf08ec4-e207-48d0-a8cc-45ffed50be77'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.701 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:27 np0005539504 NetworkManager[55210]: <info>  [1764399927.7032] manager: (tap31209c15-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.704 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.708 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.709 187156 INFO os_vif [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:6c:f2,bridge_name='br-int',has_traffic_filtering=True,id=31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31209c15-74')#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.726 187156 DEBUG nova.compute.manager [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.726 187156 DEBUG oslo_concurrency.lockutils [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.727 187156 DEBUG oslo_concurrency.lockutils [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.727 187156 DEBUG oslo_concurrency.lockutils [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.727 187156 DEBUG nova.compute.manager [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.727 187156 WARNING nova.compute.manager [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.728 187156 DEBUG nova.compute.manager [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.728 187156 DEBUG oslo_concurrency.lockutils [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.729 187156 DEBUG oslo_concurrency.lockutils [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.729 187156 DEBUG oslo_concurrency.lockutils [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.729 187156 DEBUG nova.compute.manager [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.730 187156 WARNING nova.compute.manager [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.730 187156 DEBUG nova.compute.manager [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.730 187156 DEBUG oslo_concurrency.lockutils [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.730 187156 DEBUG oslo_concurrency.lockutils [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.731 187156 DEBUG oslo_concurrency.lockutils [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.731 187156 DEBUG nova.compute.manager [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.731 187156 WARNING nova.compute.manager [req-c7b0e4fc-e5c6-4d2b-b66e-9d49b596cc65 req-e81fec55-3fcf-49d2-9a4b-192ddebefd1f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.769 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.769 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.769 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No VIF found with MAC fa:16:3e:1c:6c:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:05:27 np0005539504 nova_compute[187152]: 2025-11-29 07:05:27.770 187156 INFO nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Using config drive#033[00m
Nov 29 02:05:28 np0005539504 nova_compute[187152]: 2025-11-29 07:05:28.378 187156 INFO nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Creating config drive at /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk.config#033[00m
Nov 29 02:05:28 np0005539504 nova_compute[187152]: 2025-11-29 07:05:28.383 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr3c0e_ur execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:28 np0005539504 nova_compute[187152]: 2025-11-29 07:05:28.511 187156 DEBUG oslo_concurrency.processutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr3c0e_ur" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:28 np0005539504 NetworkManager[55210]: <info>  [1764399928.6078] manager: (tap31209c15-74): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Nov 29 02:05:28 np0005539504 kernel: tap31209c15-74: entered promiscuous mode
Nov 29 02:05:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:28Z|00255|binding|INFO|Claiming lport 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b for this chassis.
Nov 29 02:05:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:28Z|00256|binding|INFO|31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b: Claiming fa:16:3e:1c:6c:f2 10.100.0.12
Nov 29 02:05:28 np0005539504 nova_compute[187152]: 2025-11-29 07:05:28.613 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:28 np0005539504 nova_compute[187152]: 2025-11-29 07:05:28.625 187156 INFO nova.compute.manager [None req-ef1439d0-fd87-40e4-a876-b4aed311eb8c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Get console output#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.627 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:6c:f2 10.100.0.12'], port_security=['fa:16:3e:1c:6c:f2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '8bf08ec4-e207-48d0-a8cc-45ffed50be77', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.630 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b in datapath 9cf3a513-f54e-430e-b018-befaa643b464 bound to our chassis#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.634 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9cf3a513-f54e-430e-b018-befaa643b464#033[00m
Nov 29 02:05:28 np0005539504 nova_compute[187152]: 2025-11-29 07:05:28.641 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:28Z|00257|binding|INFO|Setting lport 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b ovn-installed in OVS
Nov 29 02:05:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:28Z|00258|binding|INFO|Setting lport 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b up in Southbound
Nov 29 02:05:28 np0005539504 nova_compute[187152]: 2025-11-29 07:05:28.652 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.656 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ce02c5da-6e5d-4915-b5bb-9708ac729cf0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.660 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9cf3a513-f1 in ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.666 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9cf3a513-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.666 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50a9a1d6-1d43-458f-b0b7-3e14a0a1b19e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.672 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c54ee31e-f37b-409b-897b-88d73b670aa7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 systemd-machined[153423]: New machine qemu-39-instance-0000004a.
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.694 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebea936-a468-40ac-8a03-919c36120aab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 systemd[1]: Started Virtual Machine qemu-39-instance-0000004a.
Nov 29 02:05:28 np0005539504 podman[225580]: 2025-11-29 07:05:28.711279561 +0000 UTC m=+0.113145907 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.711 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9923575b-4332-49f4-8acf-c6b7caf314f9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 systemd-udevd[225609]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:05:28 np0005539504 NetworkManager[55210]: <info>  [1764399928.7398] device (tap31209c15-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:05:28 np0005539504 NetworkManager[55210]: <info>  [1764399928.7409] device (tap31209c15-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.753 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[7437d2ff-75c2-46fb-8c76-91a2fc0f5907]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 nova_compute[187152]: 2025-11-29 07:05:28.753 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:05:28 np0005539504 NetworkManager[55210]: <info>  [1764399928.7610] manager: (tap9cf3a513-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/126)
Nov 29 02:05:28 np0005539504 systemd-udevd[225615]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.760 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b39e016b-d4a9-40a9-a4ca-4578f08ac033]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.802 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[d236be20-7373-4eed-8a6d-fa2218b3aed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.805 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[75b38344-9694-4037-8ac0-8622a3a3b832]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 NetworkManager[55210]: <info>  [1764399928.8316] device (tap9cf3a513-f0): carrier: link connected
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.837 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[30730d4b-6162-4237-83c4-1bc82deb47d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.855 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[634be2c4-7f08-40f9-943c-ca91b36a27e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539171, 'reachable_time': 35935, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225640, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.870 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d2b5cf-2b7a-4b0e-aaf1-94a80b86d418]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:28ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 539171, 'tstamp': 539171}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225641, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.886 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[709395e0-f995-4206-a7e1-bd664554b85c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539171, 'reachable_time': 35935, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225642, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.922 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2632aded-6cc8-41d7-81bb-88a638b1f61a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.986 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[98795283-7874-428e-9e80-b05d9f37d3b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.988 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.988 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.988 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf3a513-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:28 np0005539504 NetworkManager[55210]: <info>  [1764399928.9908] manager: (tap9cf3a513-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Nov 29 02:05:28 np0005539504 nova_compute[187152]: 2025-11-29 07:05:28.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:28 np0005539504 kernel: tap9cf3a513-f0: entered promiscuous mode
Nov 29 02:05:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:28.992 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9cf3a513-f0, col_values=(('external_ids', {'iface-id': 'ed5aef73-67a0-4ad1-8aea-9c411786c18e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:28Z|00259|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:29.009 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:29.009 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[74b83ded-1e5d-4068-9811-d72ac650457d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:29.010 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:05:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:29.010 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'env', 'PROCESS_TAG=haproxy-9cf3a513-f54e-430e-b018-befaa643b464', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9cf3a513-f54e-430e-b018-befaa643b464.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.010 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:29 np0005539504 podman[225673]: 2025-11-29 07:05:29.402764002 +0000 UTC m=+0.061643106 container create 8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:05:29 np0005539504 systemd[1]: Started libpod-conmon-8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d.scope.
Nov 29 02:05:29 np0005539504 podman[225673]: 2025-11-29 07:05:29.369833643 +0000 UTC m=+0.028712767 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:05:29 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:05:29 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61cfa9f0ac7cb12cdc75f3ddd307cf3934718b85a6a987edfb4a177b8eb56f5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:05:29 np0005539504 podman[225673]: 2025-11-29 07:05:29.49780795 +0000 UTC m=+0.156687064 container init 8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:05:29 np0005539504 podman[225673]: 2025-11-29 07:05:29.505983781 +0000 UTC m=+0.164862875 container start 8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:05:29 np0005539504 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225688]: [NOTICE]   (225692) : New worker (225694) forked
Nov 29 02:05:29 np0005539504 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225688]: [NOTICE]   (225692) : Loading success.
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.544 187156 DEBUG nova.network.neutron [req-cdb6311f-efba-48de-b855-d2c533ba14b0 req-c3a5fe9e-d0ab-4496-8f12-5f77245a1e1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Updated VIF entry in instance network info cache for port 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.546 187156 DEBUG nova.network.neutron [req-cdb6311f-efba-48de-b855-d2c533ba14b0 req-c3a5fe9e-d0ab-4496-8f12-5f77245a1e1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Updating instance_info_cache with network_info: [{"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.564 187156 DEBUG oslo_concurrency.lockutils [req-cdb6311f-efba-48de-b855-d2c533ba14b0 req-c3a5fe9e-d0ab-4496-8f12-5f77245a1e1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-8bf08ec4-e207-48d0-a8cc-45ffed50be77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.729 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399929.728082, 8bf08ec4-e207-48d0-a8cc-45ffed50be77 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.730 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] VM Started (Lifecycle Event)#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.755 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.760 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399929.7284758, 8bf08ec4-e207-48d0-a8cc-45ffed50be77 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.761 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.783 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.789 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.815 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.826 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.985 187156 DEBUG nova.compute.manager [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Received event network-vif-plugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.987 187156 DEBUG oslo_concurrency.lockutils [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.987 187156 DEBUG oslo_concurrency.lockutils [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.987 187156 DEBUG oslo_concurrency.lockutils [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.988 187156 DEBUG nova.compute.manager [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Processing event network-vif-plugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.988 187156 DEBUG nova.compute.manager [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Received event network-vif-plugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.988 187156 DEBUG oslo_concurrency.lockutils [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.989 187156 DEBUG oslo_concurrency.lockutils [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.989 187156 DEBUG oslo_concurrency.lockutils [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.989 187156 DEBUG nova.compute.manager [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] No waiting events found dispatching network-vif-plugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.990 187156 WARNING nova.compute.manager [req-b54d3df0-38fe-42f7-93b2-33942ebd6765 req-a6144509-4fa1-4106-ac12-4c7bcddc1d7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Received unexpected event network-vif-plugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.990 187156 DEBUG nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.994 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399929.9946063, 8bf08ec4-e207-48d0-a8cc-45ffed50be77 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.995 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:05:29 np0005539504 nova_compute[187152]: 2025-11-29 07:05:29.997 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.001 187156 INFO nova.virt.libvirt.driver [-] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Instance spawned successfully.#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.002 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.024 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.032 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.035 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.036 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.036 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.037 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.037 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.038 187156 DEBUG nova.virt.libvirt.driver [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.062 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.111 187156 INFO nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Took 9.81 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.112 187156 DEBUG nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.218 187156 INFO nova.compute.manager [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Took 11.07 seconds to build instance.#033[00m
Nov 29 02:05:30 np0005539504 nova_compute[187152]: 2025-11-29 07:05:30.241 187156 DEBUG oslo_concurrency.lockutils [None req-85023211-5ec8-413e-8fc1-72a266edd06b f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:32 np0005539504 nova_compute[187152]: 2025-11-29 07:05:32.595 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399917.5948043, 690daf8f-6151-4de9-85f6-b8a9fe51ea02 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:32 np0005539504 nova_compute[187152]: 2025-11-29 07:05:32.596 187156 INFO nova.compute.manager [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:05:32 np0005539504 nova_compute[187152]: 2025-11-29 07:05:32.619 187156 DEBUG nova.compute.manager [None req-43743acd-35d1-4310-a234-7821d4c0be96 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:32 np0005539504 nova_compute[187152]: 2025-11-29 07:05:32.702 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.541 187156 DEBUG oslo_concurrency.lockutils [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.542 187156 DEBUG oslo_concurrency.lockutils [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.543 187156 DEBUG oslo_concurrency.lockutils [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.543 187156 DEBUG oslo_concurrency.lockutils [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.543 187156 DEBUG oslo_concurrency.lockutils [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.562 187156 INFO nova.compute.manager [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Terminating instance#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.582 187156 DEBUG nova.compute.manager [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:05:33 np0005539504 kernel: tap31209c15-74 (unregistering): left promiscuous mode
Nov 29 02:05:33 np0005539504 NetworkManager[55210]: <info>  [1764399933.6111] device (tap31209c15-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.620 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:33 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:33Z|00260|binding|INFO|Releasing lport 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b from this chassis (sb_readonly=0)
Nov 29 02:05:33 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:33Z|00261|binding|INFO|Setting lport 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b down in Southbound
Nov 29 02:05:33 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:33Z|00262|binding|INFO|Removing iface tap31209c15-74 ovn-installed in OVS
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.624 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.631 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:6c:f2 10.100.0.12'], port_security=['fa:16:3e:1c:6c:f2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '8bf08ec4-e207-48d0-a8cc-45ffed50be77', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.632 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b in datapath 9cf3a513-f54e-430e-b018-befaa643b464 unbound from our chassis#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.634 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf3a513-f54e-430e-b018-befaa643b464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.635 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7a98f9a3-f472-4efb-9802-3f0db24d51c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.636 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace which is not needed anymore#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.638 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:33 np0005539504 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Nov 29 02:05:33 np0005539504 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004a.scope: Consumed 4.614s CPU time.
Nov 29 02:05:33 np0005539504 systemd-machined[153423]: Machine qemu-39-instance-0000004a terminated.
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.788 187156 DEBUG oslo_concurrency.lockutils [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.789 187156 DEBUG oslo_concurrency.lockutils [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.789 187156 DEBUG nova.compute.manager [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:33 np0005539504 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225688]: [NOTICE]   (225692) : haproxy version is 2.8.14-c23fe91
Nov 29 02:05:33 np0005539504 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225688]: [NOTICE]   (225692) : path to executable is /usr/sbin/haproxy
Nov 29 02:05:33 np0005539504 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225688]: [WARNING]  (225692) : Exiting Master process...
Nov 29 02:05:33 np0005539504 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225688]: [ALERT]    (225692) : Current worker (225694) exited with code 143 (Terminated)
Nov 29 02:05:33 np0005539504 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225688]: [WARNING]  (225692) : All workers exited. Exiting... (0)
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.793 187156 DEBUG nova.compute.manager [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.794 187156 DEBUG nova.objects.instance [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'flavor' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:33 np0005539504 systemd[1]: libpod-8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d.scope: Deactivated successfully.
Nov 29 02:05:33 np0005539504 podman[225733]: 2025-11-29 07:05:33.801601296 +0000 UTC m=+0.047715971 container died 8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:05:33 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d-userdata-shm.mount: Deactivated successfully.
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.833 187156 DEBUG nova.objects.instance [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'info_cache' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:33 np0005539504 systemd[1]: var-lib-containers-storage-overlay-61cfa9f0ac7cb12cdc75f3ddd307cf3934718b85a6a987edfb4a177b8eb56f5f-merged.mount: Deactivated successfully.
Nov 29 02:05:33 np0005539504 podman[225733]: 2025-11-29 07:05:33.846794506 +0000 UTC m=+0.092909181 container cleanup 8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.847 187156 INFO nova.virt.libvirt.driver [-] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Instance destroyed successfully.#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.847 187156 DEBUG nova.objects.instance [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'resources' on Instance uuid 8bf08ec4-e207-48d0-a8cc-45ffed50be77 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.851 187156 DEBUG nova.compute.manager [req-767b33d9-6b1e-40f6-b0f9-11dcd1039b2e req-0d5ffb65-efb8-4e27-9c71-4e04b988a5b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Received event network-vif-unplugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.851 187156 DEBUG oslo_concurrency.lockutils [req-767b33d9-6b1e-40f6-b0f9-11dcd1039b2e req-0d5ffb65-efb8-4e27-9c71-4e04b988a5b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.851 187156 DEBUG oslo_concurrency.lockutils [req-767b33d9-6b1e-40f6-b0f9-11dcd1039b2e req-0d5ffb65-efb8-4e27-9c71-4e04b988a5b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.852 187156 DEBUG oslo_concurrency.lockutils [req-767b33d9-6b1e-40f6-b0f9-11dcd1039b2e req-0d5ffb65-efb8-4e27-9c71-4e04b988a5b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.852 187156 DEBUG nova.compute.manager [req-767b33d9-6b1e-40f6-b0f9-11dcd1039b2e req-0d5ffb65-efb8-4e27-9c71-4e04b988a5b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] No waiting events found dispatching network-vif-unplugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.852 187156 DEBUG nova.compute.manager [req-767b33d9-6b1e-40f6-b0f9-11dcd1039b2e req-0d5ffb65-efb8-4e27-9c71-4e04b988a5b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Received event network-vif-unplugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:05:33 np0005539504 systemd[1]: libpod-conmon-8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d.scope: Deactivated successfully.
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.862 187156 DEBUG nova.virt.libvirt.vif [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:05:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1328477640',display_name='tempest-ServersTestJSON-server-1328477640',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstestjson-server-1328477640',id=74,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:05:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-a1w58mp9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:05:30Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=8bf08ec4-e207-48d0-a8cc-45ffed50be77,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.863 187156 DEBUG nova.network.os_vif_util [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "address": "fa:16:3e:1c:6c:f2", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31209c15-74", "ovs_interfaceid": "31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.863 187156 DEBUG nova.network.os_vif_util [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:6c:f2,bridge_name='br-int',has_traffic_filtering=True,id=31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31209c15-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.864 187156 DEBUG os_vif [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:6c:f2,bridge_name='br-int',has_traffic_filtering=True,id=31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31209c15-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.866 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.867 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31209c15-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.872 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.874 187156 INFO os_vif [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:6c:f2,bridge_name='br-int',has_traffic_filtering=True,id=31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31209c15-74')#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.875 187156 INFO nova.virt.libvirt.driver [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Deleting instance files /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77_del#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.876 187156 INFO nova.virt.libvirt.driver [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Deletion of /var/lib/nova/instances/8bf08ec4-e207-48d0-a8cc-45ffed50be77_del complete#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.879 187156 DEBUG nova.virt.libvirt.driver [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:05:33 np0005539504 podman[225778]: 2025-11-29 07:05:33.914877176 +0000 UTC m=+0.045234333 container remove 8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.921 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a14db24f-3fc9-4d0e-a91e-7b14f950ce38]: (4, ('Sat Nov 29 07:05:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d)\n8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d\nSat Nov 29 07:05:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d)\n8e5144a038581a38f8a62b5f085fbdc345cefc68bf11c74ded8a7bf871d96e5d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.923 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a300d4-ae14-4dea-b07d-9288d0a69960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.924 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:33 np0005539504 kernel: tap9cf3a513-f0: left promiscuous mode
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.928 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.940 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.943 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1d056dba-ecab-41c3-ac4e-3830d62d4cf8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.954 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b64370c4-5395-45bd-9a2d-e50c3dac0693]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.956 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[da99e990-48d1-4412-b87d-fe2e4c7d3bfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.956 187156 INFO nova.compute.manager [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.956 187156 DEBUG oslo.service.loopingcall [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.956 187156 DEBUG nova.compute.manager [-] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:05:33 np0005539504 nova_compute[187152]: 2025-11-29 07:05:33.957 187156 DEBUG nova.network.neutron [-] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.969 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[96aadc2d-7d0c-4e72-bbe7-ec5c1e3cb5dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539162, 'reachable_time': 31730, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225789, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:33 np0005539504 systemd[1]: run-netns-ovnmeta\x2d9cf3a513\x2df54e\x2d430e\x2db018\x2dbefaa643b464.mount: Deactivated successfully.
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.972 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:05:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:33.972 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[87de80d2-2c54-4236-8543-2d56541df6aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:34 np0005539504 nova_compute[187152]: 2025-11-29 07:05:34.830 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.247 187156 DEBUG nova.network.neutron [-] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.263 187156 INFO nova.compute.manager [-] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Took 1.31 seconds to deallocate network for instance.#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.466 187156 DEBUG oslo_concurrency.lockutils [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.467 187156 DEBUG oslo_concurrency.lockutils [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.486 187156 DEBUG nova.compute.manager [req-ef9dae31-8cbc-4a5f-9975-888a521284cd req-2ad326a2-9382-4e06-9f29-bd44fcac3bd3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Received event network-vif-deleted-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.558 187156 DEBUG nova.compute.provider_tree [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.575 187156 DEBUG nova.scheduler.client.report [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.602 187156 DEBUG oslo_concurrency.lockutils [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.640 187156 INFO nova.scheduler.client.report [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Deleted allocations for instance 8bf08ec4-e207-48d0-a8cc-45ffed50be77#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.722 187156 DEBUG oslo_concurrency.lockutils [None req-0aac67c1-1b29-4824-b401-f6550da34280 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.968 187156 DEBUG nova.compute.manager [req-29bffaf6-2384-4bdb-9b16-6b2df2aafecd req-104dfab4-cd29-44d5-ba5f-94d8f59d6193 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Received event network-vif-plugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.969 187156 DEBUG oslo_concurrency.lockutils [req-29bffaf6-2384-4bdb-9b16-6b2df2aafecd req-104dfab4-cd29-44d5-ba5f-94d8f59d6193 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.969 187156 DEBUG oslo_concurrency.lockutils [req-29bffaf6-2384-4bdb-9b16-6b2df2aafecd req-104dfab4-cd29-44d5-ba5f-94d8f59d6193 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.969 187156 DEBUG oslo_concurrency.lockutils [req-29bffaf6-2384-4bdb-9b16-6b2df2aafecd req-104dfab4-cd29-44d5-ba5f-94d8f59d6193 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8bf08ec4-e207-48d0-a8cc-45ffed50be77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.970 187156 DEBUG nova.compute.manager [req-29bffaf6-2384-4bdb-9b16-6b2df2aafecd req-104dfab4-cd29-44d5-ba5f-94d8f59d6193 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] No waiting events found dispatching network-vif-plugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:35 np0005539504 nova_compute[187152]: 2025-11-29 07:05:35.970 187156 WARNING nova.compute.manager [req-29bffaf6-2384-4bdb-9b16-6b2df2aafecd req-104dfab4-cd29-44d5-ba5f-94d8f59d6193 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Received unexpected event network-vif-plugged-31209c15-7414-4ac0-9a3b-2b9d5f4bfe3b for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:05:36 np0005539504 nova_compute[187152]: 2025-11-29 07:05:36.124 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:36.370 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:36 np0005539504 nova_compute[187152]: 2025-11-29 07:05:36.370 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:36.371 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:05:38 np0005539504 podman[225800]: 2025-11-29 07:05:38.765124174 +0000 UTC m=+0.098865392 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:05:38 np0005539504 nova_compute[187152]: 2025-11-29 07:05:38.870 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:39 np0005539504 nova_compute[187152]: 2025-11-29 07:05:39.832 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:41Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:05:41 np0005539504 podman[225819]: 2025-11-29 07:05:41.708844145 +0000 UTC m=+0.050573798 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:05:41 np0005539504 podman[225820]: 2025-11-29 07:05:41.730312765 +0000 UTC m=+0.065481390 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64)
Nov 29 02:05:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:42.373 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:43 np0005539504 nova_compute[187152]: 2025-11-29 07:05:43.893 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:43 np0005539504 nova_compute[187152]: 2025-11-29 07:05:43.926 187156 DEBUG nova.virt.libvirt.driver [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:05:44 np0005539504 nova_compute[187152]: 2025-11-29 07:05:44.833 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:45 np0005539504 nova_compute[187152]: 2025-11-29 07:05:45.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:46 np0005539504 kernel: tapb7078e73-f0 (unregistering): left promiscuous mode
Nov 29 02:05:46 np0005539504 NetworkManager[55210]: <info>  [1764399946.7716] device (tapb7078e73-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:05:46 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:46Z|00263|binding|INFO|Releasing lport b7078e73-f0e3-441a-843e-8920e38aec30 from this chassis (sb_readonly=0)
Nov 29 02:05:46 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:46Z|00264|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 down in Southbound
Nov 29 02:05:46 np0005539504 nova_compute[187152]: 2025-11-29 07:05:46.776 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:46 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:46Z|00265|binding|INFO|Removing iface tapb7078e73-f0 ovn-installed in OVS
Nov 29 02:05:46 np0005539504 nova_compute[187152]: 2025-11-29 07:05:46.794 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:46 np0005539504 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000042.scope: Deactivated successfully.
Nov 29 02:05:46 np0005539504 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000042.scope: Consumed 14.515s CPU time.
Nov 29 02:05:46 np0005539504 systemd-machined[153423]: Machine qemu-38-instance-00000042 terminated.
Nov 29 02:05:46 np0005539504 podman[225864]: 2025-11-29 07:05:46.86208127 +0000 UTC m=+0.055615794 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:05:46 np0005539504 podman[225866]: 2025-11-29 07:05:46.920530299 +0000 UTC m=+0.113482237 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller)
Nov 29 02:05:46 np0005539504 nova_compute[187152]: 2025-11-29 07:05:46.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:47.043 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:47.046 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:05:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:47.047 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:05:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:47.049 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f593eda4-d4f0-48c1-87ca-866d1c8301f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:47.050 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:05:47 np0005539504 nova_compute[187152]: 2025-11-29 07:05:47.059 187156 INFO nova.virt.libvirt.driver [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance shutdown successfully after 13 seconds.#033[00m
Nov 29 02:05:47 np0005539504 nova_compute[187152]: 2025-11-29 07:05:47.066 187156 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance destroyed successfully.#033[00m
Nov 29 02:05:47 np0005539504 nova_compute[187152]: 2025-11-29 07:05:47.067 187156 DEBUG nova.objects.instance [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'numa_topology' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:47 np0005539504 nova_compute[187152]: 2025-11-29 07:05:47.413 187156 DEBUG nova.compute.manager [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:47 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[225551]: [NOTICE]   (225555) : haproxy version is 2.8.14-c23fe91
Nov 29 02:05:47 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[225551]: [NOTICE]   (225555) : path to executable is /usr/sbin/haproxy
Nov 29 02:05:47 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[225551]: [WARNING]  (225555) : Exiting Master process...
Nov 29 02:05:47 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[225551]: [ALERT]    (225555) : Current worker (225557) exited with code 143 (Terminated)
Nov 29 02:05:47 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[225551]: [WARNING]  (225555) : All workers exited. Exiting... (0)
Nov 29 02:05:47 np0005539504 systemd[1]: libpod-c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850.scope: Deactivated successfully.
Nov 29 02:05:47 np0005539504 podman[225952]: 2025-11-29 07:05:47.872303192 +0000 UTC m=+0.722550322 container died c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 02:05:48 np0005539504 nova_compute[187152]: 2025-11-29 07:05:48.666 187156 DEBUG nova.compute.manager [req-ec4705ef-c01f-4759-b71a-df71d81db0ac req-9ac5c2ff-1aba-442e-ad97-c08714f7fd82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:48 np0005539504 nova_compute[187152]: 2025-11-29 07:05:48.666 187156 DEBUG oslo_concurrency.lockutils [req-ec4705ef-c01f-4759-b71a-df71d81db0ac req-9ac5c2ff-1aba-442e-ad97-c08714f7fd82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:48 np0005539504 nova_compute[187152]: 2025-11-29 07:05:48.667 187156 DEBUG oslo_concurrency.lockutils [req-ec4705ef-c01f-4759-b71a-df71d81db0ac req-9ac5c2ff-1aba-442e-ad97-c08714f7fd82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:48 np0005539504 nova_compute[187152]: 2025-11-29 07:05:48.667 187156 DEBUG oslo_concurrency.lockutils [req-ec4705ef-c01f-4759-b71a-df71d81db0ac req-9ac5c2ff-1aba-442e-ad97-c08714f7fd82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:48 np0005539504 nova_compute[187152]: 2025-11-29 07:05:48.667 187156 DEBUG nova.compute.manager [req-ec4705ef-c01f-4759-b71a-df71d81db0ac req-9ac5c2ff-1aba-442e-ad97-c08714f7fd82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:48 np0005539504 nova_compute[187152]: 2025-11-29 07:05:48.668 187156 WARNING nova.compute.manager [req-ec4705ef-c01f-4759-b71a-df71d81db0ac req-9ac5c2ff-1aba-442e-ad97-c08714f7fd82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state stopped and task_state None.#033[00m
Nov 29 02:05:48 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850-userdata-shm.mount: Deactivated successfully.
Nov 29 02:05:48 np0005539504 systemd[1]: var-lib-containers-storage-overlay-9af01e05100a6f33db87fd5355e439aafcacc0cad2456bc557328ce46250aebb-merged.mount: Deactivated successfully.
Nov 29 02:05:48 np0005539504 podman[225952]: 2025-11-29 07:05:48.712676428 +0000 UTC m=+1.562923558 container cleanup c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 02:05:48 np0005539504 systemd[1]: libpod-conmon-c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850.scope: Deactivated successfully.
Nov 29 02:05:48 np0005539504 nova_compute[187152]: 2025-11-29 07:05:48.845 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399933.8442993, 8bf08ec4-e207-48d0-a8cc-45ffed50be77 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:48 np0005539504 nova_compute[187152]: 2025-11-29 07:05:48.846 187156 INFO nova.compute.manager [-] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:05:48 np0005539504 nova_compute[187152]: 2025-11-29 07:05:48.896 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:49 np0005539504 podman[225983]: 2025-11-29 07:05:49.111036699 +0000 UTC m=+0.379196995 container remove c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:05:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:49.117 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0a4f0dbe-956d-41d5-b7a8-9a97eb362feb]: (4, ('Sat Nov 29 07:05:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850)\nc87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850\nSat Nov 29 07:05:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (c87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850)\nc87ecefe01e16e679c3e5cf5a0bb1e5443dda36885e446cba1619c49c2ce1850\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:49.119 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[992ecb92-71a0-43d5-a6a7-fdec5ac554d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:49.120 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:49 np0005539504 nova_compute[187152]: 2025-11-29 07:05:49.122 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:49 np0005539504 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:05:49 np0005539504 nova_compute[187152]: 2025-11-29 07:05:49.147 187156 DEBUG oslo_concurrency.lockutils [None req-3f6fa300-8594-4c24-9288-31a63abf795b e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 15.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:49 np0005539504 nova_compute[187152]: 2025-11-29 07:05:49.162 187156 DEBUG nova.compute.manager [None req-0e79e9ac-9996-4920-b2e5-7ffefccd87ae - - - - - -] [instance: 8bf08ec4-e207-48d0-a8cc-45ffed50be77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:49 np0005539504 nova_compute[187152]: 2025-11-29 07:05:49.270 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:49.272 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[90ad7fb9-dcb5-41d4-9e61-9a0decee9545]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:49 np0005539504 nova_compute[187152]: 2025-11-29 07:05:49.285 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:49.285 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1b673241-df80-48eb-8235-6466f38e0d2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:49.286 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e2cc0749-fe3e-4124-9bde-b20b180bad45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:49.301 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[499dbaba-f6f7-4e2e-9630-9e7a155cd736]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538855, 'reachable_time': 37694, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226002, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:49.303 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:05:49 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:49.303 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ec630c-4c1c-4b7f-bd5e-664cdcfdc133]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:49 np0005539504 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:05:49 np0005539504 nova_compute[187152]: 2025-11-29 07:05:49.835 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:49 np0005539504 nova_compute[187152]: 2025-11-29 07:05:49.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:50 np0005539504 nova_compute[187152]: 2025-11-29 07:05:50.845 187156 DEBUG nova.objects.instance [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'flavor' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:50 np0005539504 nova_compute[187152]: 2025-11-29 07:05:50.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:50 np0005539504 nova_compute[187152]: 2025-11-29 07:05:50.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.092 187156 DEBUG nova.compute.manager [req-da22c167-86c8-4e6e-8d1d-b08669b437df req-12175bb0-3dee-4739-8470-b80f33fb8d8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.092 187156 DEBUG oslo_concurrency.lockutils [req-da22c167-86c8-4e6e-8d1d-b08669b437df req-12175bb0-3dee-4739-8470-b80f33fb8d8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.093 187156 DEBUG oslo_concurrency.lockutils [req-da22c167-86c8-4e6e-8d1d-b08669b437df req-12175bb0-3dee-4739-8470-b80f33fb8d8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.093 187156 DEBUG oslo_concurrency.lockutils [req-da22c167-86c8-4e6e-8d1d-b08669b437df req-12175bb0-3dee-4739-8470-b80f33fb8d8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.093 187156 DEBUG nova.compute.manager [req-da22c167-86c8-4e6e-8d1d-b08669b437df req-12175bb0-3dee-4739-8470-b80f33fb8d8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.093 187156 WARNING nova.compute.manager [req-da22c167-86c8-4e6e-8d1d-b08669b437df req-12175bb0-3dee-4739-8470-b80f33fb8d8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.161 187156 DEBUG nova.objects.instance [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'info_cache' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.276 187156 DEBUG oslo_concurrency.lockutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.277 187156 DEBUG oslo_concurrency.lockutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:51 np0005539504 nova_compute[187152]: 2025-11-29 07:05:51.277 187156 DEBUG nova.network.neutron [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:05:53 np0005539504 nova_compute[187152]: 2025-11-29 07:05:53.899 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:53 np0005539504 nova_compute[187152]: 2025-11-29 07:05:53.939 187156 DEBUG nova.network.neutron [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.319 187156 DEBUG oslo_concurrency.lockutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.399 187156 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance destroyed successfully.#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.399 187156 DEBUG nova.objects.instance [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'numa_topology' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.649 187156 DEBUG nova.objects.instance [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.724 187156 DEBUG nova.virt.libvirt.vif [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:05:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.724 187156 DEBUG nova.network.os_vif_util [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.727 187156 DEBUG nova.network.os_vif_util [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.727 187156 DEBUG os_vif [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.729 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.729 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7078e73-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.731 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.732 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.734 187156 INFO os_vif [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.741 187156 DEBUG nova.virt.libvirt.driver [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Start _get_guest_xml network_info=[{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.744 187156 WARNING nova.virt.libvirt.driver [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.749 187156 DEBUG nova.virt.libvirt.host [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.750 187156 DEBUG nova.virt.libvirt.host [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.752 187156 DEBUG nova.virt.libvirt.host [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.752 187156 DEBUG nova.virt.libvirt.host [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.753 187156 DEBUG nova.virt.libvirt.driver [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.754 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.754 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.754 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.754 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.755 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.755 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.755 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.755 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.755 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.756 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.756 187156 DEBUG nova.virt.hardware [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.756 187156 DEBUG nova.objects.instance [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.837 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:54 np0005539504 nova_compute[187152]: 2025-11-29 07:05:54.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.023 187156 DEBUG nova.virt.libvirt.vif [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:05:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.024 187156 DEBUG nova.network.os_vif_util [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.025 187156 DEBUG nova.network.os_vif_util [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.026 187156 DEBUG nova.objects.instance [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.029 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.030 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.031 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.055 187156 DEBUG nova.virt.libvirt.driver [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <uuid>9223f44a-297e-4db1-9f44-ee0694c4e258</uuid>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <name>instance-00000042</name>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestJSON-server-664171356</nova:name>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:05:54</nova:creationTime>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:        <nova:port uuid="b7078e73-f0e3-441a-843e-8920e38aec30">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <entry name="serial">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <entry name="uuid">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:1e:a3:23"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <target dev="tapb7078e73-f0"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/console.log" append="off"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:05:55 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:05:55 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:05:55 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:05:55 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.057 187156 DEBUG oslo_concurrency.processutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.074 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.117 187156 DEBUG oslo_concurrency.processutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.118 187156 DEBUG oslo_concurrency.processutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.181 187156 DEBUG oslo_concurrency.processutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.182 187156 DEBUG nova.objects.instance [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:55 np0005539504 podman[226009]: 2025-11-29 07:05:55.731287518 +0000 UTC m=+0.064731573 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.767 187156 DEBUG oslo_concurrency.processutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.827 187156 DEBUG oslo_concurrency.processutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.829 187156 DEBUG nova.virt.disk.api [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.830 187156 DEBUG oslo_concurrency.processutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.899 187156 DEBUG oslo_concurrency.processutils [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.901 187156 DEBUG nova.virt.disk.api [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:05:55 np0005539504 nova_compute[187152]: 2025-11-29 07:05:55.901 187156 DEBUG nova.objects.instance [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:56 np0005539504 nova_compute[187152]: 2025-11-29 07:05:56.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.298 187156 DEBUG nova.virt.libvirt.vif [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.298 187156 DEBUG nova.network.os_vif_util [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.299 187156 DEBUG nova.network.os_vif_util [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.299 187156 DEBUG os_vif [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.300 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.300 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.301 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.306 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.306 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7078e73-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.307 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7078e73-f0, col_values=(('external_ids', {'iface-id': 'b7078e73-f0e3-441a-843e-8920e38aec30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:a3:23', 'vm-uuid': '9223f44a-297e-4db1-9f44-ee0694c4e258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.308 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.308 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:57 np0005539504 NetworkManager[55210]: <info>  [1764399957.3095] manager: (tapb7078e73-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.311 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.316 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.317 187156 INFO os_vif [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:05:57 np0005539504 kernel: tapb7078e73-f0: entered promiscuous mode
Nov 29 02:05:57 np0005539504 NetworkManager[55210]: <info>  [1764399957.6742] manager: (tapb7078e73-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/129)
Nov 29 02:05:57 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:57Z|00266|binding|INFO|Claiming lport b7078e73-f0e3-441a-843e-8920e38aec30 for this chassis.
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.676 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:57 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:57Z|00267|binding|INFO|b7078e73-f0e3-441a-843e-8920e38aec30: Claiming fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.686 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '7', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.687 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.688 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:05:57 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:57Z|00268|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 ovn-installed in OVS
Nov 29 02:05:57 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:57Z|00269|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 up in Southbound
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.691 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.694 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.699 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[767e3017-6289-403b-bd0d-69d72105481d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.701 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.703 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.703 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[634e6d04-8952-47c2-81e0-3bf3341c53ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.705 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[115050df-c622-4a44-928b-ec4086b0b5dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 systemd-udevd[226053]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.716 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[752fa3ef-b897-4a33-a057-e9b3865efed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 systemd-machined[153423]: New machine qemu-40-instance-00000042.
Nov 29 02:05:57 np0005539504 NetworkManager[55210]: <info>  [1764399957.7299] device (tapb7078e73-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:05:57 np0005539504 NetworkManager[55210]: <info>  [1764399957.7313] device (tapb7078e73-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:05:57 np0005539504 systemd[1]: Started Virtual Machine qemu-40-instance-00000042.
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.741 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[16a7e8ed-674a-4809-9967-6c2163d62445]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.773 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[bc6cbdd1-790b-4080-a3b5-9a6b513b1b4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 systemd-udevd[226057]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.779 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a67f9604-6261-41e2-af31-9b4c391cd342]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 NetworkManager[55210]: <info>  [1764399957.7801] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/130)
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.811 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[890a755a-9772-4dc8-b63f-3fefc26cf0d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.814 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4f625de1-6016-45b5-990c-e67bf094d38a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 NetworkManager[55210]: <info>  [1764399957.8373] device (tap9226dea3-60): carrier: link connected
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.842 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b7c761-1d59-41a7-8546-b30a2e193555]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.859 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c455348d-fae5-42e6-b925-2f6aee98d888]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542071, 'reachable_time': 24459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226085, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.876 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[de3f84b3-1415-4ea0-ba3e-f5036b0721e7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542071, 'tstamp': 542071}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226086, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.891 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4436139e-630d-46cf-88f2-09972d91b321]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542071, 'reachable_time': 24459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226087, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.918 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[69dc29dc-a8e6-4582-b5eb-38d452b44175]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.965 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.966 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.966 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.966 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.988 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3979e3-cdf1-4f26-a28c-90803cc5340f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.989 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.989 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.990 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:57 np0005539504 NetworkManager[55210]: <info>  [1764399957.9925] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.992 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:57 np0005539504 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:05:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:57.997 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:57 np0005539504 ovn_controller[95182]: 2025-11-29T07:05:57Z|00270|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:05:57 np0005539504 nova_compute[187152]: 2025-11-29 07:05:57.998 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.018 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:58.019 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:58.020 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[860e3f61-cc9c-4c22-b9f9-47cbb86c4e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:58.021 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:05:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:05:58.021 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.186 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 9223f44a-297e-4db1-9f44-ee0694c4e258 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.187 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399958.1853015, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.187 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.190 187156 DEBUG nova.compute.manager [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.193 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.211 187156 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance rebooted successfully.#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.212 187156 DEBUG nova.compute.manager [None req-85de637e-220e-473b-8d4e-1cd38abd5df6 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.249 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.250 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.318 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:58 np0005539504 podman[226133]: 2025-11-29 07:05:58.38543598 +0000 UTC m=+0.024749048 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.513 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.517 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:05:58 np0005539504 podman[226133]: 2025-11-29 07:05:58.525010406 +0000 UTC m=+0.164323444 container create 2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.537 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.539 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5686MB free_disk=73.16432189941406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.539 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.539 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:58 np0005539504 systemd[1]: Started libpod-conmon-2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f.scope.
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.597 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.599 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399958.1871161, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.599 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Started (Lifecycle Event)#033[00m
Nov 29 02:05:58 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:05:58 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f18755e09441fd87c37694bd0286bf2cc026c38d682eb27eb08b8dca1ac47d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:05:58 np0005539504 podman[226133]: 2025-11-29 07:05:58.704931248 +0000 UTC m=+0.344244346 container init 2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:05:58 np0005539504 podman[226133]: 2025-11-29 07:05:58.712561144 +0000 UTC m=+0.351874192 container start 2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:05:58 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226148]: [NOTICE]   (226153) : New worker (226155) forked
Nov 29 02:05:58 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226148]: [NOTICE]   (226153) : Loading success.
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.736 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.749 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.786 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 9223f44a-297e-4db1-9f44-ee0694c4e258 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.787 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.787 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.829 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.870 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.935 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:05:58 np0005539504 nova_compute[187152]: 2025-11-29 07:05:58.935 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.396s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:59 np0005539504 nova_compute[187152]: 2025-11-29 07:05:59.065 187156 DEBUG nova.compute.manager [req-ec84ae8d-2163-4dbd-9a36-6e4111e6b004 req-b9aa6b46-8a64-41dc-b38d-460446848153 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:59 np0005539504 nova_compute[187152]: 2025-11-29 07:05:59.065 187156 DEBUG oslo_concurrency.lockutils [req-ec84ae8d-2163-4dbd-9a36-6e4111e6b004 req-b9aa6b46-8a64-41dc-b38d-460446848153 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:59 np0005539504 nova_compute[187152]: 2025-11-29 07:05:59.065 187156 DEBUG oslo_concurrency.lockutils [req-ec84ae8d-2163-4dbd-9a36-6e4111e6b004 req-b9aa6b46-8a64-41dc-b38d-460446848153 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:59 np0005539504 nova_compute[187152]: 2025-11-29 07:05:59.066 187156 DEBUG oslo_concurrency.lockutils [req-ec84ae8d-2163-4dbd-9a36-6e4111e6b004 req-b9aa6b46-8a64-41dc-b38d-460446848153 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:59 np0005539504 nova_compute[187152]: 2025-11-29 07:05:59.066 187156 DEBUG nova.compute.manager [req-ec84ae8d-2163-4dbd-9a36-6e4111e6b004 req-b9aa6b46-8a64-41dc-b38d-460446848153 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:59 np0005539504 nova_compute[187152]: 2025-11-29 07:05:59.066 187156 WARNING nova.compute.manager [req-ec84ae8d-2163-4dbd-9a36-6e4111e6b004 req-b9aa6b46-8a64-41dc-b38d-460446848153 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:05:59 np0005539504 podman[226164]: 2025-11-29 07:05:59.72825681 +0000 UTC m=+0.067677553 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 29 02:05:59 np0005539504 nova_compute[187152]: 2025-11-29 07:05:59.839 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:01 np0005539504 nova_compute[187152]: 2025-11-29 07:06:01.619 187156 DEBUG nova.compute.manager [req-f2902fb0-e73d-4d2d-b7a8-657ad1f10086 req-2a6bcd33-9935-4d4e-9c47-765c74340dee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:01 np0005539504 nova_compute[187152]: 2025-11-29 07:06:01.620 187156 DEBUG oslo_concurrency.lockutils [req-f2902fb0-e73d-4d2d-b7a8-657ad1f10086 req-2a6bcd33-9935-4d4e-9c47-765c74340dee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:01 np0005539504 nova_compute[187152]: 2025-11-29 07:06:01.621 187156 DEBUG oslo_concurrency.lockutils [req-f2902fb0-e73d-4d2d-b7a8-657ad1f10086 req-2a6bcd33-9935-4d4e-9c47-765c74340dee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:01 np0005539504 nova_compute[187152]: 2025-11-29 07:06:01.622 187156 DEBUG oslo_concurrency.lockutils [req-f2902fb0-e73d-4d2d-b7a8-657ad1f10086 req-2a6bcd33-9935-4d4e-9c47-765c74340dee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:01 np0005539504 nova_compute[187152]: 2025-11-29 07:06:01.622 187156 DEBUG nova.compute.manager [req-f2902fb0-e73d-4d2d-b7a8-657ad1f10086 req-2a6bcd33-9935-4d4e-9c47-765c74340dee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:01 np0005539504 nova_compute[187152]: 2025-11-29 07:06:01.625 187156 WARNING nova.compute.manager [req-f2902fb0-e73d-4d2d-b7a8-657ad1f10086 req-2a6bcd33-9935-4d4e-9c47-765c74340dee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:06:02 np0005539504 nova_compute[187152]: 2025-11-29 07:06:02.308 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:04 np0005539504 nova_compute[187152]: 2025-11-29 07:06:04.841 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:07 np0005539504 nova_compute[187152]: 2025-11-29 07:06:07.310 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:08 np0005539504 nova_compute[187152]: 2025-11-29 07:06:08.414 187156 INFO nova.compute.manager [None req-fa8bbfec-89b9-4cc9-a1b9-13aa5dee29b8 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Pausing#033[00m
Nov 29 02:06:08 np0005539504 nova_compute[187152]: 2025-11-29 07:06:08.416 187156 DEBUG nova.objects.instance [None req-fa8bbfec-89b9-4cc9-a1b9-13aa5dee29b8 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'flavor' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:08 np0005539504 nova_compute[187152]: 2025-11-29 07:06:08.910 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399968.909581, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:08 np0005539504 nova_compute[187152]: 2025-11-29 07:06:08.911 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:06:08 np0005539504 nova_compute[187152]: 2025-11-29 07:06:08.914 187156 DEBUG nova.compute.manager [None req-fa8bbfec-89b9-4cc9-a1b9-13aa5dee29b8 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:08 np0005539504 nova_compute[187152]: 2025-11-29 07:06:08.951 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:08 np0005539504 nova_compute[187152]: 2025-11-29 07:06:08.957 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:09 np0005539504 nova_compute[187152]: 2025-11-29 07:06:09.011 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:09 np0005539504 nova_compute[187152]: 2025-11-29 07:06:09.015 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Nov 29 02:06:09 np0005539504 podman[226186]: 2025-11-29 07:06:09.740418521 +0000 UTC m=+0.064149258 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 29 02:06:09 np0005539504 nova_compute[187152]: 2025-11-29 07:06:09.842 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539504 nova_compute[187152]: 2025-11-29 07:06:10.512 187156 INFO nova.compute.manager [None req-29d13320-dbcf-4048-85a8-8a1de6dc9bb3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Unpausing#033[00m
Nov 29 02:06:10 np0005539504 nova_compute[187152]: 2025-11-29 07:06:10.513 187156 DEBUG nova.objects.instance [None req-29d13320-dbcf-4048-85a8-8a1de6dc9bb3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'flavor' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:10 np0005539504 nova_compute[187152]: 2025-11-29 07:06:10.555 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399970.5556242, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:10 np0005539504 nova_compute[187152]: 2025-11-29 07:06:10.556 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:06:10 np0005539504 virtqemud[186569]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:06:10 np0005539504 nova_compute[187152]: 2025-11-29 07:06:10.560 187156 DEBUG nova.virt.libvirt.guest [None req-29d13320-dbcf-4048-85a8-8a1de6dc9bb3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:06:10 np0005539504 nova_compute[187152]: 2025-11-29 07:06:10.560 187156 DEBUG nova.compute.manager [None req-29d13320-dbcf-4048-85a8-8a1de6dc9bb3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:10 np0005539504 nova_compute[187152]: 2025-11-29 07:06:10.600 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:10 np0005539504 nova_compute[187152]: 2025-11-29 07:06:10.604 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:10 np0005539504 nova_compute[187152]: 2025-11-29 07:06:10.643 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Nov 29 02:06:12 np0005539504 nova_compute[187152]: 2025-11-29 07:06:12.327 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:12 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:12Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:06:12 np0005539504 podman[226220]: 2025-11-29 07:06:12.717673019 +0000 UTC m=+0.055352920 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:06:12 np0005539504 podman[226221]: 2025-11-29 07:06:12.737994347 +0000 UTC m=+0.068658499 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:06:14 np0005539504 nova_compute[187152]: 2025-11-29 07:06:14.845 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:17 np0005539504 nova_compute[187152]: 2025-11-29 07:06:17.329 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:17 np0005539504 podman[226265]: 2025-11-29 07:06:17.703848933 +0000 UTC m=+0.049418471 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:06:17 np0005539504 nova_compute[187152]: 2025-11-29 07:06:17.729 187156 DEBUG oslo_concurrency.lockutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:17 np0005539504 nova_compute[187152]: 2025-11-29 07:06:17.730 187156 DEBUG oslo_concurrency.lockutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:17 np0005539504 nova_compute[187152]: 2025-11-29 07:06:17.730 187156 INFO nova.compute.manager [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Rebooting instance#033[00m
Nov 29 02:06:17 np0005539504 nova_compute[187152]: 2025-11-29 07:06:17.743 187156 DEBUG oslo_concurrency.lockutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:06:17 np0005539504 nova_compute[187152]: 2025-11-29 07:06:17.743 187156 DEBUG oslo_concurrency.lockutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:06:17 np0005539504 nova_compute[187152]: 2025-11-29 07:06:17.744 187156 DEBUG nova.network.neutron [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:06:17 np0005539504 podman[226266]: 2025-11-29 07:06:17.748013152 +0000 UTC m=+0.088438141 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:06:19 np0005539504 nova_compute[187152]: 2025-11-29 07:06:19.849 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.184 187156 DEBUG nova.network.neutron [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.206 187156 DEBUG oslo_concurrency.lockutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.221 187156 DEBUG nova.compute.manager [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:20 np0005539504 kernel: tapb7078e73-f0 (unregistering): left promiscuous mode
Nov 29 02:06:20 np0005539504 NetworkManager[55210]: <info>  [1764399980.3791] device (tapb7078e73-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:06:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:20Z|00271|binding|INFO|Releasing lport b7078e73-f0e3-441a-843e-8920e38aec30 from this chassis (sb_readonly=0)
Nov 29 02:06:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:20Z|00272|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 down in Southbound
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.385 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:20Z|00273|binding|INFO|Removing iface tapb7078e73-f0 ovn-installed in OVS
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.388 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.401 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.402 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.404 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.404 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.405 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1d8d7dee-e5ad-4313-8df4-067a5dfd6da7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.406 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:06:20 np0005539504 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000042.scope: Deactivated successfully.
Nov 29 02:06:20 np0005539504 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000042.scope: Consumed 13.967s CPU time.
Nov 29 02:06:20 np0005539504 systemd-machined[153423]: Machine qemu-40-instance-00000042 terminated.
Nov 29 02:06:20 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226148]: [NOTICE]   (226153) : haproxy version is 2.8.14-c23fe91
Nov 29 02:06:20 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226148]: [NOTICE]   (226153) : path to executable is /usr/sbin/haproxy
Nov 29 02:06:20 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226148]: [WARNING]  (226153) : Exiting Master process...
Nov 29 02:06:20 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226148]: [ALERT]    (226153) : Current worker (226155) exited with code 143 (Terminated)
Nov 29 02:06:20 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226148]: [WARNING]  (226153) : All workers exited. Exiting... (0)
Nov 29 02:06:20 np0005539504 systemd[1]: libpod-2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f.scope: Deactivated successfully.
Nov 29 02:06:20 np0005539504 podman[226338]: 2025-11-29 07:06:20.549709815 +0000 UTC m=+0.050974703 container died 2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:06:20 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f-userdata-shm.mount: Deactivated successfully.
Nov 29 02:06:20 np0005539504 systemd[1]: var-lib-containers-storage-overlay-8f18755e09441fd87c37694bd0286bf2cc026c38d682eb27eb08b8dca1ac47d7-merged.mount: Deactivated successfully.
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.580 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 podman[226338]: 2025-11-29 07:06:20.584724297 +0000 UTC m=+0.085989185 container cleanup 2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:06:20 np0005539504 systemd[1]: libpod-conmon-2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f.scope: Deactivated successfully.
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.621 187156 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance destroyed successfully.#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.621 187156 DEBUG nova.objects.instance [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.636 187156 DEBUG nova.virt.libvirt.vif [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:06:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.637 187156 DEBUG nova.network.os_vif_util [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.637 187156 DEBUG nova.network.os_vif_util [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.638 187156 DEBUG os_vif [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.639 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.640 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7078e73-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.642 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.643 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.645 187156 INFO os_vif [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:06:20 np0005539504 podman[226376]: 2025-11-29 07:06:20.649586052 +0000 UTC m=+0.040111030 container remove 2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.651 187156 DEBUG nova.virt.libvirt.driver [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Start _get_guest_xml network_info=[{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.654 187156 WARNING nova.virt.libvirt.driver [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.654 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5c4541b1-098c-40b8-bb18-113ba6e7de4c]: (4, ('Sat Nov 29 07:06:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f)\n2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f\nSat Nov 29 07:06:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f)\n2e0c480d55cb6e49a4c73788af8d7ffa96688bdd6ae0b350a3005b32afd9b72f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.656 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f15213e5-3061-4c04-9612-afd3e7ee1688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.657 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.658 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.660 187156 DEBUG nova.virt.libvirt.host [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.661 187156 DEBUG nova.virt.libvirt.host [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.664 187156 DEBUG nova.virt.libvirt.host [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.665 187156 DEBUG nova.virt.libvirt.host [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.666 187156 DEBUG nova.virt.libvirt.driver [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.666 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.667 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.667 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.667 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.667 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.668 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.668 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.668 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.668 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.669 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.669 187156 DEBUG nova.virt.hardware [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.669 187156 DEBUG nova.objects.instance [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.671 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.676 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d8979015-8df5-4dac-8e4b-b23535feb218]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.684 187156 DEBUG nova.virt.libvirt.vif [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:06:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.685 187156 DEBUG nova.network.os_vif_util [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.685 187156 DEBUG nova.network.os_vif_util [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.687 187156 DEBUG nova.objects.instance [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.699 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2ab176c8-742b-4b96-9067-5ca2d819609b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.700 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fad15b0f-2fa9-42ae-87e5-16844e824d1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.701 187156 DEBUG nova.virt.libvirt.driver [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <uuid>9223f44a-297e-4db1-9f44-ee0694c4e258</uuid>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <name>instance-00000042</name>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestJSON-server-664171356</nova:name>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:06:20</nova:creationTime>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:        <nova:port uuid="b7078e73-f0e3-441a-843e-8920e38aec30">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <entry name="serial">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <entry name="uuid">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:1e:a3:23"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <target dev="tapb7078e73-f0"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/console.log" append="off"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:06:20 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:06:20 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:06:20 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:06:20 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.704 187156 DEBUG oslo_concurrency.processutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.714 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b49d1451-5c24-49b3-bcf6-c87a2fa5fbb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542064, 'reachable_time': 36966, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226397, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.717 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:06:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:20.717 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[f4cce621-41e8-4b8b-8e1a-eb2d9ee1e1b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:20 np0005539504 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.760 187156 DEBUG oslo_concurrency.processutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.762 187156 DEBUG oslo_concurrency.processutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.814 187156 DEBUG oslo_concurrency.processutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.817 187156 DEBUG nova.objects.instance [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.835 187156 DEBUG oslo_concurrency.processutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.901 187156 DEBUG oslo_concurrency.processutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.902 187156 DEBUG nova.virt.disk.api [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.903 187156 DEBUG oslo_concurrency.processutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.955 187156 DEBUG oslo_concurrency.processutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.956 187156 DEBUG nova.virt.disk.api [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.956 187156 DEBUG nova.objects.instance [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.971 187156 DEBUG nova.virt.libvirt.vif [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:06:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.971 187156 DEBUG nova.network.os_vif_util [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.972 187156 DEBUG nova.network.os_vif_util [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.973 187156 DEBUG os_vif [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.973 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.974 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.974 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.979 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.979 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7078e73-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.980 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7078e73-f0, col_values=(('external_ids', {'iface-id': 'b7078e73-f0e3-441a-843e-8920e38aec30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:a3:23', 'vm-uuid': '9223f44a-297e-4db1-9f44-ee0694c4e258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.982 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 NetworkManager[55210]: <info>  [1764399980.9835] manager: (tapb7078e73-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.984 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.989 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539504 nova_compute[187152]: 2025-11-29 07:06:20.989 187156 INFO os_vif [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:06:21 np0005539504 kernel: tapb7078e73-f0: entered promiscuous mode
Nov 29 02:06:21 np0005539504 systemd-udevd[226320]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:06:21 np0005539504 NetworkManager[55210]: <info>  [1764399981.0730] manager: (tapb7078e73-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Nov 29 02:06:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:21Z|00274|binding|INFO|Claiming lport b7078e73-f0e3-441a-843e-8920e38aec30 for this chassis.
Nov 29 02:06:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:21Z|00275|binding|INFO|b7078e73-f0e3-441a-843e-8920e38aec30: Claiming fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.075 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.081 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.083 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:06:21 np0005539504 NetworkManager[55210]: <info>  [1764399981.0836] device (tapb7078e73-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.085 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:06:21 np0005539504 NetworkManager[55210]: <info>  [1764399981.0862] device (tapb7078e73-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:06:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:21Z|00276|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 ovn-installed in OVS
Nov 29 02:06:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:21Z|00277|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 up in Southbound
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.088 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.090 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.102 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3b833853-8929-4f55-813b-a593d6539296]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.103 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.105 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.105 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2a4fb9-9ab2-4dda-b5bb-a15350894bfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.105 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d67e29-ed7b-458b-88c2-a1856aca0c4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 systemd-machined[153423]: New machine qemu-41-instance-00000042.
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.120 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[557c8aa2-a372-4c41-a7c8-299dc1c1ccbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 systemd[1]: Started Virtual Machine qemu-41-instance-00000042.
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.147 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e55b8d2a-e2e6-477a-bb0f-740bab0287b6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.180 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[598a008a-b566-4178-92fb-c817824118a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.187 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[49903081-8519-4f6e-9d91-de0f5b0a4c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 NetworkManager[55210]: <info>  [1764399981.1890] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/134)
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.226 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2de05c19-b8f7-47c7-a169-e03f7c9d1618]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.231 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[9731832a-2bd5-4435-8631-1533291b0069]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 NetworkManager[55210]: <info>  [1764399981.2523] device (tap9226dea3-60): carrier: link connected
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.255 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0c58271c-6024-4550-8ded-1213581a57b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.277 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c06c1b-b3ca-4615-a9d0-570eeb5a3e34]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544413, 'reachable_time': 20138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226458, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.295 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d7aa2a8e-4fed-4648-8a62-a0a0f5d8b739]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 544413, 'tstamp': 544413}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226459, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.319 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bae928dc-9085-4688-8924-07e59daac6d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544413, 'reachable_time': 20138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226460, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.352 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[15247038-5fc9-4b8b-a87a-e170f8e7407b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.417 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[41c60491-35dd-4a8d-b6de-7a04b1167836]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.420 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.420 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.421 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.465 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:21 np0005539504 NetworkManager[55210]: <info>  [1764399981.4658] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Nov 29 02:06:21 np0005539504 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.469 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.470 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.471 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:21Z|00278|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.474 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.475 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[58500af7-7ac3-44ea-a1e6-90bcfa0d556e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.476 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:06:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:21.477 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.484 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.498 187156 DEBUG nova.compute.manager [req-e9a6c8e9-917f-4232-b778-19c4eef79ff3 req-375f8154-24e9-4afb-8b5b-bc55e60a3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.499 187156 DEBUG oslo_concurrency.lockutils [req-e9a6c8e9-917f-4232-b778-19c4eef79ff3 req-375f8154-24e9-4afb-8b5b-bc55e60a3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.499 187156 DEBUG oslo_concurrency.lockutils [req-e9a6c8e9-917f-4232-b778-19c4eef79ff3 req-375f8154-24e9-4afb-8b5b-bc55e60a3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.499 187156 DEBUG oslo_concurrency.lockutils [req-e9a6c8e9-917f-4232-b778-19c4eef79ff3 req-375f8154-24e9-4afb-8b5b-bc55e60a3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.500 187156 DEBUG nova.compute.manager [req-e9a6c8e9-917f-4232-b778-19c4eef79ff3 req-375f8154-24e9-4afb-8b5b-bc55e60a3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:21 np0005539504 nova_compute[187152]: 2025-11-29 07:06:21.500 187156 WARNING nova.compute.manager [req-e9a6c8e9-917f-4232-b778-19c4eef79ff3 req-375f8154-24e9-4afb-8b5b-bc55e60a3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Nov 29 02:06:21 np0005539504 podman[226492]: 2025-11-29 07:06:21.83365321 +0000 UTC m=+0.024025768 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:06:22 np0005539504 podman[226492]: 2025-11-29 07:06:22.166700264 +0000 UTC m=+0.357072802 container create 32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:06:22 np0005539504 systemd[1]: Started libpod-conmon-32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695.scope.
Nov 29 02:06:22 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:06:22 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc842b2649017d48f38e2a182ca538b2723b110f4a460a6ef353185e7fa6e04f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:06:22 np0005539504 podman[226492]: 2025-11-29 07:06:22.285803679 +0000 UTC m=+0.476176227 container init 32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 02:06:22 np0005539504 podman[226492]: 2025-11-29 07:06:22.298526881 +0000 UTC m=+0.488899449 container start 32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:06:22 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226507]: [NOTICE]   (226511) : New worker (226513) forked
Nov 29 02:06:22 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226507]: [NOTICE]   (226511) : Loading success.
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.456 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 9223f44a-297e-4db1-9f44-ee0694c4e258 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.456 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399982.455766, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.457 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.460 187156 DEBUG nova.compute.manager [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.467 187156 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance rebooted successfully.#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.468 187156 DEBUG nova.compute.manager [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.477 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.482 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.520 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.520 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764399982.4571786, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.521 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Started (Lifecycle Event)#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.547 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.552 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.573 187156 DEBUG oslo_concurrency.lockutils [None req-872056a0-59db-48ea-92bd-ffa30e7f53d3 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:22 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:22Z|00279|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:06:22 np0005539504 nova_compute[187152]: 2025-11-29 07:06:22.885 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:22.921 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:22.922 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:22.923 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.112 187156 DEBUG nova.compute.manager [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.113 187156 DEBUG oslo_concurrency.lockutils [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.114 187156 DEBUG oslo_concurrency.lockutils [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.114 187156 DEBUG oslo_concurrency.lockutils [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.115 187156 DEBUG nova.compute.manager [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.115 187156 WARNING nova.compute.manager [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.115 187156 DEBUG nova.compute.manager [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.116 187156 DEBUG oslo_concurrency.lockutils [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.116 187156 DEBUG oslo_concurrency.lockutils [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.117 187156 DEBUG oslo_concurrency.lockutils [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.117 187156 DEBUG nova.compute.manager [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.117 187156 WARNING nova.compute.manager [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.117 187156 DEBUG nova.compute.manager [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.118 187156 DEBUG oslo_concurrency.lockutils [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.118 187156 DEBUG oslo_concurrency.lockutils [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.118 187156 DEBUG oslo_concurrency.lockutils [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.119 187156 DEBUG nova.compute.manager [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.119 187156 WARNING nova.compute.manager [req-edd5857b-168b-4d4d-b9c1-68e1a9fe5a15 req-f6f4f445-9260-4448-8c21-7eace93e33aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:06:24 np0005539504 nova_compute[187152]: 2025-11-29 07:06:24.851 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:25 np0005539504 nova_compute[187152]: 2025-11-29 07:06:25.984 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:26Z|00280|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:06:26 np0005539504 nova_compute[187152]: 2025-11-29 07:06:26.191 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:26 np0005539504 podman[226530]: 2025-11-29 07:06:26.750374476 +0000 UTC m=+0.089226492 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:06:29 np0005539504 nova_compute[187152]: 2025-11-29 07:06:29.852 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:30 np0005539504 podman[226551]: 2025-11-29 07:06:30.732720045 +0000 UTC m=+0.074509777 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 29 02:06:30 np0005539504 nova_compute[187152]: 2025-11-29 07:06:30.987 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:33 np0005539504 nova_compute[187152]: 2025-11-29 07:06:33.118 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:34.428 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:34 np0005539504 nova_compute[187152]: 2025-11-29 07:06:34.428 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:34.432 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:06:34 np0005539504 nova_compute[187152]: 2025-11-29 07:06:34.855 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:35 np0005539504 nova_compute[187152]: 2025-11-29 07:06:35.096 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:35Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:06:35 np0005539504 nova_compute[187152]: 2025-11-29 07:06:35.706 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:35 np0005539504 nova_compute[187152]: 2025-11-29 07:06:35.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:38 np0005539504 nova_compute[187152]: 2025-11-29 07:06:38.136 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:38.436 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:38 np0005539504 nova_compute[187152]: 2025-11-29 07:06:38.822 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "059e8710-61a4-4010-ae59-3b17604e8cd2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:38 np0005539504 nova_compute[187152]: 2025-11-29 07:06:38.823 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:38 np0005539504 nova_compute[187152]: 2025-11-29 07:06:38.841 187156 DEBUG nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:06:38 np0005539504 nova_compute[187152]: 2025-11-29 07:06:38.969 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:38 np0005539504 nova_compute[187152]: 2025-11-29 07:06:38.970 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:38 np0005539504 nova_compute[187152]: 2025-11-29 07:06:38.981 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:06:38 np0005539504 nova_compute[187152]: 2025-11-29 07:06:38.982 187156 INFO nova.compute.claims [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.141 187156 DEBUG nova.compute.provider_tree [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.155 187156 DEBUG nova.scheduler.client.report [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.175 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.177 187156 DEBUG nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.251 187156 DEBUG nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.251 187156 DEBUG nova.network.neutron [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.268 187156 INFO nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.289 187156 DEBUG nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.390 187156 DEBUG nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.393 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.394 187156 INFO nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Creating image(s)#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.396 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "/var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.397 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "/var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.398 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "/var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.430 187156 DEBUG nova.policy [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '51cb6e189f2c4d608d37ef464a33540e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '68b64b8a24d349adb1e6e5d6822784fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.434 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.488 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.490 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.491 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.507 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.565 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.566 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.603 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.605 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.605 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.668 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.671 187156 DEBUG nova.virt.disk.api [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Checking if we can resize image /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.672 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.740 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.742 187156 DEBUG nova.virt.disk.api [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Cannot resize image /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.743 187156 DEBUG nova.objects.instance [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lazy-loading 'migration_context' on Instance uuid 059e8710-61a4-4010-ae59-3b17604e8cd2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.765 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.766 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Ensure instance console log exists: /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.767 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.768 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.769 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:39 np0005539504 nova_compute[187152]: 2025-11-29 07:06:39.857 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:40 np0005539504 nova_compute[187152]: 2025-11-29 07:06:40.350 187156 DEBUG nova.network.neutron [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Successfully created port: 11f546fc-7a81-4975-9415-d27b7f14bd30 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:06:40 np0005539504 podman[226594]: 2025-11-29 07:06:40.74873073 +0000 UTC m=+0.086477408 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:06:40 np0005539504 nova_compute[187152]: 2025-11-29 07:06:40.993 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:41 np0005539504 nova_compute[187152]: 2025-11-29 07:06:41.386 187156 DEBUG nova.network.neutron [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Successfully updated port: 11f546fc-7a81-4975-9415-d27b7f14bd30 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:06:41 np0005539504 nova_compute[187152]: 2025-11-29 07:06:41.414 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "refresh_cache-059e8710-61a4-4010-ae59-3b17604e8cd2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:06:41 np0005539504 nova_compute[187152]: 2025-11-29 07:06:41.415 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquired lock "refresh_cache-059e8710-61a4-4010-ae59-3b17604e8cd2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:06:41 np0005539504 nova_compute[187152]: 2025-11-29 07:06:41.415 187156 DEBUG nova.network.neutron [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:06:41 np0005539504 nova_compute[187152]: 2025-11-29 07:06:41.480 187156 DEBUG nova.compute.manager [req-e730b8ea-c58c-42fe-ab43-de42093b2218 req-9cd41f96-3961-490d-acec-decabc0da31f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Received event network-changed-11f546fc-7a81-4975-9415-d27b7f14bd30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:41 np0005539504 nova_compute[187152]: 2025-11-29 07:06:41.480 187156 DEBUG nova.compute.manager [req-e730b8ea-c58c-42fe-ab43-de42093b2218 req-9cd41f96-3961-490d-acec-decabc0da31f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Refreshing instance network info cache due to event network-changed-11f546fc-7a81-4975-9415-d27b7f14bd30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:06:41 np0005539504 nova_compute[187152]: 2025-11-29 07:06:41.481 187156 DEBUG oslo_concurrency.lockutils [req-e730b8ea-c58c-42fe-ab43-de42093b2218 req-9cd41f96-3961-490d-acec-decabc0da31f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-059e8710-61a4-4010-ae59-3b17604e8cd2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:06:41 np0005539504 nova_compute[187152]: 2025-11-29 07:06:41.645 187156 DEBUG nova.network.neutron [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:06:42 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.864 187156 DEBUG nova.network.neutron [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Updating instance_info_cache with network_info: [{"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:42 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.980 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Releasing lock "refresh_cache-059e8710-61a4-4010-ae59-3b17604e8cd2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:06:42 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.981 187156 DEBUG nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Instance network_info: |[{"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:06:42 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.981 187156 DEBUG oslo_concurrency.lockutils [req-e730b8ea-c58c-42fe-ab43-de42093b2218 req-9cd41f96-3961-490d-acec-decabc0da31f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-059e8710-61a4-4010-ae59-3b17604e8cd2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:06:42 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.982 187156 DEBUG nova.network.neutron [req-e730b8ea-c58c-42fe-ab43-de42093b2218 req-9cd41f96-3961-490d-acec-decabc0da31f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Refreshing network info cache for port 11f546fc-7a81-4975-9415-d27b7f14bd30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:06:42 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.985 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Start _get_guest_xml network_info=[{"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:06:42 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.990 187156 WARNING nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:06:42 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.995 187156 DEBUG nova.virt.libvirt.host [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:06:42 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.996 187156 DEBUG nova.virt.libvirt.host [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:42.999 187156 DEBUG nova.virt.libvirt.host [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.000 187156 DEBUG nova.virt.libvirt.host [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.001 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.002 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.003 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.003 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.003 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.004 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.004 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.004 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.004 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.005 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.005 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.005 187156 DEBUG nova.virt.hardware [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.009 187156 DEBUG nova.virt.libvirt.vif [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:06:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-1891149216',display_name='tempest-InstanceActionsNegativeTestJSON-server-1891149216',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-1891149216',id=78,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='68b64b8a24d349adb1e6e5d6822784fc',ramdisk_id='',reservation_id='r-xd3dr0rb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1535726508',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1535726508-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:06:39Z,user_data=None,user_id='51cb6e189f2c4d608d37ef464a33540e',uuid=059e8710-61a4-4010-ae59-3b17604e8cd2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.010 187156 DEBUG nova.network.os_vif_util [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Converting VIF {"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.011 187156 DEBUG nova.network.os_vif_util [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:71:06,bridge_name='br-int',has_traffic_filtering=True,id=11f546fc-7a81-4975-9415-d27b7f14bd30,network=Network(b22db185-b399-445e-82e5-2a4ef0dadc29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11f546fc-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.012 187156 DEBUG nova.objects.instance [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 059e8710-61a4-4010-ae59-3b17604e8cd2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.033 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <uuid>059e8710-61a4-4010-ae59-3b17604e8cd2</uuid>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <name>instance-0000004e</name>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <nova:name>tempest-InstanceActionsNegativeTestJSON-server-1891149216</nova:name>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:06:42</nova:creationTime>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:        <nova:user uuid="51cb6e189f2c4d608d37ef464a33540e">tempest-InstanceActionsNegativeTestJSON-1535726508-project-member</nova:user>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:        <nova:project uuid="68b64b8a24d349adb1e6e5d6822784fc">tempest-InstanceActionsNegativeTestJSON-1535726508</nova:project>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:        <nova:port uuid="11f546fc-7a81-4975-9415-d27b7f14bd30">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <entry name="serial">059e8710-61a4-4010-ae59-3b17604e8cd2</entry>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <entry name="uuid">059e8710-61a4-4010-ae59-3b17604e8cd2</entry>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk.config"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:6b:71:06"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <target dev="tap11f546fc-7a"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/console.log" append="off"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:06:43 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:06:43 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:06:43 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:06:43 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.035 187156 DEBUG nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Preparing to wait for external event network-vif-plugged-11f546fc-7a81-4975-9415-d27b7f14bd30 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.036 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.036 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.036 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.037 187156 DEBUG nova.virt.libvirt.vif [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:06:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-1891149216',display_name='tempest-InstanceActionsNegativeTestJSON-server-1891149216',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-1891149216',id=78,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='68b64b8a24d349adb1e6e5d6822784fc',ramdisk_id='',reservation_id='r-xd3dr0rb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1535726508',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1535726508-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:06:39Z,user_data=None,user_id='51cb6e189f2c4d608d37ef464a33540e',uuid=059e8710-61a4-4010-ae59-3b17604e8cd2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.037 187156 DEBUG nova.network.os_vif_util [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Converting VIF {"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.037 187156 DEBUG nova.network.os_vif_util [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:71:06,bridge_name='br-int',has_traffic_filtering=True,id=11f546fc-7a81-4975-9415-d27b7f14bd30,network=Network(b22db185-b399-445e-82e5-2a4ef0dadc29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11f546fc-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.038 187156 DEBUG os_vif [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:71:06,bridge_name='br-int',has_traffic_filtering=True,id=11f546fc-7a81-4975-9415-d27b7f14bd30,network=Network(b22db185-b399-445e-82e5-2a4ef0dadc29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11f546fc-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.038 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.039 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.039 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.044 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.044 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11f546fc-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.044 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap11f546fc-7a, col_values=(('external_ids', {'iface-id': '11f546fc-7a81-4975-9415-d27b7f14bd30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:71:06', 'vm-uuid': '059e8710-61a4-4010-ae59-3b17604e8cd2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.045 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:43 np0005539504 NetworkManager[55210]: <info>  [1764400003.0475] manager: (tap11f546fc-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.048 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.054 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.055 187156 INFO os_vif [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:71:06,bridge_name='br-int',has_traffic_filtering=True,id=11f546fc-7a81-4975-9415-d27b7f14bd30,network=Network(b22db185-b399-445e-82e5-2a4ef0dadc29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11f546fc-7a')#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.146 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.147 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.147 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] No VIF found with MAC fa:16:3e:6b:71:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.148 187156 INFO nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Using config drive#033[00m
Nov 29 02:06:43 np0005539504 podman[226617]: 2025-11-29 07:06:43.16092923 +0000 UTC m=+0.064318012 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:06:43 np0005539504 podman[226619]: 2025-11-29 07:06:43.163570461 +0000 UTC m=+0.067628971 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, vcs-type=git, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.541 187156 INFO nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Creating config drive at /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk.config#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.548 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3hy2b8hd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.676 187156 DEBUG oslo_concurrency.processutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3hy2b8hd" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:43 np0005539504 kernel: tap11f546fc-7a: entered promiscuous mode
Nov 29 02:06:43 np0005539504 NetworkManager[55210]: <info>  [1764400003.7507] manager: (tap11f546fc-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/137)
Nov 29 02:06:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:43Z|00281|binding|INFO|Claiming lport 11f546fc-7a81-4975-9415-d27b7f14bd30 for this chassis.
Nov 29 02:06:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:43Z|00282|binding|INFO|11f546fc-7a81-4975-9415-d27b7f14bd30: Claiming fa:16:3e:6b:71:06 10.100.0.3
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.752 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:43Z|00283|binding|INFO|Setting lport 11f546fc-7a81-4975-9415-d27b7f14bd30 ovn-installed in OVS
Nov 29 02:06:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:43Z|00284|binding|INFO|Setting lport 11f546fc-7a81-4975-9415-d27b7f14bd30 up in Southbound
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.766 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:71:06 10.100.0.3'], port_security=['fa:16:3e:6b:71:06 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '059e8710-61a4-4010-ae59-3b17604e8cd2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b22db185-b399-445e-82e5-2a4ef0dadc29', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b64b8a24d349adb1e6e5d6822784fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0bb8ff1f-bd55-4198-983f-c7e3cc1b99bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f42a398-69d8-451c-8a2e-7c3d2083ff31, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=11f546fc-7a81-4975-9415-d27b7f14bd30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:43 np0005539504 nova_compute[187152]: 2025-11-29 07:06:43.768 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.768 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 11f546fc-7a81-4975-9415-d27b7f14bd30 in datapath b22db185-b399-445e-82e5-2a4ef0dadc29 bound to our chassis#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.769 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b22db185-b399-445e-82e5-2a4ef0dadc29#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.783 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[75bfebc1-6bb0-417e-a4c3-3a2fc844cb6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.785 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb22db185-b1 in ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.787 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb22db185-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.787 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[314d7244-af8b-4f5c-9358-6abcb20f9f1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 systemd-udevd[226677]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:06:43 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.790 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0ace3a08-d2c9-4378-b23b-1d892715e15f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:06:43 np0005539504 systemd-machined[153423]: New machine qemu-42-instance-0000004e.
Nov 29 02:06:43 np0005539504 NetworkManager[55210]: <info>  [1764400003.8041] device (tap11f546fc-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:06:43 np0005539504 NetworkManager[55210]: <info>  [1764400003.8050] device (tap11f546fc-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.808 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[086cddd5-c9a8-4c40-b5b6-cdfc189c008f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 systemd[1]: Started Virtual Machine qemu-42-instance-0000004e.
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.836 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c033a19a-6a12-4d40-bebe-db2c3fc76a51]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.875 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[b37509eb-4fa8-4a0d-ba39-de4766f23dc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.881 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3edf55d8-243d-4748-a72c-407b7d032be0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 NetworkManager[55210]: <info>  [1764400003.8826] manager: (tapb22db185-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/138)
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.919 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[aa2ba398-0e16-4a97-8bbf-3a3dcac1f22d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.923 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2eef9b35-5512-4934-84ce-627f50483376]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 NetworkManager[55210]: <info>  [1764400003.9529] device (tapb22db185-b0): carrier: link connected
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.961 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[66f1c2da-6ffa-42fa-9713-eee2489249ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.978 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c8c0f239-f32d-45b7-b438-50da822046b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb22db185-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:bf:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546683, 'reachable_time': 37814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226711, 'error': None, 'target': 'ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:43.996 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[77c001dc-041e-4472-93dc-e1c473d6534d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee3:bfb0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546683, 'tstamp': 546683}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226712, 'error': None, 'target': 'ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.017 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1be9ca05-6a00-4828-a832-1e3be01439f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb22db185-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:bf:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546683, 'reachable_time': 37814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226713, 'error': None, 'target': 'ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.058 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1673ddd8-36c2-4b6c-9198-32752ffec928]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.128 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0064b8-4499-4551-a1f9-95286a90b667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.129 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb22db185-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.129 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.130 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb22db185-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:44 np0005539504 kernel: tapb22db185-b0: entered promiscuous mode
Nov 29 02:06:44 np0005539504 NetworkManager[55210]: <info>  [1764400004.1321] manager: (tapb22db185-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.131 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.139 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb22db185-b0, col_values=(('external_ids', {'iface-id': '1a3a07c5-273a-40c2-85e2-a761bd171efa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.140 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:44Z|00285|binding|INFO|Releasing lport 1a3a07c5-273a-40c2-85e2-a761bd171efa from this chassis (sb_readonly=0)
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.141 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b22db185-b399-445e-82e5-2a4ef0dadc29.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b22db185-b399-445e-82e5-2a4ef0dadc29.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.142 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2fbb211e-1afc-4834-bd7e-04959f44df6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.143 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-b22db185-b399-445e-82e5-2a4ef0dadc29
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/b22db185-b399-445e-82e5-2a4ef0dadc29.pid.haproxy
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID b22db185-b399-445e-82e5-2a4ef0dadc29
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:06:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:44.143 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29', 'env', 'PROCESS_TAG=haproxy-b22db185-b399-445e-82e5-2a4ef0dadc29', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b22db185-b399-445e-82e5-2a4ef0dadc29.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.152 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.185 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400004.1835866, 059e8710-61a4-4010-ae59-3b17604e8cd2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.185 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] VM Started (Lifecycle Event)#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.273 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.280 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400004.1840856, 059e8710-61a4-4010-ae59-3b17604e8cd2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.280 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.307 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.312 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.356 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:06:44 np0005539504 podman[226752]: 2025-11-29 07:06:44.564227677 +0000 UTC m=+0.056322386 container create 137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.584 187156 DEBUG nova.network.neutron [req-e730b8ea-c58c-42fe-ab43-de42093b2218 req-9cd41f96-3961-490d-acec-decabc0da31f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Updated VIF entry in instance network info cache for port 11f546fc-7a81-4975-9415-d27b7f14bd30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.585 187156 DEBUG nova.network.neutron [req-e730b8ea-c58c-42fe-ab43-de42093b2218 req-9cd41f96-3961-490d-acec-decabc0da31f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Updating instance_info_cache with network_info: [{"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.602 187156 DEBUG oslo_concurrency.lockutils [req-e730b8ea-c58c-42fe-ab43-de42093b2218 req-9cd41f96-3961-490d-acec-decabc0da31f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-059e8710-61a4-4010-ae59-3b17604e8cd2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:06:44 np0005539504 systemd[1]: Started libpod-conmon-137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7.scope.
Nov 29 02:06:44 np0005539504 podman[226752]: 2025-11-29 07:06:44.535973177 +0000 UTC m=+0.028067906 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:06:44 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:06:44 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c705506d0a5e079aaa64396d6e046efe281262a51fdbeb8b3ab415ab2c6816d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:06:44 np0005539504 podman[226752]: 2025-11-29 07:06:44.65461093 +0000 UTC m=+0.146705669 container init 137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:06:44 np0005539504 podman[226752]: 2025-11-29 07:06:44.660682633 +0000 UTC m=+0.152777342 container start 137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:06:44 np0005539504 neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29[226767]: [NOTICE]   (226771) : New worker (226773) forked
Nov 29 02:06:44 np0005539504 neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29[226767]: [NOTICE]   (226771) : Loading success.
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.703 187156 DEBUG nova.compute.manager [req-62d5d1b9-f671-42cb-ab13-b80c6c552db9 req-fee70dac-0ddd-4e1e-b07a-0be51344ec21 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Received event network-vif-plugged-11f546fc-7a81-4975-9415-d27b7f14bd30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.703 187156 DEBUG oslo_concurrency.lockutils [req-62d5d1b9-f671-42cb-ab13-b80c6c552db9 req-fee70dac-0ddd-4e1e-b07a-0be51344ec21 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.703 187156 DEBUG oslo_concurrency.lockutils [req-62d5d1b9-f671-42cb-ab13-b80c6c552db9 req-fee70dac-0ddd-4e1e-b07a-0be51344ec21 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.704 187156 DEBUG oslo_concurrency.lockutils [req-62d5d1b9-f671-42cb-ab13-b80c6c552db9 req-fee70dac-0ddd-4e1e-b07a-0be51344ec21 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.704 187156 DEBUG nova.compute.manager [req-62d5d1b9-f671-42cb-ab13-b80c6c552db9 req-fee70dac-0ddd-4e1e-b07a-0be51344ec21 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Processing event network-vif-plugged-11f546fc-7a81-4975-9415-d27b7f14bd30 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.704 187156 DEBUG nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.708 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400004.7087035, 059e8710-61a4-4010-ae59-3b17604e8cd2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.709 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.711 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.714 187156 INFO nova.virt.libvirt.driver [-] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Instance spawned successfully.#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.714 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.738 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.743 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.743 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.744 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.745 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.745 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.745 187156 DEBUG nova.virt.libvirt.driver [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.752 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:44 np0005539504 nova_compute[187152]: 2025-11-29 07:06:44.859 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:45 np0005539504 nova_compute[187152]: 2025-11-29 07:06:45.153 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:06:45 np0005539504 nova_compute[187152]: 2025-11-29 07:06:45.188 187156 INFO nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Took 5.80 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:06:45 np0005539504 nova_compute[187152]: 2025-11-29 07:06:45.189 187156 DEBUG nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:45 np0005539504 nova_compute[187152]: 2025-11-29 07:06:45.311 187156 INFO nova.compute.manager [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Took 6.38 seconds to build instance.#033[00m
Nov 29 02:06:45 np0005539504 nova_compute[187152]: 2025-11-29 07:06:45.331 187156 DEBUG oslo_concurrency.lockutils [None req-ff6d5023-2e87-4b9f-9cfe-21a9eea932ed 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.507s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.249 187156 DEBUG nova.compute.manager [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Received event network-vif-plugged-11f546fc-7a81-4975-9415-d27b7f14bd30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.250 187156 DEBUG oslo_concurrency.lockutils [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.250 187156 DEBUG oslo_concurrency.lockutils [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.250 187156 DEBUG oslo_concurrency.lockutils [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.250 187156 DEBUG nova.compute.manager [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] No waiting events found dispatching network-vif-plugged-11f546fc-7a81-4975-9415-d27b7f14bd30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.250 187156 WARNING nova.compute.manager [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Received unexpected event network-vif-plugged-11f546fc-7a81-4975-9415-d27b7f14bd30 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.570 187156 DEBUG oslo_concurrency.lockutils [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "059e8710-61a4-4010-ae59-3b17604e8cd2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.571 187156 DEBUG oslo_concurrency.lockutils [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.571 187156 DEBUG oslo_concurrency.lockutils [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.572 187156 DEBUG oslo_concurrency.lockutils [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.572 187156 DEBUG oslo_concurrency.lockutils [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.584 187156 INFO nova.compute.manager [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Terminating instance#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.597 187156 DEBUG nova.compute.manager [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:06:47 np0005539504 kernel: tap11f546fc-7a (unregistering): left promiscuous mode
Nov 29 02:06:47 np0005539504 NetworkManager[55210]: <info>  [1764400007.6298] device (tap11f546fc-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:06:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:47Z|00286|binding|INFO|Releasing lport 11f546fc-7a81-4975-9415-d27b7f14bd30 from this chassis (sb_readonly=0)
Nov 29 02:06:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:47Z|00287|binding|INFO|Setting lport 11f546fc-7a81-4975-9415-d27b7f14bd30 down in Southbound
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.638 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:47Z|00288|binding|INFO|Removing iface tap11f546fc-7a ovn-installed in OVS
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.641 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.652 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:47.651 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:71:06 10.100.0.3'], port_security=['fa:16:3e:6b:71:06 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '059e8710-61a4-4010-ae59-3b17604e8cd2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b22db185-b399-445e-82e5-2a4ef0dadc29', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b64b8a24d349adb1e6e5d6822784fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0bb8ff1f-bd55-4198-983f-c7e3cc1b99bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f42a398-69d8-451c-8a2e-7c3d2083ff31, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=11f546fc-7a81-4975-9415-d27b7f14bd30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:47.653 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 11f546fc-7a81-4975-9415-d27b7f14bd30 in datapath b22db185-b399-445e-82e5-2a4ef0dadc29 unbound from our chassis#033[00m
Nov 29 02:06:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:47.655 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b22db185-b399-445e-82e5-2a4ef0dadc29, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:06:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:47.656 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2f03a7-16cf-4a4a-986e-d7b1b3fefda0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:47.657 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29 namespace which is not needed anymore#033[00m
Nov 29 02:06:47 np0005539504 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Nov 29 02:06:47 np0005539504 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000004e.scope: Consumed 3.314s CPU time.
Nov 29 02:06:47 np0005539504 systemd-machined[153423]: Machine qemu-42-instance-0000004e terminated.
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.873 187156 INFO nova.virt.libvirt.driver [-] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Instance destroyed successfully.#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.874 187156 DEBUG nova.objects.instance [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lazy-loading 'resources' on Instance uuid 059e8710-61a4-4010-ae59-3b17604e8cd2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:47 np0005539504 nova_compute[187152]: 2025-11-29 07:06:47.935 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:47.967 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '059e8710-61a4-4010-ae59-3b17604e8cd2', 'name': 'tempest-InstanceActionsNegativeTestJSON-server-1891149216', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '68b64b8a24d349adb1e6e5d6822784fc', 'user_id': '51cb6e189f2c4d608d37ef464a33540e', 'hostId': '02529cfae4a2300f6975229f5667585c9a7686d01f2ca559d16c9ddc', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:47.971 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'name': 'tempest-ServerActionsTestJSON-server-664171356', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000042', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6e6c366001df43fb91731faf7a9578fc', 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'hostId': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:47.971 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:47.973 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:47.999 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.requests volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.000 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '498b15bd-5905-4446-9dcd-9ec8cec4a514', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 28, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:06:47.971772', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8b0c36e-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': 'f9aeb05566d739d0cd91569b76e921f3a9ad636ca4c0018a1a8b7ba0b49565cc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:06:47.971772', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8b0d7c8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': 'b1d7d0eaa6b5c66c62e2ecae7a1601b237bcaa712c4bf31f30f605b91bc72c1a'}]}, 'timestamp': '2025-11-29 07:06:48.001095', '_unique_id': '8b6194c657ad46a59121008c58e52aee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.003 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.005 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.006 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.009 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '519d25a7-3405-4785-9d91-9c034913573b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.005203', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8b22d08-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': '3894ee17d0939c482cc37f025e8c1b621ca33f297ce05d0990434dfd5f970b34'}]}, 'timestamp': '2025-11-29 07:06:48.009833', '_unique_id': '87f159dc57014b5bb28cc409b1f1d675'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.010 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.011 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.012 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.012 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.latency volume: 25392819 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.012 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27f6f2e2-6a1a-4cc4-928a-0926f57bed2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25392819, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:06:48.011653', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8b29bb2-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': '03d6630b7d0e3d238cc1a1fb5d51911efe55dad118ee929241d1f55fb22cc386'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:06:48.011653', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8b2a738-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': 'ef29006eb7f3ec2fc14219cf19614092a90e7cd9ab042e5b015dc21d9abf5cc2'}]}, 'timestamp': '2025-11-29 07:06:48.012917', '_unique_id': 'c4ce624f1af945c393b52e66cd7a112c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.013 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.014 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.014 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-InstanceActionsNegativeTestJSON-server-1891149216>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-InstanceActionsNegativeTestJSON-server-1891149216>]
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.015 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.015 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.bytes volume: 31922176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.016 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d8928ec-cd71-42c3-95fd-00c520cc1062', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31922176, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:06:48.015009', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8b32370-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': 'be0e2048a99060c31294fea91a651b0efe85767e658586337dbe3450a8952df2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:06:48.015009', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8b32ed8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': 'c19ce4afc948955df5fe43f51369534091bcfd9d516cd9cc73c3a628be24b660'}]}, 'timestamp': '2025-11-29 07:06:48.016427', '_unique_id': '7963ef84af6e492dbf36f4147e7674b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.017 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.018 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.018 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.bytes volume: 1278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '226706e9-d94b-433f-acc2-444738cab65b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1278, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.017993', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8b3908a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': 'a53ba4c39df5eeec5ee7eeb4791ee1fa3ba8acecc3d6b362de859d7e5d6abdb3'}]}, 'timestamp': '2025-11-29 07:06:48.018909', '_unique_id': 'b5d82ce1a8044f08ab73d52c530765b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.019 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.020 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.020 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.021 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.requests volume: 1177 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.021 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04960b88-2134-48ed-8e34-d83b6405f746', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1177, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:06:48.020437', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8b3ef94-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': '49047ba5491ab634ef1fa9a5c8ffb45041cf941a5dd23bd2ee79ea637b9d2329'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:06:48.020437', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8b3fbec-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': '230d5bcf3b4f5de1d73ef8142fe96897c7357986b34e0b832411957c6a8e419b'}]}, 'timestamp': '2025-11-29 07:06:48.021639', '_unique_id': 'cfa829b884c740d9a1c5e7323713d779'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.022 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.023 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.023 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.023 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bad7124-944b-4394-8cd9-05b43148a4f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.023150', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8b45b8c-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': 'c31d5c2407add04ccea44839cef489ec3bdf43c461e512d8efb38d264b808c18'}]}, 'timestamp': '2025-11-29 07:06:48.024102', '_unique_id': 'c58f9bfd4a534466a583d2d06f81d134'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.024 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.025 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.026 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.026 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.latency volume: 248328818 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.026 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.read.latency volume: 22127175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f1b4c47-23e3-4911-a3c5-c4768df6df27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 248328818, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:06:48.025624', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8b4b9e2-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': 'e1692ff6d16beb029c529e9b6db8d10b2b7dc2dc0e17554e75a640de70913a66'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22127175, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:06:48.025624', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8b4c658-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': '2c21e6ec601dd382deb715cb6fadd35322d12fd8baf048d53efda992ccb86a72'}]}, 'timestamp': '2025-11-29 07:06:48.026820', '_unique_id': '21ff05bf7e3642cdbf4526e752f0e609'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.027 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.028 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.038 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.usage volume: 30474240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.038 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e96c231-0553-4fcb-ab23-ea4e9fef644d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30474240, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:06:48.028332', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8b69bcc-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.963619906, 'message_signature': '04bb278e5753972862005fdb46fa3f2235f1d8bc6c12895a81f9c4fd3a6733a9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:06:48.028332', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8b6a7f2-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.963619906, 'message_signature': '2b105fff065947f6b8e7892eac3c9e8384360aa80ef21c837fe9d0622fac5a9c'}]}, 'timestamp': '2025-11-29 07:06:48.039155', '_unique_id': '4ab8b95601834dc2bd0cae1512aed3b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.040 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.041 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.041 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.bytes volume: 278528 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.041 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06e7f9d3-c79f-442e-8e9b-8fab2d602c7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 278528, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:06:48.040984', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8b715ac-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': '45bc907f4692214e252fc027a4818ce772d34b6f67e5b8b1cc3fb480185e797e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 
'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:06:48.040984', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8b720ec-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.90804571, 'message_signature': '3b532c838983f53fc0e641c28e12738fc73633d57b188183b2a1be5112097c83'}]}, 'timestamp': '2025-11-29 07:06:48.042247', '_unique_id': 'cc72f9ce92c4469f90504d7447f89e1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.043 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.044 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.044 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.allocation volume: 31334400 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.044 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33fe73c2-a5dd-43d9-86f9-3495c9b95896', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31334400, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:06:48.043815', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8b78744-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.963619906, 'message_signature': 'd20d7b2408281ef42723ba32cded6611132265fe0d371b06c31401d324f59720'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 
'9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:06:48.043815', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8b79266-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.963619906, 'message_signature': '22cb52251647f9a4623b68a959b22a620afa091735806bc48aa78231acdf7346'}]}, 'timestamp': '2025-11-29 07:06:48.045150', '_unique_id': 'f34d079709e74f19bf844921db767cbc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.046 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.045 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.047 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.047 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a0b1d47-5f86-4170-bded-e0d1b60e6ecc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.047086', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8b80ae8-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': 'bbbc59229669ec63fc68109021716302dd943bfc2574d3ab829ffa5d8eaf5d3f'}]}, 'timestamp': '2025-11-29 07:06:48.048260', '_unique_id': '7f727493547c4c16944889b6e3d45dee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.048 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.049 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.049 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-InstanceActionsNegativeTestJSON-server-1891149216>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-InstanceActionsNegativeTestJSON-server-1891149216>]
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.050 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.050 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ac2a258-ee6e-4416-ae6d-4997d44d828d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.050224', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8b87e06-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': '5aecf391b1cbb58272edcff85170b2a8b0d7e2702ce753f5955714a15da2698e'}]}, 'timestamp': '2025-11-29 07:06:48.051206', '_unique_id': 'c35afb7d9c6349c2959d1a90ef2cbcb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.053 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.053 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '249e74a0-b9e5-4b24-9a47-75c9aa63d814', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.052754', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8b8dfae-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': 'fa792f3d93bd9fc93e1de266dd198f8c72ff597b341a131f43e066af491b77dc'}]}, 'timestamp': '2025-11-29 07:06:48.053703', '_unique_id': '3f120b1379d54a1c8832602221b3867b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.055 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.055 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.055 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-InstanceActionsNegativeTestJSON-server-1891149216>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-InstanceActionsNegativeTestJSON-server-1891149216>]
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.055 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.056 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.056 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88b840a9-1a04-4004-a3f3-0b554db64000', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.055593', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8b94ca0-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': '9aa682df2a3d2ffd717fddbf0d78fb63a57ca8a32fc5bc20e493767f7d7509e1'}]}, 'timestamp': '2025-11-29 07:06:48.056531', '_unique_id': '90e19601638340fcb69efa9ede033328'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.058 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.058 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-InstanceActionsNegativeTestJSON-server-1891149216>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-InstanceActionsNegativeTestJSON-server-1891149216>]
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.058 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.059 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.059 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.059 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a77a8262-f9b1-4d5a-b05d-3b493d118c3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-vda', 'timestamp': '2025-11-29T07:06:48.058399', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8b9c0f4-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.963619906, 'message_signature': '50664e1a1ffd3d05092799acb673d445fac634511cec87e7d941c9d68d87e5e4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258-sda', 'timestamp': '2025-11-29T07:06:48.058399', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8b9cde2-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.963619906, 'message_signature': '3cbf6324d868b3f5f8427ae990f48f0e503353fd655566855aa42a355c3e8ca1'}]}, 'timestamp': '2025-11-29 07:06:48.059782', '_unique_id': '13c8bd03420f4be199e256d98143f402'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.061 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.bytes volume: 1357 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39ec25ba-940d-4c9d-a3cd-eb123356575c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1357, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.061313', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8ba2fda-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': '16d191adab732e0223d164f2541c5e55933d13b87bb1e19a8374d2774e1fb252'}]}, 'timestamp': '2025-11-29 07:06:48.062306', '_unique_id': 'b95b7c7b99e34bd1821e78b247d99020'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.062 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.063 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.064 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29[226767]: [NOTICE]   (226771) : haproxy version is 2.8.14-c23fe91
Nov 29 02:06:48 np0005539504 neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29[226767]: [NOTICE]   (226771) : path to executable is /usr/sbin/haproxy
Nov 29 02:06:48 np0005539504 neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29[226767]: [WARNING]  (226771) : Exiting Master process...
Nov 29 02:06:48 np0005539504 neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29[226767]: [WARNING]  (226771) : Exiting Master process...
Nov 29 02:06:48 np0005539504 neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29[226767]: [ALERT]    (226771) : Current worker (226773) exited with code 143 (Terminated)
Nov 29 02:06:48 np0005539504 neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29[226767]: [WARNING]  (226771) : All workers exited. Exiting... (0)
Nov 29 02:06:48 np0005539504 systemd[1]: libpod-137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7.scope: Deactivated successfully.
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.076 187156 DEBUG nova.virt.libvirt.vif [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:06:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-1891149216',display_name='tempest-InstanceActionsNegativeTestJSON-server-1891149216',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-1891149216',id=78,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:06:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='68b64b8a24d349adb1e6e5d6822784fc',ramdisk_id='',reservation_id='r-xd3dr0rb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1535726508',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1535726508-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:06:45Z,user_data=None,user_id='51cb6e189f2c4d608d37ef464a33540e',uuid=059e8710-61a4-4010-ae59-3b17604e8cd2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.076 187156 DEBUG nova.network.os_vif_util [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Converting VIF {"id": "11f546fc-7a81-4975-9415-d27b7f14bd30", "address": "fa:16:3e:6b:71:06", "network": {"id": "b22db185-b399-445e-82e5-2a4ef0dadc29", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-322267650-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "68b64b8a24d349adb1e6e5d6822784fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11f546fc-7a", "ovs_interfaceid": "11f546fc-7a81-4975-9415-d27b7f14bd30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.077 187156 DEBUG nova.network.os_vif_util [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:71:06,bridge_name='br-int',has_traffic_filtering=True,id=11f546fc-7a81-4975-9415-d27b7f14bd30,network=Network(b22db185-b399-445e-82e5-2a4ef0dadc29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11f546fc-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.078 187156 DEBUG os_vif [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:71:06,bridge_name='br-int',has_traffic_filtering=True,id=11f546fc-7a81-4975-9415-d27b7f14bd30,network=Network(b22db185-b399-445e-82e5-2a4ef0dadc29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11f546fc-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.081 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.082 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11f546fc-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.083 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:48 np0005539504 podman[226806]: 2025-11-29 07:06:48.08412946 +0000 UTC m=+0.320821666 container died 137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.086 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.087 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/memory.usage volume: 42.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.088 187156 INFO os_vif [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:71:06,bridge_name='br-int',has_traffic_filtering=True,id=11f546fc-7a81-4975-9415-d27b7f14bd30,network=Network(b22db185-b399-445e-82e5-2a4ef0dadc29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11f546fc-7a')#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.089 187156 INFO nova.virt.libvirt.driver [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Deleting instance files /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2_del#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.089 187156 INFO nova.virt.libvirt.driver [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Deletion of /var/lib/nova/instances/059e8710-61a4-4010-ae59-3b17604e8cd2_del complete#033[00m
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d26f13b-072d-459a-86d9-32610aa0a0f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.35546875, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'timestamp': '2025-11-29T07:06:48.063875', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f8be2306-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5471.021990057, 'message_signature': '4b0af5ed891cb4c07b937c53995addd111153de4d88c76bb548b7181d450d530'}]}, 'timestamp': '2025-11-29 07:06:48.088337', '_unique_id': '745c6b4aaba841efba9f28055fe99e82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.093 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.094 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.incoming.bytes.delta volume: 1267 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4ce5931-ef4e-4b8c-b53e-7b6ee917b03b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1267, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.092814', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8bf23fa-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': '745630e295656f88173ecd7fe080ec655350d4188ee1d2a909cbc7b85cd32ebe'}]}, 'timestamp': '2025-11-29 07:06:48.094878', '_unique_id': '24f0c77c026c4933bf367b00e863bc69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.099 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.099 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/cpu volume: 11780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3277f4c4-85c3-4d4a-ba18-4c4a26d27071', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11780000000, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'timestamp': '2025-11-29T07:06:48.098334', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'instance-00000042', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f8bff316-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5471.021990057, 'message_signature': 'def7b278081182a9703b079e5767d68bf4a20a375794e749b0f7fef75e0cde49'}]}, 'timestamp': '2025-11-29 07:06:48.100159', '_unique_id': '236bce3065164ec3b67f1624ffe400dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.105 12 DEBUG ceilometer.compute.pollsters [-] Instance 059e8710-61a4-4010-ae59-3b17604e8cd2 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000004e, id=059e8710-61a4-4010-ae59-3b17604e8cd2>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.105 12 DEBUG ceilometer.compute.pollsters [-] 9223f44a-297e-4db1-9f44-ee0694c4e258/network.outgoing.bytes.delta volume: 1278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da49b4a8-f160-42bb-baab-3023c803c293', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1278, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-00000042-9223f44a-297e-4db1-9f44-ee0694c4e258-tapb7078e73-f0', 'timestamp': '2025-11-29T07:06:48.104824', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-664171356', 'name': 'tapb7078e73-f0', 'instance_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'instance_type': 'm1.nano', 'host': '6a46bc7c19e93a4f3d1f615b51fea40a81bcd66aedb7f12641fd55ac', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1e:a3:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb7078e73-f0'}, 'message_id': 'f8c0e05a-ccf1-11f0-8a11-fa163ea726b4', 'monotonic_time': 5470.940868374, 'message_signature': 'fcb107702f4e1370d88db8f43f8749ba62a4b1b00ff5e84587aaf2e261f398db'}]}, 'timestamp': '2025-11-29 07:06:48.106192', '_unique_id': '4f68523d40f340638e930a1cb54a6dbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:06:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7-userdata-shm.mount: Deactivated successfully.
Nov 29 02:06:48 np0005539504 systemd[1]: var-lib-containers-storage-overlay-c705506d0a5e079aaa64396d6e046efe281262a51fdbeb8b3ab415ab2c6816d7-merged.mount: Deactivated successfully.
Nov 29 02:06:48 np0005539504 podman[226806]: 2025-11-29 07:06:48.302924009 +0000 UTC m=+0.539616225 container cleanup 137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:06:48 np0005539504 systemd[1]: libpod-conmon-137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7.scope: Deactivated successfully.
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.351 187156 INFO nova.compute.manager [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Took 0.75 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.352 187156 DEBUG oslo.service.loopingcall [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.352 187156 DEBUG nova.compute.manager [-] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.352 187156 DEBUG nova.network.neutron [-] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:06:48 np0005539504 podman[226835]: 2025-11-29 07:06:48.373513958 +0000 UTC m=+0.484838769 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:06:48 np0005539504 podman[226877]: 2025-11-29 07:06:48.384181136 +0000 UTC m=+0.051148348 container remove 137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:06:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:48.388 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7bdfc1b5-d6d7-48ef-b163-f356b55d42a0]: (4, ('Sat Nov 29 07:06:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29 (137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7)\n137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7\nSat Nov 29 07:06:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29 (137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7)\n137d4beeebcb6d63262c576a73559799d758dd9a15b0989a91e2050f6faee0e7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:48.390 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a1f31c-4c7a-431f-95c5-1d86615233b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:48.391 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb22db185-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:48 np0005539504 kernel: tapb22db185-b0: left promiscuous mode
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.393 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.403 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:48.407 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[de15055a-0702-48aa-97f0-3d6481567355]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:48.426 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4ad9a0-deb0-4d41-9567-b7f08fb7d516]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:48.427 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9c71b1a5-14e3-4d53-aa44-12823d104fd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:48 np0005539504 podman[226837]: 2025-11-29 07:06:48.43189116 +0000 UTC m=+0.541267539 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:06:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:48.444 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[aed08dbe-21d4-4e87-9034-320e778ae0fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546674, 'reachable_time': 31669, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226919, 'error': None, 'target': 'ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:48.447 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b22db185-b399-445e-82e5-2a4ef0dadc29 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:06:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:06:48.447 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[78f21a63-f4dd-4472-bdec-ee394b88bea5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:48 np0005539504 systemd[1]: run-netns-ovnmeta\x2db22db185\x2db399\x2d445e\x2d82e5\x2d2a4ef0dadc29.mount: Deactivated successfully.
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:48 np0005539504 nova_compute[187152]: 2025-11-29 07:06:48.986 187156 DEBUG nova.network.neutron [-] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.002 187156 INFO nova.compute.manager [-] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Took 0.65 seconds to deallocate network for instance.#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.081 187156 DEBUG oslo_concurrency.lockutils [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.082 187156 DEBUG oslo_concurrency.lockutils [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.160 187156 DEBUG nova.compute.provider_tree [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.176 187156 DEBUG nova.scheduler.client.report [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.206 187156 DEBUG oslo_concurrency.lockutils [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.230 187156 INFO nova.scheduler.client.report [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Deleted allocations for instance 059e8710-61a4-4010-ae59-3b17604e8cd2#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.304 187156 DEBUG oslo_concurrency.lockutils [None req-1cedd066-2e85-4c4b-9d93-4de9302ea2c3 51cb6e189f2c4d608d37ef464a33540e 68b64b8a24d349adb1e6e5d6822784fc - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.332 187156 DEBUG nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Received event network-vif-unplugged-11f546fc-7a81-4975-9415-d27b7f14bd30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.333 187156 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.333 187156 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.333 187156 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.333 187156 DEBUG nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] No waiting events found dispatching network-vif-unplugged-11f546fc-7a81-4975-9415-d27b7f14bd30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.333 187156 WARNING nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Received unexpected event network-vif-unplugged-11f546fc-7a81-4975-9415-d27b7f14bd30 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.333 187156 DEBUG nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Received event network-vif-plugged-11f546fc-7a81-4975-9415-d27b7f14bd30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.334 187156 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.334 187156 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.334 187156 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "059e8710-61a4-4010-ae59-3b17604e8cd2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.334 187156 DEBUG nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] No waiting events found dispatching network-vif-plugged-11f546fc-7a81-4975-9415-d27b7f14bd30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.334 187156 WARNING nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Received unexpected event network-vif-plugged-11f546fc-7a81-4975-9415-d27b7f14bd30 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.334 187156 DEBUG nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Received event network-vif-deleted-11f546fc-7a81-4975-9415-d27b7f14bd30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:49 np0005539504 nova_compute[187152]: 2025-11-29 07:06:49.861 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:50 np0005539504 nova_compute[187152]: 2025-11-29 07:06:50.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:52 np0005539504 nova_compute[187152]: 2025-11-29 07:06:52.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:52 np0005539504 nova_compute[187152]: 2025-11-29 07:06:52.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:53 np0005539504 nova_compute[187152]: 2025-11-29 07:06:53.113 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:54 np0005539504 nova_compute[187152]: 2025-11-29 07:06:54.862 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:54 np0005539504 nova_compute[187152]: 2025-11-29 07:06:54.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:54 np0005539504 nova_compute[187152]: 2025-11-29 07:06:54.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:06:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:06:55Z|00289|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:06:55 np0005539504 nova_compute[187152]: 2025-11-29 07:06:55.347 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:55 np0005539504 nova_compute[187152]: 2025-11-29 07:06:55.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:55 np0005539504 nova_compute[187152]: 2025-11-29 07:06:55.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:06:55 np0005539504 nova_compute[187152]: 2025-11-29 07:06:55.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:06:56 np0005539504 nova_compute[187152]: 2025-11-29 07:06:56.371 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:06:56 np0005539504 nova_compute[187152]: 2025-11-29 07:06:56.372 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:06:56 np0005539504 nova_compute[187152]: 2025-11-29 07:06:56.372 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:06:56 np0005539504 nova_compute[187152]: 2025-11-29 07:06:56.373 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:57 np0005539504 podman[226922]: 2025-11-29 07:06:57.726482348 +0000 UTC m=+0.064532677 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.115 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.554 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.572 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.572 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.573 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.573 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.594 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.595 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.595 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.596 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.673 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.730 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.731 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.797 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.951 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.953 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5507MB free_disk=73.16327667236328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.953 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:06:58 np0005539504 nova_compute[187152]: 2025-11-29 07:06:58.954 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:06:59 np0005539504 nova_compute[187152]: 2025-11-29 07:06:59.090 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 9223f44a-297e-4db1-9f44-ee0694c4e258 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 02:06:59 np0005539504 nova_compute[187152]: 2025-11-29 07:06:59.091 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 02:06:59 np0005539504 nova_compute[187152]: 2025-11-29 07:06:59.091 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 02:06:59 np0005539504 nova_compute[187152]: 2025-11-29 07:06:59.195 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:06:59 np0005539504 nova_compute[187152]: 2025-11-29 07:06:59.210 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:06:59 np0005539504 nova_compute[187152]: 2025-11-29 07:06:59.239 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 02:06:59 np0005539504 nova_compute[187152]: 2025-11-29 07:06:59.239 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:06:59 np0005539504 nova_compute[187152]: 2025-11-29 07:06:59.240 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:06:59 np0005539504 nova_compute[187152]: 2025-11-29 07:06:59.864 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:07:01 np0005539504 podman[226949]: 2025-11-29 07:07:01.731297941 +0000 UTC m=+0.071154565 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:07:02 np0005539504 nova_compute[187152]: 2025-11-29 07:07:02.871 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400007.8703117, 059e8710-61a4-4010-ae59-3b17604e8cd2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:07:02 np0005539504 nova_compute[187152]: 2025-11-29 07:07:02.872 187156 INFO nova.compute.manager [-] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] VM Stopped (Lifecycle Event)
Nov 29 02:07:02 np0005539504 nova_compute[187152]: 2025-11-29 07:07:02.911 187156 DEBUG nova.compute.manager [None req-b247a700-be02-4c54-85d3-054608c4faa3 - - - - - -] [instance: 059e8710-61a4-4010-ae59-3b17604e8cd2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:07:03 np0005539504 nova_compute[187152]: 2025-11-29 07:07:03.118 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:07:03 np0005539504 nova_compute[187152]: 2025-11-29 07:07:03.951 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:07:03 np0005539504 nova_compute[187152]: 2025-11-29 07:07:03.951 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 02:07:04 np0005539504 nova_compute[187152]: 2025-11-29 07:07:04.539 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:07:04 np0005539504 nova_compute[187152]: 2025-11-29 07:07:04.867 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:07:07 np0005539504 nova_compute[187152]: 2025-11-29 07:07:07.816 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:07:08 np0005539504 nova_compute[187152]: 2025-11-29 07:07:08.120 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:07:08 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:08Z|00290|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:07:08 np0005539504 nova_compute[187152]: 2025-11-29 07:07:08.573 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:07:08 np0005539504 nova_compute[187152]: 2025-11-29 07:07:08.965 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:07:08 np0005539504 nova_compute[187152]: 2025-11-29 07:07:08.966 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 02:07:08 np0005539504 nova_compute[187152]: 2025-11-29 07:07:08.983 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 02:07:09 np0005539504 nova_compute[187152]: 2025-11-29 07:07:09.868 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:07:11 np0005539504 podman[226970]: 2025-11-29 07:07:11.733337689 +0000 UTC m=+0.077196529 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.034 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "ec5471fe-7a56-409a-b585-d51f3a74cc38" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.034 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.050 187156 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.164 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.165 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.170 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.170 187156 INFO nova.compute.claims [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Claim successful on node compute-1.ctlplane.example.com
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.373 187156 DEBUG nova.compute.provider_tree [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.618 187156 DEBUG nova.scheduler.client.report [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.660 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.661 187156 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.767 187156 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.768 187156 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.808 187156 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.835 187156 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.990 187156 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.991 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.992 187156 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Creating image(s)
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.993 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "/var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.994 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "/var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:07:12 np0005539504 nova_compute[187152]: 2025-11-29 07:07:12.995 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "/var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.023 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.052 187156 DEBUG nova.policy [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3c9a3fa9f480479d98f522f6f02870fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '477b89fb35da42f69c15b3f01054754a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.108 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.109 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.110 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.126 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.146 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.199 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.201 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.236 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.238 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.238 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.296 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.298 187156 DEBUG nova.virt.disk.api [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Checking if we can resize image /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.298 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.354 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.355 187156 DEBUG nova.virt.disk.api [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Cannot resize image /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 02:07:13 np0005539504 nova_compute[187152]: 2025-11-29 07:07:13.355 187156 DEBUG nova.objects.instance [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lazy-loading 'migration_context' on Instance uuid ec5471fe-7a56-409a-b585-d51f3a74cc38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:07:13 np0005539504 podman[227004]: 2025-11-29 07:07:13.725285029 +0000 UTC m=+0.065535205 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:07:13 np0005539504 podman[227005]: 2025-11-29 07:07:13.750705263 +0000 UTC m=+0.078869644 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., architecture=x86_64)
Nov 29 02:07:14 np0005539504 nova_compute[187152]: 2025-11-29 07:07:14.314 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:07:14 np0005539504 nova_compute[187152]: 2025-11-29 07:07:14.315 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Ensure instance console log exists: /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:07:14 np0005539504 nova_compute[187152]: 2025-11-29 07:07:14.316 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:14 np0005539504 nova_compute[187152]: 2025-11-29 07:07:14.316 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:14 np0005539504 nova_compute[187152]: 2025-11-29 07:07:14.317 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:14 np0005539504 nova_compute[187152]: 2025-11-29 07:07:14.870 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:18 np0005539504 nova_compute[187152]: 2025-11-29 07:07:18.149 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:18 np0005539504 podman[227047]: 2025-11-29 07:07:18.769500994 +0000 UTC m=+0.099943631 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:07:18 np0005539504 podman[227048]: 2025-11-29 07:07:18.797358494 +0000 UTC m=+0.123083364 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:07:19 np0005539504 nova_compute[187152]: 2025-11-29 07:07:19.875 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:19 np0005539504 nova_compute[187152]: 2025-11-29 07:07:19.935 187156 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Successfully created port: 3d930a4f-6014-4f77-a512-d62b043e4358 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:07:21 np0005539504 nova_compute[187152]: 2025-11-29 07:07:21.872 187156 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Successfully updated port: 3d930a4f-6014-4f77-a512-d62b043e4358 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:07:21 np0005539504 nova_compute[187152]: 2025-11-29 07:07:21.908 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "refresh_cache-ec5471fe-7a56-409a-b585-d51f3a74cc38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:07:21 np0005539504 nova_compute[187152]: 2025-11-29 07:07:21.908 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquired lock "refresh_cache-ec5471fe-7a56-409a-b585-d51f3a74cc38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:07:21 np0005539504 nova_compute[187152]: 2025-11-29 07:07:21.909 187156 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:07:22 np0005539504 nova_compute[187152]: 2025-11-29 07:07:22.045 187156 DEBUG nova.compute.manager [req-94d343e2-7fc3-4f1d-aebe-0c6b230d3c6a req-d585fef9-e6ee-4acc-8f4b-bc34b728f614 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Received event network-changed-3d930a4f-6014-4f77-a512-d62b043e4358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:22 np0005539504 nova_compute[187152]: 2025-11-29 07:07:22.046 187156 DEBUG nova.compute.manager [req-94d343e2-7fc3-4f1d-aebe-0c6b230d3c6a req-d585fef9-e6ee-4acc-8f4b-bc34b728f614 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Refreshing instance network info cache due to event network-changed-3d930a4f-6014-4f77-a512-d62b043e4358. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:07:22 np0005539504 nova_compute[187152]: 2025-11-29 07:07:22.046 187156 DEBUG oslo_concurrency.lockutils [req-94d343e2-7fc3-4f1d-aebe-0c6b230d3c6a req-d585fef9-e6ee-4acc-8f4b-bc34b728f614 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-ec5471fe-7a56-409a-b585-d51f3a74cc38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:07:22 np0005539504 nova_compute[187152]: 2025-11-29 07:07:22.187 187156 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:07:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:22.923 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:22.924 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:22.925 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:23 np0005539504 nova_compute[187152]: 2025-11-29 07:07:23.152 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:24 np0005539504 nova_compute[187152]: 2025-11-29 07:07:24.876 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.601 187156 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Updating instance_info_cache with network_info: [{"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:07:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:25Z|00291|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.684 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Releasing lock "refresh_cache-ec5471fe-7a56-409a-b585-d51f3a74cc38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.685 187156 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Instance network_info: |[{"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.685 187156 DEBUG oslo_concurrency.lockutils [req-94d343e2-7fc3-4f1d-aebe-0c6b230d3c6a req-d585fef9-e6ee-4acc-8f4b-bc34b728f614 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-ec5471fe-7a56-409a-b585-d51f3a74cc38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.685 187156 DEBUG nova.network.neutron [req-94d343e2-7fc3-4f1d-aebe-0c6b230d3c6a req-d585fef9-e6ee-4acc-8f4b-bc34b728f614 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Refreshing network info cache for port 3d930a4f-6014-4f77-a512-d62b043e4358 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.688 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Start _get_guest_xml network_info=[{"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.693 187156 WARNING nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.699 187156 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.700 187156 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.704 187156 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.705 187156 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.706 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.707 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.707 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.707 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.707 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.708 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.708 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.708 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.709 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.709 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.709 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.709 187156 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.713 187156 DEBUG nova.virt.libvirt.vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1009215778',display_name='tempest-ListServersNegativeTestJSON-server-1009215778-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1009215778-1',id=81,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='477b89fb35da42f69c15b3f01054754a',ramdisk_id='',reservation_id='r-fs8srmd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-316367608',owner_user_name='tempest-ListServersNegativeTestJSON-316367608-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:12Z,user_data=None,user_id='3c9a3fa9f480479d98f522f6f02870fb',uuid=ec5471fe-7a56-409a-b585-d51f3a74cc38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.714 187156 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converting VIF {"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.715 187156 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:0b:b6,bridge_name='br-int',has_traffic_filtering=True,id=3d930a4f-6014-4f77-a512-d62b043e4358,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d930a4f-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.717 187156 DEBUG nova.objects.instance [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lazy-loading 'pci_devices' on Instance uuid ec5471fe-7a56-409a-b585-d51f3a74cc38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.719 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.731 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <uuid>ec5471fe-7a56-409a-b585-d51f3a74cc38</uuid>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <name>instance-00000051</name>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <nova:name>tempest-ListServersNegativeTestJSON-server-1009215778-1</nova:name>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:07:25</nova:creationTime>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:        <nova:user uuid="3c9a3fa9f480479d98f522f6f02870fb">tempest-ListServersNegativeTestJSON-316367608-project-member</nova:user>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:        <nova:project uuid="477b89fb35da42f69c15b3f01054754a">tempest-ListServersNegativeTestJSON-316367608</nova:project>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:        <nova:port uuid="3d930a4f-6014-4f77-a512-d62b043e4358">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <entry name="serial">ec5471fe-7a56-409a-b585-d51f3a74cc38</entry>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <entry name="uuid">ec5471fe-7a56-409a-b585-d51f3a74cc38</entry>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk.config"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:c8:0b:b6"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <target dev="tap3d930a4f-60"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/console.log" append="off"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:07:25 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:07:25 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:07:25 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:07:25 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.732 187156 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Preparing to wait for external event network-vif-plugged-3d930a4f-6014-4f77-a512-d62b043e4358 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.733 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.734 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.734 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.736 187156 DEBUG nova.virt.libvirt.vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1009215778',display_name='tempest-ListServersNegativeTestJSON-server-1009215778-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1009215778-1',id=81,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='477b89fb35da42f69c15b3f01054754a',ramdisk_id='',reservation_id='r-fs8srmd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-316367608',owner_user_name='tempest-ListServersNegativeTestJSON-316367608-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:12Z,user_data=None,user_id='3c9a3fa9f480479d98f522f6f02870fb',uuid=ec5471fe-7a56-409a-b585-d51f3a74cc38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.736 187156 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converting VIF {"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.737 187156 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:0b:b6,bridge_name='br-int',has_traffic_filtering=True,id=3d930a4f-6014-4f77-a512-d62b043e4358,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d930a4f-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.738 187156 DEBUG os_vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:0b:b6,bridge_name='br-int',has_traffic_filtering=True,id=3d930a4f-6014-4f77-a512-d62b043e4358,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d930a4f-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.739 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.740 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.740 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.744 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.745 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3d930a4f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.745 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3d930a4f-60, col_values=(('external_ids', {'iface-id': '3d930a4f-6014-4f77-a512-d62b043e4358', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:0b:b6', 'vm-uuid': 'ec5471fe-7a56-409a-b585-d51f3a74cc38'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.780 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:25 np0005539504 NetworkManager[55210]: <info>  [1764400045.7820] manager: (tap3d930a4f-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.784 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.789 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.791 187156 INFO os_vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:0b:b6,bridge_name='br-int',has_traffic_filtering=True,id=3d930a4f-6014-4f77-a512-d62b043e4358,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d930a4f-60')#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.852 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.853 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.853 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] No VIF found with MAC fa:16:3e:c8:0b:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:07:25 np0005539504 nova_compute[187152]: 2025-11-29 07:07:25.854 187156 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Using config drive#033[00m
Nov 29 02:07:26 np0005539504 nova_compute[187152]: 2025-11-29 07:07:26.471 187156 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Creating config drive at /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk.config#033[00m
Nov 29 02:07:26 np0005539504 nova_compute[187152]: 2025-11-29 07:07:26.478 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg5n03dke execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:26 np0005539504 nova_compute[187152]: 2025-11-29 07:07:26.617 187156 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg5n03dke" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:26 np0005539504 kernel: tap3d930a4f-60: entered promiscuous mode
Nov 29 02:07:26 np0005539504 NetworkManager[55210]: <info>  [1764400046.7143] manager: (tap3d930a4f-60): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Nov 29 02:07:26 np0005539504 nova_compute[187152]: 2025-11-29 07:07:26.715 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:26Z|00292|binding|INFO|Claiming lport 3d930a4f-6014-4f77-a512-d62b043e4358 for this chassis.
Nov 29 02:07:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:26Z|00293|binding|INFO|3d930a4f-6014-4f77-a512-d62b043e4358: Claiming fa:16:3e:c8:0b:b6 10.100.0.7
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.726 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:0b:b6 10.100.0.7'], port_security=['fa:16:3e:c8:0b:b6 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ec5471fe-7a56-409a-b585-d51f3a74cc38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '477b89fb35da42f69c15b3f01054754a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9d87673c-3e8a-46ed-9956-50ea661306ab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114ba21f-c978-4c05-97f9-429aa66017b7, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=3d930a4f-6014-4f77-a512-d62b043e4358) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:26Z|00294|binding|INFO|Setting lport 3d930a4f-6014-4f77-a512-d62b043e4358 ovn-installed in OVS
Nov 29 02:07:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:26Z|00295|binding|INFO|Setting lport 3d930a4f-6014-4f77-a512-d62b043e4358 up in Southbound
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.727 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 3d930a4f-6014-4f77-a512-d62b043e4358 in datapath c8a3c675-42f5-48a4-83d7-2d39dd3304b9 bound to our chassis#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.729 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c8a3c675-42f5-48a4-83d7-2d39dd3304b9#033[00m
Nov 29 02:07:26 np0005539504 nova_compute[187152]: 2025-11-29 07:07:26.729 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:26 np0005539504 nova_compute[187152]: 2025-11-29 07:07:26.733 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.745 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[01402547-1c62-4ef2-b6ab-673d2bcb484a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.747 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc8a3c675-41 in ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.751 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc8a3c675-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.751 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6265c6ba-4544-46ec-a165-f59f580e8ce1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.752 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8d085fef-8fe5-4a6d-b09d-45b51a47fef4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 systemd-udevd[227114]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.767 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[10747591-e605-4e27-8079-03fa983df137]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 systemd-machined[153423]: New machine qemu-43-instance-00000051.
Nov 29 02:07:26 np0005539504 NetworkManager[55210]: <info>  [1764400046.7783] device (tap3d930a4f-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:07:26 np0005539504 NetworkManager[55210]: <info>  [1764400046.7794] device (tap3d930a4f-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.784 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f93b77a2-1a8a-42c3-a871-4d8f2e4f22c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 systemd[1]: Started Virtual Machine qemu-43-instance-00000051.
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.826 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ef2ee4d1-c577-4c8e-a603-0ba485b0c172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 NetworkManager[55210]: <info>  [1764400046.8361] manager: (tapc8a3c675-40): new Veth device (/org/freedesktop/NetworkManager/Devices/142)
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.835 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7c53a701-c9f7-4636-97e3-7f389ce4455b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 systemd-udevd[227119]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.875 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[28031e33-c971-445c-a5ca-a47055b51eeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.880 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[d6fdd08b-65d4-44e5-acb0-a63226e7ee8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 NetworkManager[55210]: <info>  [1764400046.9103] device (tapc8a3c675-40): carrier: link connected
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.919 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe80034-2139-440c-8323-d101a7e36b62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.936 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e9236728-975c-457f-b434-b719d14caad4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8a3c675-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:f6:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550978, 'reachable_time': 38111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227147, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.956 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ae6325fe-ca0b-4e8f-9996-2b5de3b059a0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe06:f663'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550978, 'tstamp': 550978}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227148, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:26.982 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3044dc98-2be0-4219-8408-5ebbeb9e0270]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8a3c675-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:f6:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550978, 'reachable_time': 38111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227149, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.022 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b6461d88-1b5f-45e3-b7e2-0fcc5dc2c881]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.084 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f52ffe-b4db-4cd3-a9eb-0177197ffdd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.086 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8a3c675-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.087 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.087 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8a3c675-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:27 np0005539504 kernel: tapc8a3c675-40: entered promiscuous mode
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.089 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:27 np0005539504 NetworkManager[55210]: <info>  [1764400047.0901] manager: (tapc8a3c675-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.093 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc8a3c675-40, col_values=(('external_ids', {'iface-id': '2a5ced08-2785-4bf9-8fa1-c89240d15794'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:27 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:27Z|00296|binding|INFO|Releasing lport 2a5ced08-2785-4bf9-8fa1-c89240d15794 from this chassis (sb_readonly=0)
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.095 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.109 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.111 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.113 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e806f982-659a-4457-aa96-a70aa7571e1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.114 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-c8a3c675-42f5-48a4-83d7-2d39dd3304b9
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.pid.haproxy
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID c8a3c675-42f5-48a4-83d7-2d39dd3304b9
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:07:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:27.114 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'env', 'PROCESS_TAG=haproxy-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.142 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400047.1419165, ec5471fe-7a56-409a-b585-d51f3a74cc38 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.143 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] VM Started (Lifecycle Event)#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.438 187156 DEBUG nova.compute.manager [req-c213de95-5975-45cf-9cfb-8f18ea803c67 req-291efcc0-d9dd-427b-8325-7c85658ffa97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Received event network-vif-plugged-3d930a4f-6014-4f77-a512-d62b043e4358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.439 187156 DEBUG oslo_concurrency.lockutils [req-c213de95-5975-45cf-9cfb-8f18ea803c67 req-291efcc0-d9dd-427b-8325-7c85658ffa97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.439 187156 DEBUG oslo_concurrency.lockutils [req-c213de95-5975-45cf-9cfb-8f18ea803c67 req-291efcc0-d9dd-427b-8325-7c85658ffa97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.439 187156 DEBUG oslo_concurrency.lockutils [req-c213de95-5975-45cf-9cfb-8f18ea803c67 req-291efcc0-d9dd-427b-8325-7c85658ffa97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.439 187156 DEBUG nova.compute.manager [req-c213de95-5975-45cf-9cfb-8f18ea803c67 req-291efcc0-d9dd-427b-8325-7c85658ffa97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Processing event network-vif-plugged-3d930a4f-6014-4f77-a512-d62b043e4358 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.440 187156 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.446 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.451 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.452 187156 INFO nova.virt.libvirt.driver [-] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Instance spawned successfully.#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.453 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.458 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.556 187156 DEBUG nova.network.neutron [req-94d343e2-7fc3-4f1d-aebe-0c6b230d3c6a req-d585fef9-e6ee-4acc-8f4b-bc34b728f614 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Updated VIF entry in instance network info cache for port 3d930a4f-6014-4f77-a512-d62b043e4358. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:07:27 np0005539504 nova_compute[187152]: 2025-11-29 07:07:27.557 187156 DEBUG nova.network.neutron [req-94d343e2-7fc3-4f1d-aebe-0c6b230d3c6a req-d585fef9-e6ee-4acc-8f4b-bc34b728f614 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Updating instance_info_cache with network_info: [{"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:07:27 np0005539504 podman[227185]: 2025-11-29 07:07:27.531364256 +0000 UTC m=+0.031063777 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:07:28 np0005539504 podman[227185]: 2025-11-29 07:07:28.399905831 +0000 UTC m=+0.899605352 container create 5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.421 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.422 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400047.1420972, ec5471fe-7a56-409a-b585-d51f3a74cc38 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.422 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.424 187156 DEBUG oslo_concurrency.lockutils [req-94d343e2-7fc3-4f1d-aebe-0c6b230d3c6a req-d585fef9-e6ee-4acc-8f4b-bc34b728f614 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-ec5471fe-7a56-409a-b585-d51f3a74cc38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.427 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.427 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.427 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.428 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.428 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.429 187156 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:28 np0005539504 systemd[1]: Started libpod-conmon-5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3.scope.
Nov 29 02:07:28 np0005539504 podman[227198]: 2025-11-29 07:07:28.60534893 +0000 UTC m=+0.157310574 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:07:28 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:07:28 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/193d65551f00112425f1f41b31a3e4222b7902db7968862e5a8f69d76ef60824/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.673 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.680 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400047.4446673, ec5471fe-7a56-409a-b585-d51f3a74cc38 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.681 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:07:28 np0005539504 podman[227185]: 2025-11-29 07:07:28.692691092 +0000 UTC m=+1.192390613 container init 5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 29 02:07:28 np0005539504 podman[227185]: 2025-11-29 07:07:28.698817796 +0000 UTC m=+1.198517297 container start 5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:07:28 np0005539504 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227217]: [NOTICE]   (227225) : New worker (227227) forked
Nov 29 02:07:28 np0005539504 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227217]: [NOTICE]   (227225) : Loading success.
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.801 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.804 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.956 187156 INFO nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Took 15.97 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.956 187156 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:28 np0005539504 nova_compute[187152]: 2025-11-29 07:07:28.990 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:07:29 np0005539504 nova_compute[187152]: 2025-11-29 07:07:29.803 187156 DEBUG nova.compute.manager [req-a65162ff-2a89-45ed-a226-4291de3732ca req-ddb406cb-a43c-4f3d-bb5c-03ad82f41a92 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Received event network-vif-plugged-3d930a4f-6014-4f77-a512-d62b043e4358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:29 np0005539504 nova_compute[187152]: 2025-11-29 07:07:29.803 187156 DEBUG oslo_concurrency.lockutils [req-a65162ff-2a89-45ed-a226-4291de3732ca req-ddb406cb-a43c-4f3d-bb5c-03ad82f41a92 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:29 np0005539504 nova_compute[187152]: 2025-11-29 07:07:29.804 187156 DEBUG oslo_concurrency.lockutils [req-a65162ff-2a89-45ed-a226-4291de3732ca req-ddb406cb-a43c-4f3d-bb5c-03ad82f41a92 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:29 np0005539504 nova_compute[187152]: 2025-11-29 07:07:29.805 187156 DEBUG oslo_concurrency.lockutils [req-a65162ff-2a89-45ed-a226-4291de3732ca req-ddb406cb-a43c-4f3d-bb5c-03ad82f41a92 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:29 np0005539504 nova_compute[187152]: 2025-11-29 07:07:29.805 187156 DEBUG nova.compute.manager [req-a65162ff-2a89-45ed-a226-4291de3732ca req-ddb406cb-a43c-4f3d-bb5c-03ad82f41a92 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] No waiting events found dispatching network-vif-plugged-3d930a4f-6014-4f77-a512-d62b043e4358 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:29 np0005539504 nova_compute[187152]: 2025-11-29 07:07:29.805 187156 WARNING nova.compute.manager [req-a65162ff-2a89-45ed-a226-4291de3732ca req-ddb406cb-a43c-4f3d-bb5c-03ad82f41a92 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Received unexpected event network-vif-plugged-3d930a4f-6014-4f77-a512-d62b043e4358 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:07:29 np0005539504 nova_compute[187152]: 2025-11-29 07:07:29.879 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:30 np0005539504 nova_compute[187152]: 2025-11-29 07:07:30.781 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:31 np0005539504 nova_compute[187152]: 2025-11-29 07:07:31.103 187156 INFO nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Took 18.99 seconds to build instance.#033[00m
Nov 29 02:07:31 np0005539504 nova_compute[187152]: 2025-11-29 07:07:31.707 187156 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:32 np0005539504 podman[227236]: 2025-11-29 07:07:32.743639526 +0000 UTC m=+0.076216893 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.493 187156 DEBUG oslo_concurrency.lockutils [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "ec5471fe-7a56-409a-b585-d51f3a74cc38" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.494 187156 DEBUG oslo_concurrency.lockutils [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.494 187156 DEBUG oslo_concurrency.lockutils [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.494 187156 DEBUG oslo_concurrency.lockutils [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.495 187156 DEBUG oslo_concurrency.lockutils [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.591 187156 INFO nova.compute.manager [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Terminating instance#033[00m
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.623 187156 DEBUG nova.compute.manager [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:07:33 np0005539504 kernel: tap3d930a4f-60 (unregistering): left promiscuous mode
Nov 29 02:07:33 np0005539504 NetworkManager[55210]: <info>  [1764400053.6491] device (tap3d930a4f-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:07:33 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:33Z|00297|binding|INFO|Releasing lport 3d930a4f-6014-4f77-a512-d62b043e4358 from this chassis (sb_readonly=0)
Nov 29 02:07:33 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:33Z|00298|binding|INFO|Setting lport 3d930a4f-6014-4f77-a512-d62b043e4358 down in Southbound
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.659 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:33 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:33Z|00299|binding|INFO|Removing iface tap3d930a4f-60 ovn-installed in OVS
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.663 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.674 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:33 np0005539504 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000051.scope: Deactivated successfully.
Nov 29 02:07:33 np0005539504 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000051.scope: Consumed 6.620s CPU time.
Nov 29 02:07:33 np0005539504 systemd-machined[153423]: Machine qemu-43-instance-00000051 terminated.
Nov 29 02:07:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:33.741 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:0b:b6 10.100.0.7'], port_security=['fa:16:3e:c8:0b:b6 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ec5471fe-7a56-409a-b585-d51f3a74cc38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '477b89fb35da42f69c15b3f01054754a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9d87673c-3e8a-46ed-9956-50ea661306ab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114ba21f-c978-4c05-97f9-429aa66017b7, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=3d930a4f-6014-4f77-a512-d62b043e4358) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:33.743 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 3d930a4f-6014-4f77-a512-d62b043e4358 in datapath c8a3c675-42f5-48a4-83d7-2d39dd3304b9 unbound from our chassis#033[00m
Nov 29 02:07:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:33.745 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8a3c675-42f5-48a4-83d7-2d39dd3304b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:07:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:33.746 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[587ce957-9abf-4ae7-9102-affdbe02ec50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:33.747 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 namespace which is not needed anymore#033[00m
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.894 187156 INFO nova.virt.libvirt.driver [-] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Instance destroyed successfully.#033[00m
Nov 29 02:07:33 np0005539504 nova_compute[187152]: 2025-11-29 07:07:33.894 187156 DEBUG nova.objects.instance [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lazy-loading 'resources' on Instance uuid ec5471fe-7a56-409a-b585-d51f3a74cc38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.129 187156 DEBUG nova.virt.libvirt.vif [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1009215778',display_name='tempest-ListServersNegativeTestJSON-server-1009215778-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1009215778-1',id=81,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:07:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='477b89fb35da42f69c15b3f01054754a',ramdisk_id='',reservation_id='r-fs8srmd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-316367608',owner_user_name='tempest-ListServersNegativeTestJSON-316367608-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:07:29Z,user_data=None,user_id='3c9a3fa9f480479d98f522f6f02870fb',uuid=ec5471fe-7a56-409a-b585-d51f3a74cc38,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.130 187156 DEBUG nova.network.os_vif_util [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converting VIF {"id": "3d930a4f-6014-4f77-a512-d62b043e4358", "address": "fa:16:3e:c8:0b:b6", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3d930a4f-60", "ovs_interfaceid": "3d930a4f-6014-4f77-a512-d62b043e4358", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.130 187156 DEBUG nova.network.os_vif_util [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:0b:b6,bridge_name='br-int',has_traffic_filtering=True,id=3d930a4f-6014-4f77-a512-d62b043e4358,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d930a4f-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.130 187156 DEBUG os_vif [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:0b:b6,bridge_name='br-int',has_traffic_filtering=True,id=3d930a4f-6014-4f77-a512-d62b043e4358,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d930a4f-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:07:34 np0005539504 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227217]: [NOTICE]   (227225) : haproxy version is 2.8.14-c23fe91
Nov 29 02:07:34 np0005539504 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227217]: [NOTICE]   (227225) : path to executable is /usr/sbin/haproxy
Nov 29 02:07:34 np0005539504 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227217]: [WARNING]  (227225) : Exiting Master process...
Nov 29 02:07:34 np0005539504 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227217]: [WARNING]  (227225) : Exiting Master process...
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.133 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.134 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3d930a4f-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:34 np0005539504 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227217]: [ALERT]    (227225) : Current worker (227227) exited with code 143 (Terminated)
Nov 29 02:07:34 np0005539504 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227217]: [WARNING]  (227225) : All workers exited. Exiting... (0)
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.136 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:34 np0005539504 systemd[1]: libpod-5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3.scope: Deactivated successfully.
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.140 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.143 187156 INFO os_vif [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:0b:b6,bridge_name='br-int',has_traffic_filtering=True,id=3d930a4f-6014-4f77-a512-d62b043e4358,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3d930a4f-60')#033[00m
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.143 187156 INFO nova.virt.libvirt.driver [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Deleting instance files /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38_del#033[00m
Nov 29 02:07:34 np0005539504 podman[227281]: 2025-11-29 07:07:34.144134688 +0000 UTC m=+0.306169431 container died 5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.145 187156 INFO nova.virt.libvirt.driver [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Deletion of /var/lib/nova/instances/ec5471fe-7a56-409a-b585-d51f3a74cc38_del complete#033[00m
Nov 29 02:07:34 np0005539504 nova_compute[187152]: 2025-11-29 07:07:34.943 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:34 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3-userdata-shm.mount: Deactivated successfully.
Nov 29 02:07:34 np0005539504 systemd[1]: var-lib-containers-storage-overlay-193d65551f00112425f1f41b31a3e4222b7902db7968862e5a8f69d76ef60824-merged.mount: Deactivated successfully.
Nov 29 02:07:36 np0005539504 podman[227281]: 2025-11-29 07:07:36.403539367 +0000 UTC m=+2.565574080 container cleanup 5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:07:36 np0005539504 systemd[1]: libpod-conmon-5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3.scope: Deactivated successfully.
Nov 29 02:07:37 np0005539504 nova_compute[187152]: 2025-11-29 07:07:37.146 187156 DEBUG nova.compute.manager [req-8d02d542-6cf2-4bf6-848b-74d6287b72e5 req-1e839269-0645-40bf-8593-bf1821c97abe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Received event network-vif-unplugged-3d930a4f-6014-4f77-a512-d62b043e4358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:37 np0005539504 nova_compute[187152]: 2025-11-29 07:07:37.147 187156 DEBUG oslo_concurrency.lockutils [req-8d02d542-6cf2-4bf6-848b-74d6287b72e5 req-1e839269-0645-40bf-8593-bf1821c97abe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:37 np0005539504 nova_compute[187152]: 2025-11-29 07:07:37.148 187156 DEBUG oslo_concurrency.lockutils [req-8d02d542-6cf2-4bf6-848b-74d6287b72e5 req-1e839269-0645-40bf-8593-bf1821c97abe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:37 np0005539504 nova_compute[187152]: 2025-11-29 07:07:37.148 187156 DEBUG oslo_concurrency.lockutils [req-8d02d542-6cf2-4bf6-848b-74d6287b72e5 req-1e839269-0645-40bf-8593-bf1821c97abe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:37 np0005539504 nova_compute[187152]: 2025-11-29 07:07:37.149 187156 DEBUG nova.compute.manager [req-8d02d542-6cf2-4bf6-848b-74d6287b72e5 req-1e839269-0645-40bf-8593-bf1821c97abe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] No waiting events found dispatching network-vif-unplugged-3d930a4f-6014-4f77-a512-d62b043e4358 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:37 np0005539504 nova_compute[187152]: 2025-11-29 07:07:37.149 187156 DEBUG nova.compute.manager [req-8d02d542-6cf2-4bf6-848b-74d6287b72e5 req-1e839269-0645-40bf-8593-bf1821c97abe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Received event network-vif-unplugged-3d930a4f-6014-4f77-a512-d62b043e4358 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:07:38 np0005539504 podman[227329]: 2025-11-29 07:07:38.153420052 +0000 UTC m=+1.714517134 container remove 5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:07:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:38.162 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[93222871-fae8-4cf3-8aa8-a0963d7383a0]: (4, ('Sat Nov 29 07:07:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 (5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3)\n5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3\nSat Nov 29 07:07:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 (5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3)\n5581e552b85506e8d3abdb2a1f9fe8dea7c794f37d8486e0b4f39197a5fa26a3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:38.164 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f354cd1b-9c8d-43cc-b66e-b86c8726ccc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:38.165 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8a3c675-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:38 np0005539504 nova_compute[187152]: 2025-11-29 07:07:38.217 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:38 np0005539504 kernel: tapc8a3c675-40: left promiscuous mode
Nov 29 02:07:38 np0005539504 nova_compute[187152]: 2025-11-29 07:07:38.230 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:38.234 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[386fadf3-8f06-45f3-b2ed-f5dc72e52ff1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:38.248 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[17e1de3f-1ee7-403c-8467-bbae111fe23a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:38.250 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f1155e05-4cd0-4ae6-a821-c302684e152d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:38.268 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3f30c469-c4c4-4c0c-a3b1-6d200aad2b0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550969, 'reachable_time': 20273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227345, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:38 np0005539504 systemd[1]: run-netns-ovnmeta\x2dc8a3c675\x2d42f5\x2d48a4\x2d83d7\x2d2d39dd3304b9.mount: Deactivated successfully.
Nov 29 02:07:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:38.272 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:07:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:38.273 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a5a92d-8bde-4669-8f4e-9e52c92b5577]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:39 np0005539504 nova_compute[187152]: 2025-11-29 07:07:39.137 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:39 np0005539504 nova_compute[187152]: 2025-11-29 07:07:39.946 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:40 np0005539504 nova_compute[187152]: 2025-11-29 07:07:40.086 187156 INFO nova.compute.manager [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Took 6.46 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:07:40 np0005539504 nova_compute[187152]: 2025-11-29 07:07:40.087 187156 DEBUG oslo.service.loopingcall [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:07:40 np0005539504 nova_compute[187152]: 2025-11-29 07:07:40.088 187156 DEBUG nova.compute.manager [-] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:07:40 np0005539504 nova_compute[187152]: 2025-11-29 07:07:40.088 187156 DEBUG nova.network.neutron [-] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:07:42 np0005539504 podman[227350]: 2025-11-29 07:07:42.737101473 +0000 UTC m=+0.067226210 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:07:44 np0005539504 nova_compute[187152]: 2025-11-29 07:07:44.140 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:44 np0005539504 podman[227369]: 2025-11-29 07:07:44.759271446 +0000 UTC m=+0.084111094 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:07:44 np0005539504 podman[227370]: 2025-11-29 07:07:44.761949998 +0000 UTC m=+0.079031027 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Nov 29 02:07:44 np0005539504 nova_compute[187152]: 2025-11-29 07:07:44.948 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:45 np0005539504 nova_compute[187152]: 2025-11-29 07:07:45.654 187156 DEBUG nova.compute.manager [req-13fc2951-b1b8-4cad-8ac7-2c5793b3983e req-35fd106f-7643-42e2-8c44-37a1f6665bbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Received event network-vif-plugged-3d930a4f-6014-4f77-a512-d62b043e4358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:45 np0005539504 nova_compute[187152]: 2025-11-29 07:07:45.655 187156 DEBUG oslo_concurrency.lockutils [req-13fc2951-b1b8-4cad-8ac7-2c5793b3983e req-35fd106f-7643-42e2-8c44-37a1f6665bbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:45 np0005539504 nova_compute[187152]: 2025-11-29 07:07:45.655 187156 DEBUG oslo_concurrency.lockutils [req-13fc2951-b1b8-4cad-8ac7-2c5793b3983e req-35fd106f-7643-42e2-8c44-37a1f6665bbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:45 np0005539504 nova_compute[187152]: 2025-11-29 07:07:45.655 187156 DEBUG oslo_concurrency.lockutils [req-13fc2951-b1b8-4cad-8ac7-2c5793b3983e req-35fd106f-7643-42e2-8c44-37a1f6665bbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:45 np0005539504 nova_compute[187152]: 2025-11-29 07:07:45.656 187156 DEBUG nova.compute.manager [req-13fc2951-b1b8-4cad-8ac7-2c5793b3983e req-35fd106f-7643-42e2-8c44-37a1f6665bbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] No waiting events found dispatching network-vif-plugged-3d930a4f-6014-4f77-a512-d62b043e4358 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:45 np0005539504 nova_compute[187152]: 2025-11-29 07:07:45.656 187156 WARNING nova.compute.manager [req-13fc2951-b1b8-4cad-8ac7-2c5793b3983e req-35fd106f-7643-42e2-8c44-37a1f6665bbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Received unexpected event network-vif-plugged-3d930a4f-6014-4f77-a512-d62b043e4358 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:07:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:45.682 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:45 np0005539504 nova_compute[187152]: 2025-11-29 07:07:45.683 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:45.684 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:07:46 np0005539504 nova_compute[187152]: 2025-11-29 07:07:46.955 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:46 np0005539504 nova_compute[187152]: 2025-11-29 07:07:46.975 187156 DEBUG nova.network.neutron [-] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:07:46 np0005539504 nova_compute[187152]: 2025-11-29 07:07:46.998 187156 INFO nova.compute.manager [-] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Took 6.91 seconds to deallocate network for instance.#033[00m
Nov 29 02:07:47 np0005539504 nova_compute[187152]: 2025-11-29 07:07:47.085 187156 DEBUG nova.compute.manager [req-20fc3e8e-c07a-475b-8b56-df639e65b6cf req-6592b66b-2570-462f-92e4-3ad97379963d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Received event network-vif-deleted-3d930a4f-6014-4f77-a512-d62b043e4358 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:47 np0005539504 nova_compute[187152]: 2025-11-29 07:07:47.087 187156 DEBUG oslo_concurrency.lockutils [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:47 np0005539504 nova_compute[187152]: 2025-11-29 07:07:47.087 187156 DEBUG oslo_concurrency.lockutils [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:47 np0005539504 nova_compute[187152]: 2025-11-29 07:07:47.162 187156 DEBUG nova.compute.provider_tree [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:07:47 np0005539504 nova_compute[187152]: 2025-11-29 07:07:47.179 187156 DEBUG nova.scheduler.client.report [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:07:47 np0005539504 nova_compute[187152]: 2025-11-29 07:07:47.203 187156 DEBUG oslo_concurrency.lockutils [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:47 np0005539504 nova_compute[187152]: 2025-11-29 07:07:47.240 187156 INFO nova.scheduler.client.report [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Deleted allocations for instance ec5471fe-7a56-409a-b585-d51f3a74cc38#033[00m
Nov 29 02:07:47 np0005539504 nova_compute[187152]: 2025-11-29 07:07:47.330 187156 DEBUG oslo_concurrency.lockutils [None req-8c4691ce-c731-4ad3-92c4-88e700e5bf42 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "ec5471fe-7a56-409a-b585-d51f3a74cc38" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 13.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:48 np0005539504 nova_compute[187152]: 2025-11-29 07:07:48.893 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400053.8919222, ec5471fe-7a56-409a-b585-d51f3a74cc38 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:48 np0005539504 nova_compute[187152]: 2025-11-29 07:07:48.894 187156 INFO nova.compute.manager [-] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:07:48 np0005539504 nova_compute[187152]: 2025-11-29 07:07:48.920 187156 DEBUG nova.compute.manager [None req-9314fff9-b8fd-4335-83e8-48cf5731b618 - - - - - -] [instance: ec5471fe-7a56-409a-b585-d51f3a74cc38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:48 np0005539504 nova_compute[187152]: 2025-11-29 07:07:48.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:49 np0005539504 nova_compute[187152]: 2025-11-29 07:07:49.143 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:49 np0005539504 podman[227414]: 2025-11-29 07:07:49.714356115 +0000 UTC m=+0.057411976 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:07:49 np0005539504 podman[227415]: 2025-11-29 07:07:49.832481395 +0000 UTC m=+0.173170462 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 29 02:07:49 np0005539504 nova_compute[187152]: 2025-11-29 07:07:49.952 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:51 np0005539504 nova_compute[187152]: 2025-11-29 07:07:51.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:52 np0005539504 nova_compute[187152]: 2025-11-29 07:07:52.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:52 np0005539504 nova_compute[187152]: 2025-11-29 07:07:52.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:54 np0005539504 nova_compute[187152]: 2025-11-29 07:07:54.146 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:54 np0005539504 nova_compute[187152]: 2025-11-29 07:07:54.953 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:07:55.686 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:55 np0005539504 nova_compute[187152]: 2025-11-29 07:07:55.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:55 np0005539504 nova_compute[187152]: 2025-11-29 07:07:55.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:07:57 np0005539504 ovn_controller[95182]: 2025-11-29T07:07:57Z|00300|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:07:57 np0005539504 nova_compute[187152]: 2025-11-29 07:07:57.103 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:57 np0005539504 nova_compute[187152]: 2025-11-29 07:07:57.895 187156 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:07:57 np0005539504 nova_compute[187152]: 2025-11-29 07:07:57.896 187156 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:07:57 np0005539504 nova_compute[187152]: 2025-11-29 07:07:57.896 187156 DEBUG nova.network.neutron [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:07:57 np0005539504 nova_compute[187152]: 2025-11-29 07:07:57.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:57 np0005539504 nova_compute[187152]: 2025-11-29 07:07:57.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:07:57 np0005539504 nova_compute[187152]: 2025-11-29 07:07:57.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:07:57 np0005539504 nova_compute[187152]: 2025-11-29 07:07:57.965 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:07:58 np0005539504 podman[227466]: 2025-11-29 07:07:58.71047126 +0000 UTC m=+0.059135154 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:07:59 np0005539504 nova_compute[187152]: 2025-11-29 07:07:59.148 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:59 np0005539504 nova_compute[187152]: 2025-11-29 07:07:59.954 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:02 np0005539504 nova_compute[187152]: 2025-11-29 07:08:02.222 187156 DEBUG nova.network.neutron [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:02 np0005539504 nova_compute[187152]: 2025-11-29 07:08:02.288 187156 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:02 np0005539504 nova_compute[187152]: 2025-11-29 07:08:02.291 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:08:02 np0005539504 nova_compute[187152]: 2025-11-29 07:08:02.291 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:08:02 np0005539504 nova_compute[187152]: 2025-11-29 07:08:02.291 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:03 np0005539504 nova_compute[187152]: 2025-11-29 07:08:03.127 187156 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 29 02:08:03 np0005539504 nova_compute[187152]: 2025-11-29 07:08:03.128 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Creating file /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/ee4ea43da31a416db9f5c459d8eb9a59.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 29 02:08:03 np0005539504 nova_compute[187152]: 2025-11-29 07:08:03.128 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/ee4ea43da31a416db9f5c459d8eb9a59.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:03 np0005539504 nova_compute[187152]: 2025-11-29 07:08:03.603 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/ee4ea43da31a416db9f5c459d8eb9a59.tmp" returned: 1 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:03 np0005539504 nova_compute[187152]: 2025-11-29 07:08:03.604 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/ee4ea43da31a416db9f5c459d8eb9a59.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 02:08:03 np0005539504 nova_compute[187152]: 2025-11-29 07:08:03.604 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Creating directory /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 29 02:08:03 np0005539504 nova_compute[187152]: 2025-11-29 07:08:03.605 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:03 np0005539504 podman[227488]: 2025-11-29 07:08:03.751332908 +0000 UTC m=+0.073050326 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:08:03 np0005539504 nova_compute[187152]: 2025-11-29 07:08:03.837 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258" returned: 0 in 0.232s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:03 np0005539504 nova_compute[187152]: 2025-11-29 07:08:03.845 187156 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:08:04 np0005539504 nova_compute[187152]: 2025-11-29 07:08:04.180 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:04 np0005539504 nova_compute[187152]: 2025-11-29 07:08:04.956 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:06 np0005539504 kernel: tapb7078e73-f0 (unregistering): left promiscuous mode
Nov 29 02:08:06 np0005539504 NetworkManager[55210]: <info>  [1764400086.0319] device (tapb7078e73-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:08:06 np0005539504 ovn_controller[95182]: 2025-11-29T07:08:06Z|00301|binding|INFO|Releasing lport b7078e73-f0e3-441a-843e-8920e38aec30 from this chassis (sb_readonly=0)
Nov 29 02:08:06 np0005539504 ovn_controller[95182]: 2025-11-29T07:08:06Z|00302|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 down in Southbound
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.039 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:06 np0005539504 ovn_controller[95182]: 2025-11-29T07:08:06Z|00303|binding|INFO|Removing iface tapb7078e73-f0 ovn-installed in OVS
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.054 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.056 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.057 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.057 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.059 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2c10636d-f979-4d4f-aaa2-ff6e6d7cfd85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.060 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:08:06 np0005539504 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000042.scope: Deactivated successfully.
Nov 29 02:08:06 np0005539504 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000042.scope: Consumed 18.352s CPU time.
Nov 29 02:08:06 np0005539504 systemd-machined[153423]: Machine qemu-41-instance-00000042 terminated.
Nov 29 02:08:06 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226507]: [NOTICE]   (226511) : haproxy version is 2.8.14-c23fe91
Nov 29 02:08:06 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226507]: [NOTICE]   (226511) : path to executable is /usr/sbin/haproxy
Nov 29 02:08:06 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226507]: [WARNING]  (226511) : Exiting Master process...
Nov 29 02:08:06 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226507]: [WARNING]  (226511) : Exiting Master process...
Nov 29 02:08:06 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226507]: [ALERT]    (226511) : Current worker (226513) exited with code 143 (Terminated)
Nov 29 02:08:06 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226507]: [WARNING]  (226511) : All workers exited. Exiting... (0)
Nov 29 02:08:06 np0005539504 systemd[1]: libpod-32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695.scope: Deactivated successfully.
Nov 29 02:08:06 np0005539504 podman[227533]: 2025-11-29 07:08:06.212191938 +0000 UTC m=+0.049725429 container died 32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:08:06 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695-userdata-shm.mount: Deactivated successfully.
Nov 29 02:08:06 np0005539504 systemd[1]: var-lib-containers-storage-overlay-bc842b2649017d48f38e2a182ca538b2723b110f4a460a6ef353185e7fa6e04f-merged.mount: Deactivated successfully.
Nov 29 02:08:06 np0005539504 podman[227533]: 2025-11-29 07:08:06.253948892 +0000 UTC m=+0.091482383 container cleanup 32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:08:06 np0005539504 systemd[1]: libpod-conmon-32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695.scope: Deactivated successfully.
Nov 29 02:08:06 np0005539504 podman[227563]: 2025-11-29 07:08:06.345698411 +0000 UTC m=+0.064476626 container remove 32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.351 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3b5eb8-e91b-40e1-bada-08c573fd3e7e]: (4, ('Sat Nov 29 07:08:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695)\n32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695\nSat Nov 29 07:08:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695)\n32d1993912baa3b52cfe3f94d64e4edc4291cca6df9ec775da3f27e08f7a3695\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.352 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8da18126-9ea3-4cac-8c1d-83e9e1c65674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.354 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.395 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:06 np0005539504 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.413 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.416 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bc2bb226-f5ce-4548-bf46-47555c0f7f55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.429 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a199304e-81ef-436e-8cc2-44c1c249d59e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.430 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[16a72e69-6f41-4ef3-9418-4c7d0d544a54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.448 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc5b730-7b35-4545-b3e6-452fc422ce44]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 544405, 'reachable_time': 21198, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227596, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.450 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:08:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:06.450 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[ca2a068c-4aae-4ac3-b580-e0bd56bdab11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:06 np0005539504 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.816 187156 DEBUG nova.compute.manager [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.816 187156 DEBUG oslo_concurrency.lockutils [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.817 187156 DEBUG oslo_concurrency.lockutils [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.817 187156 DEBUG oslo_concurrency.lockutils [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.817 187156 DEBUG nova.compute.manager [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.817 187156 WARNING nova.compute.manager [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.865 187156 INFO nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.873 187156 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance destroyed successfully.#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.874 187156 DEBUG nova.virt.libvirt.vif [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:07:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:1e:a3:23"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.875 187156 DEBUG nova.network.os_vif_util [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:1e:a3:23"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.876 187156 DEBUG nova.network.os_vif_util [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.877 187156 DEBUG os_vif [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.878 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.879 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7078e73-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.880 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.881 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.886 187156 INFO os_vif [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.891 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.964 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:06 np0005539504 nova_compute[187152]: 2025-11-29 07:08:06.966 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.026 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.028 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Copying file /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_resize/disk to 192.168.122.102:/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.028 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_resize/disk 192.168.122.102:/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.549 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "scp -r /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_resize/disk 192.168.122.102:/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.550 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Copying file /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.550 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_resize/disk.config 192.168.122.102:/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.683 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.720 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.720 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.721 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.721 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.747 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.748 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.748 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.749 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.772 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "scp -C -r /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_resize/disk.config 192.168.122.102:/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.773 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Copying file /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.773 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_resize/disk.info 192.168.122.102:/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.868 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000042, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk#033[00m
Nov 29 02:08:07 np0005539504 nova_compute[187152]: 2025-11-29 07:08:07.981 187156 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "scp -C -r /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_resize/disk.info 192.168.122.102:/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.info" returned: 0 in 0.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.055 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.057 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5739MB free_disk=73.16339492797852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.057 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.057 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.129 187156 INFO nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating resource usage from migration 2f05530a-5042-477c-94a7-4c27a7f0ae7b#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.175 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Migration 2f05530a-5042-477c-94a7-4c27a7f0ae7b is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.176 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.176 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.249 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.265 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.296 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.297 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:08 np0005539504 nova_compute[187152]: 2025-11-29 07:08:08.569 187156 DEBUG neutronclient.v2_0.client [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b7078e73-f0e3-441a-843e-8920e38aec30 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.292 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.820 187156 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.820 187156 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.821 187156 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.958 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.964 187156 DEBUG nova.compute.manager [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.964 187156 DEBUG oslo_concurrency.lockutils [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.964 187156 DEBUG oslo_concurrency.lockutils [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.965 187156 DEBUG oslo_concurrency.lockutils [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.965 187156 DEBUG nova.compute.manager [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:09 np0005539504 nova_compute[187152]: 2025-11-29 07:08:09.965 187156 WARNING nova.compute.manager [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:08:11 np0005539504 nova_compute[187152]: 2025-11-29 07:08:11.920 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:12 np0005539504 nova_compute[187152]: 2025-11-29 07:08:12.016 187156 DEBUG nova.compute.manager [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-changed-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:12 np0005539504 nova_compute[187152]: 2025-11-29 07:08:12.017 187156 DEBUG nova.compute.manager [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Refreshing instance network info cache due to event network-changed-b7078e73-f0e3-441a-843e-8920e38aec30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:08:12 np0005539504 nova_compute[187152]: 2025-11-29 07:08:12.017 187156 DEBUG oslo_concurrency.lockutils [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:08:12 np0005539504 nova_compute[187152]: 2025-11-29 07:08:12.017 187156 DEBUG oslo_concurrency.lockutils [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:08:12 np0005539504 nova_compute[187152]: 2025-11-29 07:08:12.017 187156 DEBUG nova.network.neutron [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Refreshing network info cache for port b7078e73-f0e3-441a-843e-8920e38aec30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:08:13 np0005539504 podman[227610]: 2025-11-29 07:08:13.709412484 +0000 UTC m=+0.052819083 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:08:13 np0005539504 nova_compute[187152]: 2025-11-29 07:08:13.809 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:14 np0005539504 nova_compute[187152]: 2025-11-29 07:08:14.959 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:15 np0005539504 nova_compute[187152]: 2025-11-29 07:08:15.293 187156 DEBUG nova.network.neutron [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updated VIF entry in instance network info cache for port b7078e73-f0e3-441a-843e-8920e38aec30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:08:15 np0005539504 nova_compute[187152]: 2025-11-29 07:08:15.293 187156 DEBUG nova.network.neutron [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:15 np0005539504 nova_compute[187152]: 2025-11-29 07:08:15.510 187156 DEBUG oslo_concurrency.lockutils [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:15 np0005539504 podman[227629]: 2025-11-29 07:08:15.707748785 +0000 UTC m=+0.048699291 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:08:15 np0005539504 podman[227630]: 2025-11-29 07:08:15.729984944 +0000 UTC m=+0.063397757 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:08:16 np0005539504 nova_compute[187152]: 2025-11-29 07:08:16.922 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:18 np0005539504 nova_compute[187152]: 2025-11-29 07:08:18.300 187156 DEBUG nova.compute.manager [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:18 np0005539504 nova_compute[187152]: 2025-11-29 07:08:18.300 187156 DEBUG oslo_concurrency.lockutils [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:18 np0005539504 nova_compute[187152]: 2025-11-29 07:08:18.300 187156 DEBUG oslo_concurrency.lockutils [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:18 np0005539504 nova_compute[187152]: 2025-11-29 07:08:18.300 187156 DEBUG oslo_concurrency.lockutils [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:18 np0005539504 nova_compute[187152]: 2025-11-29 07:08:18.300 187156 DEBUG nova.compute.manager [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:18 np0005539504 nova_compute[187152]: 2025-11-29 07:08:18.301 187156 WARNING nova.compute.manager [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:08:19 np0005539504 nova_compute[187152]: 2025-11-29 07:08:19.961 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:20 np0005539504 nova_compute[187152]: 2025-11-29 07:08:20.558 187156 DEBUG nova.compute.manager [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:20 np0005539504 nova_compute[187152]: 2025-11-29 07:08:20.558 187156 DEBUG oslo_concurrency.lockutils [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:20 np0005539504 nova_compute[187152]: 2025-11-29 07:08:20.558 187156 DEBUG oslo_concurrency.lockutils [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:20 np0005539504 nova_compute[187152]: 2025-11-29 07:08:20.559 187156 DEBUG oslo_concurrency.lockutils [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:20 np0005539504 nova_compute[187152]: 2025-11-29 07:08:20.559 187156 DEBUG nova.compute.manager [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:20 np0005539504 nova_compute[187152]: 2025-11-29 07:08:20.559 187156 WARNING nova.compute.manager [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:08:20 np0005539504 podman[227672]: 2025-11-29 07:08:20.721254897 +0000 UTC m=+0.065042321 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:08:20 np0005539504 nova_compute[187152]: 2025-11-29 07:08:20.773 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:20 np0005539504 podman[227673]: 2025-11-29 07:08:20.779045662 +0000 UTC m=+0.119609209 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 02:08:21 np0005539504 nova_compute[187152]: 2025-11-29 07:08:21.329 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400086.3281767, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:08:21 np0005539504 nova_compute[187152]: 2025-11-29 07:08:21.330 187156 INFO nova.compute.manager [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:08:21 np0005539504 nova_compute[187152]: 2025-11-29 07:08:21.434 187156 DEBUG nova.compute.manager [None req-fa366e83-9b0f-457c-84bb-755c4898af85 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:21 np0005539504 nova_compute[187152]: 2025-11-29 07:08:21.440 187156 DEBUG nova.compute.manager [None req-fa366e83-9b0f-457c-84bb-755c4898af85 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:08:21 np0005539504 nova_compute[187152]: 2025-11-29 07:08:21.562 187156 INFO nova.compute.manager [None req-fa366e83-9b0f-457c-84bb-755c4898af85 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 29 02:08:21 np0005539504 nova_compute[187152]: 2025-11-29 07:08:21.925 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:21 np0005539504 nova_compute[187152]: 2025-11-29 07:08:21.942 187156 DEBUG oslo_concurrency.lockutils [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:21 np0005539504 nova_compute[187152]: 2025-11-29 07:08:21.943 187156 DEBUG oslo_concurrency.lockutils [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:21 np0005539504 nova_compute[187152]: 2025-11-29 07:08:21.943 187156 DEBUG nova.compute.manager [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Going to confirm migration 12 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 29 02:08:22 np0005539504 nova_compute[187152]: 2025-11-29 07:08:22.084 187156 DEBUG nova.objects.instance [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'info_cache' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:22.924 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:22.925 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:22.925 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:23 np0005539504 nova_compute[187152]: 2025-11-29 07:08:23.367 187156 DEBUG neutronclient.v2_0.client [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b7078e73-f0e3-441a-843e-8920e38aec30 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:08:23 np0005539504 nova_compute[187152]: 2025-11-29 07:08:23.369 187156 DEBUG oslo_concurrency.lockutils [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:08:23 np0005539504 nova_compute[187152]: 2025-11-29 07:08:23.369 187156 DEBUG oslo_concurrency.lockutils [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:08:23 np0005539504 nova_compute[187152]: 2025-11-29 07:08:23.370 187156 DEBUG nova.network.neutron [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:08:24 np0005539504 nova_compute[187152]: 2025-11-29 07:08:24.962 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:26 np0005539504 nova_compute[187152]: 2025-11-29 07:08:26.927 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.048 187156 DEBUG nova.network.neutron [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.248 187156 DEBUG oslo_concurrency.lockutils [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.249 187156 DEBUG nova.objects.instance [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.293 187156 DEBUG nova.virt.libvirt.vif [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:08:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:08:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.294 187156 DEBUG nova.network.os_vif_util [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.295 187156 DEBUG nova.network.os_vif_util [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.296 187156 DEBUG os_vif [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.300 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.301 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7078e73-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.301 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.307 187156 INFO os_vif [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.307 187156 DEBUG oslo_concurrency.lockutils [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.308 187156 DEBUG oslo_concurrency.lockutils [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.462 187156 DEBUG nova.compute.provider_tree [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.523 187156 DEBUG nova.scheduler.client.report [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:08:27 np0005539504 nova_compute[187152]: 2025-11-29 07:08:27.656 187156 DEBUG oslo_concurrency.lockutils [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:28 np0005539504 nova_compute[187152]: 2025-11-29 07:08:28.126 187156 INFO nova.scheduler.client.report [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Deleted allocation for migration 2f05530a-5042-477c-94a7-4c27a7f0ae7b#033[00m
Nov 29 02:08:28 np0005539504 nova_compute[187152]: 2025-11-29 07:08:28.322 187156 DEBUG oslo_concurrency.lockutils [None req-63cb4673-41e6-4ff9-833e-5647b696a724 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.379s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:29 np0005539504 podman[227718]: 2025-11-29 07:08:29.719315455 +0000 UTC m=+0.066403828 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:08:29 np0005539504 nova_compute[187152]: 2025-11-29 07:08:29.964 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:31 np0005539504 nova_compute[187152]: 2025-11-29 07:08:31.929 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:34 np0005539504 podman[227736]: 2025-11-29 07:08:34.71545819 +0000 UTC m=+0.055802253 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:08:34 np0005539504 nova_compute[187152]: 2025-11-29 07:08:34.966 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:35 np0005539504 nova_compute[187152]: 2025-11-29 07:08:35.965 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:36 np0005539504 nova_compute[187152]: 2025-11-29 07:08:36.931 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:37 np0005539504 nova_compute[187152]: 2025-11-29 07:08:37.833 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539504 nova_compute[187152]: 2025-11-29 07:08:39.968 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:41 np0005539504 nova_compute[187152]: 2025-11-29 07:08:41.933 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:44 np0005539504 podman[227755]: 2025-11-29 07:08:44.722081003 +0000 UTC m=+0.059190634 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:08:44 np0005539504 nova_compute[187152]: 2025-11-29 07:08:44.971 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:46 np0005539504 podman[227774]: 2025-11-29 07:08:46.738332937 +0000 UTC m=+0.083218611 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:08:46 np0005539504 podman[227775]: 2025-11-29 07:08:46.738336827 +0000 UTC m=+0.079834690 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Nov 29 02:08:46 np0005539504 nova_compute[187152]: 2025-11-29 07:08:46.935 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:46 np0005539504 nova_compute[187152]: 2025-11-29 07:08:46.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.963 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.963 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.963 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.963 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:08:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:08:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:48.893 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:08:48 np0005539504 nova_compute[187152]: 2025-11-29 07:08:48.893 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:48.895 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.430 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.502 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.503 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.692 187156 DEBUG nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.856 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.856 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.867 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.868 187156 INFO nova.compute.claims [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.972 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:49 np0005539504 nova_compute[187152]: 2025-11-29 07:08:49.990 187156 DEBUG nova.scheduler.client.report [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.011 187156 DEBUG nova.scheduler.client.report [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.012 187156 DEBUG nova.compute.provider_tree [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.027 187156 DEBUG nova.scheduler.client.report [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.054 187156 DEBUG nova.scheduler.client.report [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.110 187156 DEBUG nova.compute.provider_tree [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.126 187156 DEBUG nova.scheduler.client.report [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.159 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.159 187156 DEBUG nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.228 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.256 187156 DEBUG nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.256 187156 DEBUG nova.network.neutron [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.281 187156 INFO nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.304 187156 DEBUG nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.667 187156 DEBUG nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.668 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.668 187156 INFO nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Creating image(s)#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.669 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.669 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.670 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.682 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.747 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.748 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.749 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.762 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.788 187156 DEBUG nova.policy [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.841 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.842 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.886 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.887 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.888 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.949 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.949 187156 DEBUG nova.virt.disk.api [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:08:50 np0005539504 nova_compute[187152]: 2025-11-29 07:08:50.950 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:51 np0005539504 nova_compute[187152]: 2025-11-29 07:08:51.008 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:51 np0005539504 nova_compute[187152]: 2025-11-29 07:08:51.009 187156 DEBUG nova.virt.disk.api [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:08:51 np0005539504 nova_compute[187152]: 2025-11-29 07:08:51.009 187156 DEBUG nova.objects.instance [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:51 np0005539504 nova_compute[187152]: 2025-11-29 07:08:51.030 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:08:51 np0005539504 nova_compute[187152]: 2025-11-29 07:08:51.030 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Ensure instance console log exists: /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:08:51 np0005539504 nova_compute[187152]: 2025-11-29 07:08:51.031 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:51 np0005539504 nova_compute[187152]: 2025-11-29 07:08:51.031 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:51 np0005539504 nova_compute[187152]: 2025-11-29 07:08:51.031 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:51 np0005539504 podman[227831]: 2025-11-29 07:08:51.705161891 +0000 UTC m=+0.050335076 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:08:51 np0005539504 podman[227832]: 2025-11-29 07:08:51.758463585 +0000 UTC m=+0.103676971 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:08:51 np0005539504 nova_compute[187152]: 2025-11-29 07:08:51.938 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:53 np0005539504 nova_compute[187152]: 2025-11-29 07:08:53.576 187156 DEBUG nova.network.neutron [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Successfully created port: 18ad87ad-fee6-484b-81da-6889ed2a9af1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:08:53 np0005539504 nova_compute[187152]: 2025-11-29 07:08:53.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:53 np0005539504 nova_compute[187152]: 2025-11-29 07:08:53.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:54 np0005539504 nova_compute[187152]: 2025-11-29 07:08:54.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:54 np0005539504 nova_compute[187152]: 2025-11-29 07:08:54.974 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:55 np0005539504 nova_compute[187152]: 2025-11-29 07:08:55.929 187156 DEBUG nova.network.neutron [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Successfully updated port: 18ad87ad-fee6-484b-81da-6889ed2a9af1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:08:55 np0005539504 nova_compute[187152]: 2025-11-29 07:08:55.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:55 np0005539504 nova_compute[187152]: 2025-11-29 07:08:55.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.077 187156 DEBUG nova.compute.manager [req-b38e1bf5-1c7d-49aa-8403-0d2b3f2b2224 req-49438330-27e3-4ebd-a8d9-aae15e2f7c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.078 187156 DEBUG nova.compute.manager [req-b38e1bf5-1c7d-49aa-8403-0d2b3f2b2224 req-49438330-27e3-4ebd-a8d9-aae15e2f7c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing instance network info cache due to event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.078 187156 DEBUG oslo_concurrency.lockutils [req-b38e1bf5-1c7d-49aa-8403-0d2b3f2b2224 req-49438330-27e3-4ebd-a8d9-aae15e2f7c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.079 187156 DEBUG oslo_concurrency.lockutils [req-b38e1bf5-1c7d-49aa-8403-0d2b3f2b2224 req-49438330-27e3-4ebd-a8d9-aae15e2f7c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.079 187156 DEBUG nova.network.neutron [req-b38e1bf5-1c7d-49aa-8403-0d2b3f2b2224 req-49438330-27e3-4ebd-a8d9-aae15e2f7c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.135 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.258 187156 DEBUG nova.network.neutron [req-b38e1bf5-1c7d-49aa-8403-0d2b3f2b2224 req-49438330-27e3-4ebd-a8d9-aae15e2f7c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.707 187156 DEBUG nova.network.neutron [req-b38e1bf5-1c7d-49aa-8403-0d2b3f2b2224 req-49438330-27e3-4ebd-a8d9-aae15e2f7c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.731 187156 DEBUG oslo_concurrency.lockutils [req-b38e1bf5-1c7d-49aa-8403-0d2b3f2b2224 req-49438330-27e3-4ebd-a8d9-aae15e2f7c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.731 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.732 187156 DEBUG nova.network.neutron [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.932 187156 DEBUG nova.network.neutron [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:08:56 np0005539504 nova_compute[187152]: 2025-11-29 07:08:56.939 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:08:57.897 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:57 np0005539504 nova_compute[187152]: 2025-11-29 07:08:57.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:57 np0005539504 nova_compute[187152]: 2025-11-29 07:08:57.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:08:57 np0005539504 nova_compute[187152]: 2025-11-29 07:08:57.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:08:57 np0005539504 nova_compute[187152]: 2025-11-29 07:08:57.970 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:08:57 np0005539504 nova_compute[187152]: 2025-11-29 07:08:57.970 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.201 187156 DEBUG nova.network.neutron [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.245 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.246 187156 DEBUG nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance network_info: |[{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.249 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Start _get_guest_xml network_info=[{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.254 187156 WARNING nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.259 187156 DEBUG nova.virt.libvirt.host [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.259 187156 DEBUG nova.virt.libvirt.host [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.263 187156 DEBUG nova.virt.libvirt.host [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.264 187156 DEBUG nova.virt.libvirt.host [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.265 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.265 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.265 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.265 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.266 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.266 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.266 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.266 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.267 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.267 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.267 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.267 187156 DEBUG nova.virt.hardware [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.271 187156 DEBUG nova.virt.libvirt.vif [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1325280827',display_name='tempest-ServerActionsTestJSON-server-1325280827',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1325280827',id=87,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-4ixfsrgy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=23cc8968-d9b9-42dc-b458-0683a72a0194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.271 187156 DEBUG nova.network.os_vif_util [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.272 187156 DEBUG nova.network.os_vif_util [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.272 187156 DEBUG nova.objects.instance [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.307 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <uuid>23cc8968-d9b9-42dc-b458-0683a72a0194</uuid>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <name>instance-00000057</name>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestJSON-server-1325280827</nova:name>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:08:58</nova:creationTime>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:        <nova:port uuid="18ad87ad-fee6-484b-81da-6889ed2a9af1">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <entry name="serial">23cc8968-d9b9-42dc-b458-0683a72a0194</entry>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <entry name="uuid">23cc8968-d9b9-42dc-b458-0683a72a0194</entry>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:77:06:41"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <target dev="tap18ad87ad-fe"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/console.log" append="off"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:08:58 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:08:58 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:08:58 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:08:58 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.309 187156 DEBUG nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Preparing to wait for external event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.310 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.310 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.310 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.311 187156 DEBUG nova.virt.libvirt.vif [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1325280827',display_name='tempest-ServerActionsTestJSON-server-1325280827',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1325280827',id=87,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-4ixfsrgy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=23cc8968-d9b9-42dc-b458-0683a72a0194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.311 187156 DEBUG nova.network.os_vif_util [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.312 187156 DEBUG nova.network.os_vif_util [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.312 187156 DEBUG os_vif [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.313 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.314 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.314 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.319 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.319 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18ad87ad-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.320 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap18ad87ad-fe, col_values=(('external_ids', {'iface-id': '18ad87ad-fee6-484b-81da-6889ed2a9af1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:06:41', 'vm-uuid': '23cc8968-d9b9-42dc-b458-0683a72a0194'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.322 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.323 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:58 np0005539504 NetworkManager[55210]: <info>  [1764400138.3243] manager: (tap18ad87ad-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.324 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.330 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.333 187156 INFO os_vif [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe')#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.452 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.452 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.453 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No VIF found with MAC fa:16:3e:77:06:41, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:08:58 np0005539504 nova_compute[187152]: 2025-11-29 07:08:58.454 187156 INFO nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Using config drive#033[00m
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:08:59.999 187156 INFO nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Creating config drive at /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config#033[00m
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:09:00.009 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_8y7bgpu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:09:00.029 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:09:00.140 187156 DEBUG oslo_concurrency.processutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_8y7bgpu" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:00 np0005539504 NetworkManager[55210]: <info>  [1764400140.2188] manager: (tap18ad87ad-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Nov 29 02:09:00 np0005539504 kernel: tap18ad87ad-fe: entered promiscuous mode
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:09:00.221 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:00 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:00Z|00304|binding|INFO|Claiming lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 for this chassis.
Nov 29 02:09:00 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:00Z|00305|binding|INFO|18ad87ad-fee6-484b-81da-6889ed2a9af1: Claiming fa:16:3e:77:06:41 10.100.0.10
Nov 29 02:09:00 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:00Z|00306|binding|INFO|Setting lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 ovn-installed in OVS
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:09:00.234 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:09:00.235 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:09:00.248 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:00 np0005539504 systemd-udevd[227920]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:09:00 np0005539504 systemd-machined[153423]: New machine qemu-44-instance-00000057.
Nov 29 02:09:00 np0005539504 NetworkManager[55210]: <info>  [1764400140.2750] device (tap18ad87ad-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:09:00 np0005539504 NetworkManager[55210]: <info>  [1764400140.2761] device (tap18ad87ad-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:09:00 np0005539504 systemd[1]: Started Virtual Machine qemu-44-instance-00000057.
Nov 29 02:09:00 np0005539504 podman[227895]: 2025-11-29 07:09:00.279195258 +0000 UTC m=+0.062966686 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:09:00 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:00Z|00307|binding|INFO|Setting lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 up in Southbound
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.404 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:06:41 10.100.0.10'], port_security=['fa:16:3e:77:06:41 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '23cc8968-d9b9-42dc-b458-0683a72a0194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=18ad87ad-fee6-484b-81da-6889ed2a9af1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.407 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 18ad87ad-fee6-484b-81da-6889ed2a9af1 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.410 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.426 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0c823094-59eb-4351-a2b9-9c24a28b06a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.427 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.430 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.430 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[73caf703-a884-48bf-a648-657b0f1ff041]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.432 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fd406051-5055-49ab-8053-3f8192d3c199]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.446 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[395a16ee-f69f-4607-977f-c0c741e25162]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.474 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[38b0f1dc-4137-48e1-ad60-e43be3545278]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.511 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ea807f98-9984-4991-9b4a-ba2709e9a457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.517 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[036fae9c-aff4-42cd-8ef3-51bfc1377c44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 NetworkManager[55210]: <info>  [1764400140.5189] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/146)
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.560 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[3537a667-ec85-4907-9f9c-5dd2070016d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.564 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f81493-e05e-40f5-bbd1-aeffe93b59f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 NetworkManager[55210]: <info>  [1764400140.5890] device (tap9226dea3-60): carrier: link connected
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.600 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[97097267-d860-4411-8dd9-12657f993c9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.618 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e1bcf0-e855-4be8-bf0f-6198a970ce1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560346, 'reachable_time': 20657, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227957, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.632 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[281b2bc5-99a5-4e01-a231-c44cc408d8d8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560346, 'tstamp': 560346}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227958, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.647 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a42d96-17e5-4c30-91d4-7d7efd262420]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560346, 'reachable_time': 20657, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227959, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.682 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9d8286-4643-4fbd-8984-f04f5e4e1eed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.753 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[55baf546-0ef8-455d-a3d1-4840bb4448fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.755 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.755 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.755 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:00 np0005539504 NetworkManager[55210]: <info>  [1764400140.7583] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Nov 29 02:09:00 np0005539504 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.763 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:00 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:00Z|00308|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:09:00.776 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.777 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.779 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c6afd558-4b3e-4821-941a-eeee3a0874ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.780 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:09:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:00.781 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:09:00 np0005539504 nova_compute[187152]: 2025-11-29 07:09:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.195 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400141.195078, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.196 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Started (Lifecycle Event)#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.201 187156 DEBUG nova.compute.manager [req-39ca8789-d27c-4051-b377-39240a3a3285 req-034f589c-0ccb-4827-bb43-abd880b16d3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.203 187156 DEBUG oslo_concurrency.lockutils [req-39ca8789-d27c-4051-b377-39240a3a3285 req-034f589c-0ccb-4827-bb43-abd880b16d3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.203 187156 DEBUG oslo_concurrency.lockutils [req-39ca8789-d27c-4051-b377-39240a3a3285 req-034f589c-0ccb-4827-bb43-abd880b16d3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.204 187156 DEBUG oslo_concurrency.lockutils [req-39ca8789-d27c-4051-b377-39240a3a3285 req-034f589c-0ccb-4827-bb43-abd880b16d3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.204 187156 DEBUG nova.compute.manager [req-39ca8789-d27c-4051-b377-39240a3a3285 req-034f589c-0ccb-4827-bb43-abd880b16d3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Processing event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.206 187156 DEBUG nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:09:01 np0005539504 podman[227998]: 2025-11-29 07:09:01.207401199 +0000 UTC m=+0.066155242 container create 02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.212 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.217 187156 INFO nova.virt.libvirt.driver [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance spawned successfully.#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.218 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:09:01 np0005539504 systemd[1]: Started libpod-conmon-02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50.scope.
Nov 29 02:09:01 np0005539504 podman[227998]: 2025-11-29 07:09:01.173321721 +0000 UTC m=+0.032075784 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.269 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.276 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.281 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.282 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.282 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.283 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:01 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.283 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.283 187156 DEBUG nova.virt.libvirt.driver [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:01 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/051504b44567ee63e8fed7865c5246e1ec4183b5536d3864af7b1219f13d46c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:09:01 np0005539504 podman[227998]: 2025-11-29 07:09:01.31746017 +0000 UTC m=+0.176214243 container init 02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:09:01 np0005539504 podman[227998]: 2025-11-29 07:09:01.322686571 +0000 UTC m=+0.181440614 container start 02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.329 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.329 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400141.1953135, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.329 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:09:01 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228014]: [NOTICE]   (228018) : New worker (228020) forked
Nov 29 02:09:01 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228014]: [NOTICE]   (228018) : Loading success.
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.395 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.401 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400141.2095761, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.401 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.468 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.472 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.477 187156 INFO nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Took 10.81 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.477 187156 DEBUG nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.510 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.593 187156 INFO nova.compute.manager [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Took 11.78 seconds to build instance.#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.638 187156 DEBUG oslo_concurrency.lockutils [None req-c88908e0-8712-455f-9c37-8c030af1900e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.984 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.985 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.985 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:01 np0005539504 nova_compute[187152]: 2025-11-29 07:09:01.986 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.066 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.128 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.129 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.189 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.352 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.353 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5631MB free_disk=73.1922721862793GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.353 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.354 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.435 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 23cc8968-d9b9-42dc-b458-0683a72a0194 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.435 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.436 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.508 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.536 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.568 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:09:02 np0005539504 nova_compute[187152]: 2025-11-29 07:09:02.568 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:03 np0005539504 nova_compute[187152]: 2025-11-29 07:09:03.323 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:04 np0005539504 nova_compute[187152]: 2025-11-29 07:09:04.658 187156 DEBUG nova.compute.manager [req-22df4dd2-c0ce-4954-b73e-fb56432d6141 req-5bf20d4a-0ff1-48d9-a786-14590e9877ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:04 np0005539504 nova_compute[187152]: 2025-11-29 07:09:04.658 187156 DEBUG oslo_concurrency.lockutils [req-22df4dd2-c0ce-4954-b73e-fb56432d6141 req-5bf20d4a-0ff1-48d9-a786-14590e9877ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:04 np0005539504 nova_compute[187152]: 2025-11-29 07:09:04.659 187156 DEBUG oslo_concurrency.lockutils [req-22df4dd2-c0ce-4954-b73e-fb56432d6141 req-5bf20d4a-0ff1-48d9-a786-14590e9877ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:04 np0005539504 nova_compute[187152]: 2025-11-29 07:09:04.659 187156 DEBUG oslo_concurrency.lockutils [req-22df4dd2-c0ce-4954-b73e-fb56432d6141 req-5bf20d4a-0ff1-48d9-a786-14590e9877ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:04 np0005539504 nova_compute[187152]: 2025-11-29 07:09:04.659 187156 DEBUG nova.compute.manager [req-22df4dd2-c0ce-4954-b73e-fb56432d6141 req-5bf20d4a-0ff1-48d9-a786-14590e9877ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:04 np0005539504 nova_compute[187152]: 2025-11-29 07:09:04.659 187156 WARNING nova.compute.manager [req-22df4dd2-c0ce-4954-b73e-fb56432d6141 req-5bf20d4a-0ff1-48d9-a786-14590e9877ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:09:05 np0005539504 nova_compute[187152]: 2025-11-29 07:09:05.000 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:05 np0005539504 podman[228036]: 2025-11-29 07:09:05.754072315 +0000 UTC m=+0.084890206 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:09:08 np0005539504 nova_compute[187152]: 2025-11-29 07:09:08.326 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:08 np0005539504 nova_compute[187152]: 2025-11-29 07:09:08.633 187156 DEBUG nova.compute.manager [req-9a96daef-d6bb-431b-879c-cd4e24c1c429 req-6d6dcd00-5fcc-4c35-8e70-98598e7856f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:08 np0005539504 nova_compute[187152]: 2025-11-29 07:09:08.634 187156 DEBUG nova.compute.manager [req-9a96daef-d6bb-431b-879c-cd4e24c1c429 req-6d6dcd00-5fcc-4c35-8e70-98598e7856f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing instance network info cache due to event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:09:08 np0005539504 nova_compute[187152]: 2025-11-29 07:09:08.634 187156 DEBUG oslo_concurrency.lockutils [req-9a96daef-d6bb-431b-879c-cd4e24c1c429 req-6d6dcd00-5fcc-4c35-8e70-98598e7856f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:08 np0005539504 nova_compute[187152]: 2025-11-29 07:09:08.635 187156 DEBUG oslo_concurrency.lockutils [req-9a96daef-d6bb-431b-879c-cd4e24c1c429 req-6d6dcd00-5fcc-4c35-8e70-98598e7856f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:08 np0005539504 nova_compute[187152]: 2025-11-29 07:09:08.635 187156 DEBUG nova.network.neutron [req-9a96daef-d6bb-431b-879c-cd4e24c1c429 req-6d6dcd00-5fcc-4c35-8e70-98598e7856f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:09:10 np0005539504 nova_compute[187152]: 2025-11-29 07:09:10.000 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:11 np0005539504 nova_compute[187152]: 2025-11-29 07:09:11.124 187156 DEBUG nova.network.neutron [req-9a96daef-d6bb-431b-879c-cd4e24c1c429 req-6d6dcd00-5fcc-4c35-8e70-98598e7856f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updated VIF entry in instance network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:09:11 np0005539504 nova_compute[187152]: 2025-11-29 07:09:11.124 187156 DEBUG nova.network.neutron [req-9a96daef-d6bb-431b-879c-cd4e24c1c429 req-6d6dcd00-5fcc-4c35-8e70-98598e7856f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:11 np0005539504 nova_compute[187152]: 2025-11-29 07:09:11.170 187156 DEBUG oslo_concurrency.lockutils [req-9a96daef-d6bb-431b-879c-cd4e24c1c429 req-6d6dcd00-5fcc-4c35-8e70-98598e7856f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:13 np0005539504 nova_compute[187152]: 2025-11-29 07:09:13.328 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:15 np0005539504 nova_compute[187152]: 2025-11-29 07:09:15.003 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:15Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:77:06:41 10.100.0.10
Nov 29 02:09:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:15Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:06:41 10.100.0.10
Nov 29 02:09:15 np0005539504 podman[228073]: 2025-11-29 07:09:15.758551279 +0000 UTC m=+0.079816219 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 29 02:09:16 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:16Z|00309|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:09:16 np0005539504 nova_compute[187152]: 2025-11-29 07:09:16.920 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:17 np0005539504 podman[228092]: 2025-11-29 07:09:17.707890893 +0000 UTC m=+0.055291829 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:09:17 np0005539504 podman[228093]: 2025-11-29 07:09:17.717270785 +0000 UTC m=+0.061308301 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, name=ubi9-minimal)
Nov 29 02:09:18 np0005539504 nova_compute[187152]: 2025-11-29 07:09:18.332 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:20 np0005539504 nova_compute[187152]: 2025-11-29 07:09:20.005 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:22 np0005539504 podman[228135]: 2025-11-29 07:09:22.741422204 +0000 UTC m=+0.080802876 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:09:22 np0005539504 podman[228136]: 2025-11-29 07:09:22.749978674 +0000 UTC m=+0.086118759 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:09:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:22.926 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:22.927 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:22.928 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:23 np0005539504 nova_compute[187152]: 2025-11-29 07:09:23.334 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:25 np0005539504 nova_compute[187152]: 2025-11-29 07:09:25.008 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:26Z|00310|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:09:26 np0005539504 nova_compute[187152]: 2025-11-29 07:09:26.603 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:28 np0005539504 nova_compute[187152]: 2025-11-29 07:09:28.336 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:30 np0005539504 nova_compute[187152]: 2025-11-29 07:09:30.011 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:30 np0005539504 podman[228183]: 2025-11-29 07:09:30.722265546 +0000 UTC m=+0.064955580 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:09:33 np0005539504 nova_compute[187152]: 2025-11-29 07:09:33.338 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:34 np0005539504 nova_compute[187152]: 2025-11-29 07:09:34.411 187156 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:34 np0005539504 nova_compute[187152]: 2025-11-29 07:09:34.412 187156 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:34 np0005539504 nova_compute[187152]: 2025-11-29 07:09:34.412 187156 DEBUG nova.network.neutron [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:09:35 np0005539504 nova_compute[187152]: 2025-11-29 07:09:35.013 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:36 np0005539504 nova_compute[187152]: 2025-11-29 07:09:36.500 187156 DEBUG nova.network.neutron [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:36 np0005539504 nova_compute[187152]: 2025-11-29 07:09:36.501 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:36 np0005539504 nova_compute[187152]: 2025-11-29 07:09:36.519 187156 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:36 np0005539504 nova_compute[187152]: 2025-11-29 07:09:36.696 187156 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 29 02:09:36 np0005539504 nova_compute[187152]: 2025-11-29 07:09:36.696 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Creating file /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/a781446522bb4c0ca89eb31ebe3f7e93.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 29 02:09:36 np0005539504 nova_compute[187152]: 2025-11-29 07:09:36.697 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/a781446522bb4c0ca89eb31ebe3f7e93.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:36 np0005539504 podman[228204]: 2025-11-29 07:09:36.738822812 +0000 UTC m=+0.079961913 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:09:37 np0005539504 nova_compute[187152]: 2025-11-29 07:09:37.110 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/a781446522bb4c0ca89eb31ebe3f7e93.tmp" returned: 1 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:37 np0005539504 nova_compute[187152]: 2025-11-29 07:09:37.111 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/a781446522bb4c0ca89eb31ebe3f7e93.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 02:09:37 np0005539504 nova_compute[187152]: 2025-11-29 07:09:37.112 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Creating directory /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 29 02:09:37 np0005539504 nova_compute[187152]: 2025-11-29 07:09:37.113 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:37 np0005539504 nova_compute[187152]: 2025-11-29 07:09:37.317 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:37 np0005539504 nova_compute[187152]: 2025-11-29 07:09:37.322 187156 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:09:38 np0005539504 nova_compute[187152]: 2025-11-29 07:09:38.341 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:39 np0005539504 kernel: tap18ad87ad-fe (unregistering): left promiscuous mode
Nov 29 02:09:39 np0005539504 NetworkManager[55210]: <info>  [1764400179.5105] device (tap18ad87ad-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:09:39 np0005539504 nova_compute[187152]: 2025-11-29 07:09:39.518 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:39Z|00311|binding|INFO|Releasing lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 from this chassis (sb_readonly=0)
Nov 29 02:09:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:39Z|00312|binding|INFO|Setting lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 down in Southbound
Nov 29 02:09:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:09:39Z|00313|binding|INFO|Removing iface tap18ad87ad-fe ovn-installed in OVS
Nov 29 02:09:39 np0005539504 nova_compute[187152]: 2025-11-29 07:09:39.521 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:39 np0005539504 nova_compute[187152]: 2025-11-29 07:09:39.540 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:39 np0005539504 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000057.scope: Deactivated successfully.
Nov 29 02:09:39 np0005539504 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000057.scope: Consumed 15.003s CPU time.
Nov 29 02:09:39 np0005539504 systemd-machined[153423]: Machine qemu-44-instance-00000057 terminated.
Nov 29 02:09:39 np0005539504 nova_compute[187152]: 2025-11-29 07:09:39.746 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:39 np0005539504 nova_compute[187152]: 2025-11-29 07:09:39.750 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.015 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.240 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:06:41 10.100.0.10'], port_security=['fa:16:3e:77:06:41 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '23cc8968-d9b9-42dc-b458-0683a72a0194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=18ad87ad-fee6-484b-81da-6889ed2a9af1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.241 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 18ad87ad-fee6-484b-81da-6889ed2a9af1 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.243 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.245 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[487e5238-5a07-47d0-9485-f0fb929af9dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.246 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.341 187156 INFO nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.346 187156 INFO nova.virt.libvirt.driver [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance destroyed successfully.#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.348 187156 DEBUG nova.virt.libvirt.vif [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1325280827',display_name='tempest-ServerActionsTestJSON-server-1325280827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1325280827',id=87,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-4ixfsrgy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:09:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=23cc8968-d9b9-42dc-b458-0683a72a0194,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:77:06:41"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.349 187156 DEBUG nova.network.os_vif_util [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:77:06:41"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.350 187156 DEBUG nova.network.os_vif_util [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.350 187156 DEBUG os_vif [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.352 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.352 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18ad87ad-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.354 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.355 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.357 187156 INFO os_vif [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe')#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.361 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.419 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.420 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:40 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228014]: [NOTICE]   (228018) : haproxy version is 2.8.14-c23fe91
Nov 29 02:09:40 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228014]: [NOTICE]   (228018) : path to executable is /usr/sbin/haproxy
Nov 29 02:09:40 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228014]: [WARNING]  (228018) : Exiting Master process...
Nov 29 02:09:40 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228014]: [ALERT]    (228018) : Current worker (228020) exited with code 143 (Terminated)
Nov 29 02:09:40 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228014]: [WARNING]  (228018) : All workers exited. Exiting... (0)
Nov 29 02:09:40 np0005539504 systemd[1]: libpod-02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50.scope: Deactivated successfully.
Nov 29 02:09:40 np0005539504 podman[228267]: 2025-11-29 07:09:40.439064407 +0000 UTC m=+0.093498056 container died 02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:09:40 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50-userdata-shm.mount: Deactivated successfully.
Nov 29 02:09:40 np0005539504 systemd[1]: var-lib-containers-storage-overlay-051504b44567ee63e8fed7865c5246e1ec4183b5536d3864af7b1219f13d46c8-merged.mount: Deactivated successfully.
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.488 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.489 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Copying file /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_resize/disk to 192.168.122.102:/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.490 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_resize/disk 192.168.122.102:/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:40 np0005539504 podman[228267]: 2025-11-29 07:09:40.490985725 +0000 UTC m=+0.145419374 container cleanup 02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 02:09:40 np0005539504 systemd[1]: libpod-conmon-02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50.scope: Deactivated successfully.
Nov 29 02:09:40 np0005539504 podman[228302]: 2025-11-29 07:09:40.556959151 +0000 UTC m=+0.044858128 container remove 02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.562 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[32deff5a-f26a-4abb-893f-8a5f70220649]: (4, ('Sat Nov 29 07:09:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50)\n02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50\nSat Nov 29 07:09:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50)\n02aa9c0bbabab70e90e3db7456508f33b97b27c999183ce5015630e8cd0acd50\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.563 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[31a7589b-7d0a-43ce-9ac1-0a69257c500b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.565 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.566 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:40 np0005539504 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.579 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.583 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[583b4072-682b-48e9-aea8-05488f0dbf85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.598 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e66a1713-5d24-47b3-8920-96438ccd71b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.600 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cd375f-c1d6-4c86-8c2b-25a582eff3f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.617 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a93de109-8ddd-4eba-b3b2-a025ded52cbd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560338, 'reachable_time': 38847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228318, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.621 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:09:40 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:40.621 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[95540df0-7061-4718-8b01-d72887066ca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:40 np0005539504 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.679 187156 DEBUG nova.compute.manager [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.679 187156 DEBUG oslo_concurrency.lockutils [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.680 187156 DEBUG oslo_concurrency.lockutils [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.680 187156 DEBUG oslo_concurrency.lockutils [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.680 187156 DEBUG nova.compute.manager [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:40 np0005539504 nova_compute[187152]: 2025-11-29 07:09:40.681 187156 WARNING nova.compute.manager [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:09:41 np0005539504 nova_compute[187152]: 2025-11-29 07:09:41.036 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "scp -r /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_resize/disk 192.168.122.102:/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk" returned: 0 in 0.546s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:41 np0005539504 nova_compute[187152]: 2025-11-29 07:09:41.037 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Copying file /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:09:41 np0005539504 nova_compute[187152]: 2025-11-29 07:09:41.037 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_resize/disk.config 192.168.122.102:/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:41 np0005539504 nova_compute[187152]: 2025-11-29 07:09:41.661 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "scp -C -r /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_resize/disk.config 192.168.122.102:/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config" returned: 0 in 0.624s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:41 np0005539504 nova_compute[187152]: 2025-11-29 07:09:41.663 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Copying file /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:09:41 np0005539504 nova_compute[187152]: 2025-11-29 07:09:41.663 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_resize/disk.info 192.168.122.102:/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.067 187156 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "scp -C -r /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_resize/disk.info 192.168.122.102:/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.304 187156 DEBUG neutronclient.v2_0.client [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 18ad87ad-fee6-484b-81da-6889ed2a9af1 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.443 187156 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.444 187156 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.445 187156 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.793 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.802 187156 DEBUG nova.compute.manager [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.802 187156 DEBUG oslo_concurrency.lockutils [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.802 187156 DEBUG oslo_concurrency.lockutils [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.803 187156 DEBUG oslo_concurrency.lockutils [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.803 187156 DEBUG nova.compute.manager [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:42 np0005539504 nova_compute[187152]: 2025-11-29 07:09:42.803 187156 WARNING nova.compute.manager [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:09:43 np0005539504 nova_compute[187152]: 2025-11-29 07:09:43.517 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:44 np0005539504 nova_compute[187152]: 2025-11-29 07:09:44.056 187156 DEBUG nova.compute.manager [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:44 np0005539504 nova_compute[187152]: 2025-11-29 07:09:44.057 187156 DEBUG nova.compute.manager [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing instance network info cache due to event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:09:44 np0005539504 nova_compute[187152]: 2025-11-29 07:09:44.058 187156 DEBUG oslo_concurrency.lockutils [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:44 np0005539504 nova_compute[187152]: 2025-11-29 07:09:44.058 187156 DEBUG oslo_concurrency.lockutils [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:44 np0005539504 nova_compute[187152]: 2025-11-29 07:09:44.058 187156 DEBUG nova.network.neutron [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:09:45 np0005539504 nova_compute[187152]: 2025-11-29 07:09:45.017 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:45 np0005539504 nova_compute[187152]: 2025-11-29 07:09:45.354 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539504 podman[228323]: 2025-11-29 07:09:46.711542621 +0000 UTC m=+0.056627775 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.300 187156 DEBUG nova.network.neutron [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updated VIF entry in instance network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.301 187156 DEBUG nova.network.neutron [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.387 187156 DEBUG oslo_concurrency.lockutils [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.494 187156 DEBUG nova.compute.manager [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.494 187156 DEBUG oslo_concurrency.lockutils [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.495 187156 DEBUG oslo_concurrency.lockutils [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.495 187156 DEBUG oslo_concurrency.lockutils [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.495 187156 DEBUG nova.compute.manager [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.495 187156 WARNING nova.compute.manager [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:09:47 np0005539504 nova_compute[187152]: 2025-11-29 07:09:47.568 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:48 np0005539504 podman[228344]: 2025-11-29 07:09:48.715616858 +0000 UTC m=+0.061077135 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:09:48 np0005539504 podman[228345]: 2025-11-29 07:09:48.726368506 +0000 UTC m=+0.069640824 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.616 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.617 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.654 187156 DEBUG nova.compute.manager [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.655 187156 DEBUG oslo_concurrency.lockutils [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.655 187156 DEBUG oslo_concurrency.lockutils [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.656 187156 DEBUG oslo_concurrency.lockutils [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.656 187156 DEBUG nova.compute.manager [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.656 187156 WARNING nova.compute.manager [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.661 187156 DEBUG nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.839 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.840 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.848 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:09:49 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.849 187156 INFO nova.compute.claims [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:09:50 np0005539504 nova_compute[187152]: 2025-11-29 07:09:49.999 187156 DEBUG nova.compute.provider_tree [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:09:50 np0005539504 nova_compute[187152]: 2025-11-29 07:09:50.018 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:50 np0005539504 nova_compute[187152]: 2025-11-29 07:09:50.056 187156 DEBUG nova.scheduler.client.report [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:09:50 np0005539504 nova_compute[187152]: 2025-11-29 07:09:50.356 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:50 np0005539504 nova_compute[187152]: 2025-11-29 07:09:50.473 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:50 np0005539504 nova_compute[187152]: 2025-11-29 07:09:50.474 187156 DEBUG nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.621 187156 DEBUG nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.622 187156 DEBUG nova.network.neutron [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:09:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:52.646 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:52.647 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.692 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.765 187156 INFO nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.791 187156 DEBUG nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.978 187156 DEBUG nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.979 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.980 187156 INFO nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Creating image(s)#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.980 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "/var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.981 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "/var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.981 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "/var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:52 np0005539504 nova_compute[187152]: 2025-11-29 07:09:52.996 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.052 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.054 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.055 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.076 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.130 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.132 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.166 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.167 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.167 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.227 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.228 187156 DEBUG nova.virt.disk.api [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Checking if we can resize image /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.229 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.290 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.290 187156 DEBUG nova.virt.disk.api [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Cannot resize image /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.291 187156 DEBUG nova.objects.instance [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lazy-loading 'migration_context' on Instance uuid 26775c5b-f8ee-4576-b5c1-49f7cefff38c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.615 187156 DEBUG nova.policy [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b3503486e3f043388733bf6cbb6c4521', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd35b30df459c452a9c28cede8ac3666b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:09:53 np0005539504 podman[228402]: 2025-11-29 07:09:53.713632901 +0000 UTC m=+0.056405119 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:09:53 np0005539504 podman[228403]: 2025-11-29 07:09:53.744281846 +0000 UTC m=+0.084808003 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:09:53 np0005539504 nova_compute[187152]: 2025-11-29 07:09:53.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:54 np0005539504 nova_compute[187152]: 2025-11-29 07:09:54.787 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400179.7854366, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:54 np0005539504 nova_compute[187152]: 2025-11-29 07:09:54.788 187156 INFO nova.compute.manager [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:09:54 np0005539504 nova_compute[187152]: 2025-11-29 07:09:54.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.021 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.358 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.821 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.822 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Ensure instance console log exists: /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.823 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.824 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.824 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.827 187156 DEBUG nova.compute.manager [None req-61e2b363-0ee6-47bd-94d6-677d5cb9050b - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.831 187156 DEBUG nova.compute.manager [None req-61e2b363-0ee6-47bd-94d6-677d5cb9050b - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.854 187156 INFO nova.compute.manager [None req-61e2b363-0ee6-47bd-94d6-677d5cb9050b - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:55 np0005539504 nova_compute[187152]: 2025-11-29 07:09:55.936 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:09:56 np0005539504 nova_compute[187152]: 2025-11-29 07:09:56.868 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:09:57.649 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:57 np0005539504 nova_compute[187152]: 2025-11-29 07:09:57.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:57 np0005539504 nova_compute[187152]: 2025-11-29 07:09:57.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:09:57 np0005539504 nova_compute[187152]: 2025-11-29 07:09:57.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:09:57 np0005539504 nova_compute[187152]: 2025-11-29 07:09:57.991 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:09:57 np0005539504 nova_compute[187152]: 2025-11-29 07:09:57.991 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:09:58 np0005539504 nova_compute[187152]: 2025-11-29 07:09:58.100 187156 DEBUG nova.network.neutron [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Successfully created port: 0f21aa98-0955-47ce-8d57-57a53b967dbd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.249 187156 DEBUG nova.network.neutron [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Successfully updated port: 0f21aa98-0955-47ce-8d57-57a53b967dbd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.270 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "refresh_cache-26775c5b-f8ee-4576-b5c1-49f7cefff38c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.270 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquired lock "refresh_cache-26775c5b-f8ee-4576-b5c1-49f7cefff38c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.270 187156 DEBUG nova.network.neutron [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.367 187156 DEBUG nova.compute.manager [req-8a9ddf96-abf1-4302-8cb9-f2553778a3f1 req-a0e38764-05c1-469c-afd0-f0420256f5d9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Received event network-changed-0f21aa98-0955-47ce-8d57-57a53b967dbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.368 187156 DEBUG nova.compute.manager [req-8a9ddf96-abf1-4302-8cb9-f2553778a3f1 req-a0e38764-05c1-469c-afd0-f0420256f5d9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Refreshing instance network info cache due to event network-changed-0f21aa98-0955-47ce-8d57-57a53b967dbd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.368 187156 DEBUG oslo_concurrency.lockutils [req-8a9ddf96-abf1-4302-8cb9-f2553778a3f1 req-a0e38764-05c1-469c-afd0-f0420256f5d9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-26775c5b-f8ee-4576-b5c1-49f7cefff38c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.428 187156 DEBUG nova.network.neutron [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.777 187156 DEBUG nova.compute.manager [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.778 187156 DEBUG oslo_concurrency.lockutils [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.778 187156 DEBUG oslo_concurrency.lockutils [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.779 187156 DEBUG oslo_concurrency.lockutils [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.779 187156 DEBUG nova.compute.manager [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:59 np0005539504 nova_compute[187152]: 2025-11-29 07:09:59.779 187156 WARNING nova.compute.manager [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:10:00 np0005539504 nova_compute[187152]: 2025-11-29 07:10:00.023 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539504 nova_compute[187152]: 2025-11-29 07:10:00.364 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539504 nova_compute[187152]: 2025-11-29 07:10:00.579 187156 INFO nova.compute.manager [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Swapping old allocation on dict_keys(['1c526389-06f6-4ffd-8e90-a84c6c39f0bc']) held by migration 238f6481-057e-4ef4-9bf5-568723a7c569 for instance#033[00m
Nov 29 02:10:00 np0005539504 nova_compute[187152]: 2025-11-29 07:10:00.624 187156 DEBUG nova.scheduler.client.report [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Overwriting current allocation {'allocations': {'2d55ea77-8118-4f48-9bb5-d62d10fd53c0': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 59}}, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'consumer_generation': 1} on consumer 23cc8968-d9b9-42dc-b458-0683a72a0194 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Nov 29 02:10:00 np0005539504 nova_compute[187152]: 2025-11-29 07:10:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.008 187156 INFO nova.network.neutron [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating port 18ad87ad-fee6-484b-81da-6889ed2a9af1 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.170 187156 DEBUG nova.network.neutron [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Updating instance_info_cache with network_info: [{"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.206 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Releasing lock "refresh_cache-26775c5b-f8ee-4576-b5c1-49f7cefff38c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.207 187156 DEBUG nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Instance network_info: |[{"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.207 187156 DEBUG oslo_concurrency.lockutils [req-8a9ddf96-abf1-4302-8cb9-f2553778a3f1 req-a0e38764-05c1-469c-afd0-f0420256f5d9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-26775c5b-f8ee-4576-b5c1-49f7cefff38c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.208 187156 DEBUG nova.network.neutron [req-8a9ddf96-abf1-4302-8cb9-f2553778a3f1 req-a0e38764-05c1-469c-afd0-f0420256f5d9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Refreshing network info cache for port 0f21aa98-0955-47ce-8d57-57a53b967dbd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.212 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Start _get_guest_xml network_info=[{"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.219 187156 WARNING nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.226 187156 DEBUG nova.virt.libvirt.host [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.227 187156 DEBUG nova.virt.libvirt.host [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.235 187156 DEBUG nova.virt.libvirt.host [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.235 187156 DEBUG nova.virt.libvirt.host [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.236 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.237 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.237 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.237 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.237 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.238 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.238 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.238 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.238 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.238 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.239 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.239 187156 DEBUG nova.virt.hardware [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.242 187156 DEBUG nova.virt.libvirt.vif [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:09:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1290812695',display_name='tempest-ImagesOneServerTestJSON-server-1290812695',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1290812695',id=92,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d35b30df459c452a9c28cede8ac3666b',ramdisk_id='',reservation_id='r-n212rgr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1137583721',owner_user_name='tempest-ImagesOneSe
rverTestJSON-1137583721-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:52Z,user_data=None,user_id='b3503486e3f043388733bf6cbb6c4521',uuid=26775c5b-f8ee-4576-b5c1-49f7cefff38c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.242 187156 DEBUG nova.network.os_vif_util [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Converting VIF {"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.243 187156 DEBUG nova.network.os_vif_util [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:ae:b2,bridge_name='br-int',has_traffic_filtering=True,id=0f21aa98-0955-47ce-8d57-57a53b967dbd,network=Network(31c17588-ffc8-47d5-a31a-8aad693c067c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f21aa98-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.244 187156 DEBUG nova.objects.instance [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lazy-loading 'pci_devices' on Instance uuid 26775c5b-f8ee-4576-b5c1-49f7cefff38c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.259 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <uuid>26775c5b-f8ee-4576-b5c1-49f7cefff38c</uuid>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <name>instance-0000005c</name>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <nova:name>tempest-ImagesOneServerTestJSON-server-1290812695</nova:name>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:10:01</nova:creationTime>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:        <nova:user uuid="b3503486e3f043388733bf6cbb6c4521">tempest-ImagesOneServerTestJSON-1137583721-project-member</nova:user>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:        <nova:project uuid="d35b30df459c452a9c28cede8ac3666b">tempest-ImagesOneServerTestJSON-1137583721</nova:project>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:        <nova:port uuid="0f21aa98-0955-47ce-8d57-57a53b967dbd">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <entry name="serial">26775c5b-f8ee-4576-b5c1-49f7cefff38c</entry>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <entry name="uuid">26775c5b-f8ee-4576-b5c1-49f7cefff38c</entry>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk.config"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:18:ae:b2"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <target dev="tap0f21aa98-09"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/console.log" append="off"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:10:01 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:10:01 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:10:01 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:10:01 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.260 187156 DEBUG nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Preparing to wait for external event network-vif-plugged-0f21aa98-0955-47ce-8d57-57a53b967dbd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.261 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.261 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.261 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.262 187156 DEBUG nova.virt.libvirt.vif [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:09:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1290812695',display_name='tempest-ImagesOneServerTestJSON-server-1290812695',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1290812695',id=92,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d35b30df459c452a9c28cede8ac3666b',ramdisk_id='',reservation_id='r-n212rgr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1137583721',owner_user_name='tempest-I
magesOneServerTestJSON-1137583721-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:52Z,user_data=None,user_id='b3503486e3f043388733bf6cbb6c4521',uuid=26775c5b-f8ee-4576-b5c1-49f7cefff38c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.262 187156 DEBUG nova.network.os_vif_util [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Converting VIF {"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.263 187156 DEBUG nova.network.os_vif_util [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:ae:b2,bridge_name='br-int',has_traffic_filtering=True,id=0f21aa98-0955-47ce-8d57-57a53b967dbd,network=Network(31c17588-ffc8-47d5-a31a-8aad693c067c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f21aa98-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.263 187156 DEBUG os_vif [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:ae:b2,bridge_name='br-int',has_traffic_filtering=True,id=0f21aa98-0955-47ce-8d57-57a53b967dbd,network=Network(31c17588-ffc8-47d5-a31a-8aad693c067c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f21aa98-09') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.264 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.264 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.264 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.270 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.270 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f21aa98-09, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.270 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0f21aa98-09, col_values=(('external_ids', {'iface-id': '0f21aa98-0955-47ce-8d57-57a53b967dbd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:ae:b2', 'vm-uuid': '26775c5b-f8ee-4576-b5c1-49f7cefff38c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.272 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:01 np0005539504 NetworkManager[55210]: <info>  [1764400201.2730] manager: (tap0f21aa98-09): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.273 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.277 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.278 187156 INFO os_vif [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:ae:b2,bridge_name='br-int',has_traffic_filtering=True,id=0f21aa98-0955-47ce-8d57-57a53b967dbd,network=Network(31c17588-ffc8-47d5-a31a-8aad693c067c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f21aa98-09')#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.348 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.348 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.348 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] No VIF found with MAC fa:16:3e:18:ae:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.349 187156 INFO nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Using config drive#033[00m
Nov 29 02:10:01 np0005539504 podman[228456]: 2025-11-29 07:10:01.380592365 +0000 UTC m=+0.068554476 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.793 187156 INFO nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Creating config drive at /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk.config#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.800 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1tow_2gp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:01 np0005539504 nova_compute[187152]: 2025-11-29 07:10:01.926 187156 DEBUG oslo_concurrency.processutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1tow_2gp" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:02 np0005539504 kernel: tap0f21aa98-09: entered promiscuous mode
Nov 29 02:10:02 np0005539504 NetworkManager[55210]: <info>  [1764400202.0176] manager: (tap0f21aa98-09): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.018 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:02Z|00314|binding|INFO|Claiming lport 0f21aa98-0955-47ce-8d57-57a53b967dbd for this chassis.
Nov 29 02:10:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:02Z|00315|binding|INFO|0f21aa98-0955-47ce-8d57-57a53b967dbd: Claiming fa:16:3e:18:ae:b2 10.100.0.8
Nov 29 02:10:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:02Z|00316|binding|INFO|Setting lport 0f21aa98-0955-47ce-8d57-57a53b967dbd ovn-installed in OVS
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.032 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.034 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:02 np0005539504 systemd-udevd[228492]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:10:02 np0005539504 NetworkManager[55210]: <info>  [1764400202.0577] device (tap0f21aa98-09): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:10:02 np0005539504 NetworkManager[55210]: <info>  [1764400202.0585] device (tap0f21aa98-09): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:10:02 np0005539504 systemd-machined[153423]: New machine qemu-45-instance-0000005c.
Nov 29 02:10:02 np0005539504 systemd[1]: Started Virtual Machine qemu-45-instance-0000005c.
Nov 29 02:10:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:02Z|00317|binding|INFO|Setting lport 0f21aa98-0955-47ce-8d57-57a53b967dbd up in Southbound
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.367 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:ae:b2 10.100.0.8'], port_security=['fa:16:3e:18:ae:b2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '26775c5b-f8ee-4576-b5c1-49f7cefff38c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31c17588-ffc8-47d5-a31a-8aad693c067c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd35b30df459c452a9c28cede8ac3666b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9306e0a1-6a16-462f-82a5-9784e972844b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c650f9d7-64cc-4240-ae67-25d102dd42ea, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=0f21aa98-0955-47ce-8d57-57a53b967dbd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.370 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 0f21aa98-0955-47ce-8d57-57a53b967dbd in datapath 31c17588-ffc8-47d5-a31a-8aad693c067c bound to our chassis#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.371 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31c17588-ffc8-47d5-a31a-8aad693c067c#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.376 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400202.3760471, 26775c5b-f8ee-4576-b5c1-49f7cefff38c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.376 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] VM Started (Lifecycle Event)#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.387 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[12b6e602-396c-4d36-ac3a-0ea3ce88c54a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.388 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap31c17588-f1 in ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.390 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap31c17588-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.390 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eb8e3d35-7ea5-4961-9a13-14b62e62a4df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.391 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2e8d1add-c9c2-4d88-932f-48feeb82c9cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.404 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[aee2914f-69db-4ba2-9829-e742caf29c7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.417 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf5e4d5-2237-4316-be6f-4b9cf28db4dc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.445 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[59ed6987-63ce-4780-8357-8390f392f956]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 systemd-udevd[228495]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:10:02 np0005539504 NetworkManager[55210]: <info>  [1764400202.4559] manager: (tap31c17588-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/150)
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.455 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f30cf3d5-dd04-4a46-88bb-a12b36d06a7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.457 187156 DEBUG nova.compute.manager [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.458 187156 DEBUG oslo_concurrency.lockutils [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.458 187156 DEBUG oslo_concurrency.lockutils [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.459 187156 DEBUG oslo_concurrency.lockutils [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.459 187156 DEBUG nova.compute.manager [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.459 187156 WARNING nova.compute.manager [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.488 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4ddbc511-baf3-4267-91f5-745c11084e4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.492 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[9e480f64-8253-417c-8c93-4854f2e2d8b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.495 187156 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.495 187156 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.496 187156 DEBUG nova.network.neutron [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.501 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.505 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400202.3787298, 26775c5b-f8ee-4576-b5c1-49f7cefff38c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.506 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:10:02 np0005539504 NetworkManager[55210]: <info>  [1764400202.5169] device (tap31c17588-f0): carrier: link connected
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.523 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea7311f-e00e-4a61-8f51-272a30bd572d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.535 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.538 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.541 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d6633fda-dbfa-497b-ab85-b2762fe12bfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31c17588-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:43:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566539, 'reachable_time': 18731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228533, 'error': None, 'target': 'ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.558 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f235ddc1-9102-4ff0-8e3f-db42cc54dc6f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:43ec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 566539, 'tstamp': 566539}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228534, 'error': None, 'target': 'ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.582 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2e105b9c-f7c4-4854-8071-55e42ec4853f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31c17588-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:43:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566539, 'reachable_time': 18731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228535, 'error': None, 'target': 'ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.597 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.617 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bffabf7d-5c06-4255-ae83-18676aba1ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.668 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[813a7915-90e1-4d01-a49f-6e8ff0ae3427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.670 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31c17588-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.670 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.670 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31c17588-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.675 187156 DEBUG nova.compute.manager [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.676 187156 DEBUG nova.compute.manager [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing instance network info cache due to event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.676 187156 DEBUG oslo_concurrency.lockutils [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:10:02 np0005539504 NetworkManager[55210]: <info>  [1764400202.7049] manager: (tap31c17588-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Nov 29 02:10:02 np0005539504 kernel: tap31c17588-f0: entered promiscuous mode
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.704 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.707 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31c17588-f0, col_values=(('external_ids', {'iface-id': '23ed72a2-c029-4964-a378-5eb188f1396a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:02Z|00318|binding|INFO|Releasing lport 23ed72a2-c029-4964-a378-5eb188f1396a from this chassis (sb_readonly=0)
Nov 29 02:10:02 np0005539504 nova_compute[187152]: 2025-11-29 07:10:02.729 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.730 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/31c17588-ffc8-47d5-a31a-8aad693c067c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/31c17588-ffc8-47d5-a31a-8aad693c067c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.731 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[07f48722-c499-4e8e-95bd-31a63b66705d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.732 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-31c17588-ffc8-47d5-a31a-8aad693c067c
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/31c17588-ffc8-47d5-a31a-8aad693c067c.pid.haproxy
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 31c17588-ffc8-47d5-a31a-8aad693c067c
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:10:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:02.732 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c', 'env', 'PROCESS_TAG=haproxy-31c17588-ffc8-47d5-a31a-8aad693c067c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/31c17588-ffc8-47d5-a31a-8aad693c067c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.014 187156 DEBUG nova.compute.manager [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Received event network-vif-plugged-0f21aa98-0955-47ce-8d57-57a53b967dbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.015 187156 DEBUG oslo_concurrency.lockutils [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.015 187156 DEBUG oslo_concurrency.lockutils [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.015 187156 DEBUG oslo_concurrency.lockutils [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.015 187156 DEBUG nova.compute.manager [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Processing event network-vif-plugged-0f21aa98-0955-47ce-8d57-57a53b967dbd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.016 187156 DEBUG nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.020 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400203.0199065, 26775c5b-f8ee-4576-b5c1-49f7cefff38c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.020 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.022 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.026 187156 INFO nova.virt.libvirt.driver [-] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Instance spawned successfully.#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.028 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.071 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.076 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.077 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.078 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.079 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.080 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.081 187156 DEBUG nova.virt.libvirt.driver [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.092 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:10:03 np0005539504 podman[228567]: 2025-11-29 07:10:03.098614793 +0000 UTC m=+0.030765539 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.209 187156 DEBUG nova.network.neutron [req-8a9ddf96-abf1-4302-8cb9-f2553778a3f1 req-a0e38764-05c1-469c-afd0-f0420256f5d9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Updated VIF entry in instance network info cache for port 0f21aa98-0955-47ce-8d57-57a53b967dbd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.210 187156 DEBUG nova.network.neutron [req-8a9ddf96-abf1-4302-8cb9-f2553778a3f1 req-a0e38764-05c1-469c-afd0-f0420256f5d9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Updating instance_info_cache with network_info: [{"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.298 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.303 187156 DEBUG oslo_concurrency.lockutils [req-8a9ddf96-abf1-4302-8cb9-f2553778a3f1 req-a0e38764-05c1-469c-afd0-f0420256f5d9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-26775c5b-f8ee-4576-b5c1-49f7cefff38c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:03 np0005539504 podman[228567]: 2025-11-29 07:10:03.323316721 +0000 UTC m=+0.255467417 container create 22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:10:03 np0005539504 systemd[1]: Started libpod-conmon-22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b.scope.
Nov 29 02:10:03 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:10:03 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e401dbe0f2ff0b8162cc3f4f85743f1eaf2fe68f2405e93b5b64e427056ded8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:10:03 np0005539504 podman[228567]: 2025-11-29 07:10:03.610031637 +0000 UTC m=+0.542182433 container init 22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:03 np0005539504 podman[228567]: 2025-11-29 07:10:03.621748053 +0000 UTC m=+0.553898779 container start 22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.648 187156 INFO nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Took 10.67 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.648 187156 DEBUG nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:03 np0005539504 neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c[228582]: [NOTICE]   (228586) : New worker (228588) forked
Nov 29 02:10:03 np0005539504 neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c[228582]: [NOTICE]   (228586) : Loading success.
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.989 187156 INFO nova.compute.manager [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Took 14.23 seconds to build instance.#033[00m
Nov 29 02:10:03 np0005539504 nova_compute[187152]: 2025-11-29 07:10:03.999 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.044 187156 DEBUG oslo_concurrency.lockutils [None req-a8ab349e-668c-4ddc-82c1-59afb65fbe70 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.427s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.050 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.051 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.051 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.051 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.138 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000057, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.142 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.199 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.201 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.266 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.435 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.436 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5620MB free_disk=73.16341018676758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.436 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.437 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.540 187156 DEBUG nova.network.neutron [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.544 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 23cc8968-d9b9-42dc-b458-0683a72a0194 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.544 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 26775c5b-f8ee-4576-b5c1-49f7cefff38c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.544 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.545 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.575 187156 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.576 187156 DEBUG nova.virt.libvirt.driver [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.578 187156 DEBUG oslo_concurrency.lockutils [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.578 187156 DEBUG nova.network.neutron [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.587 187156 DEBUG nova.virt.libvirt.driver [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Start _get_guest_xml network_info=[{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.591 187156 WARNING nova.virt.libvirt.driver [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.604 187156 DEBUG nova.virt.libvirt.host [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.605 187156 DEBUG nova.virt.libvirt.host [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.608 187156 DEBUG nova.virt.libvirt.host [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.609 187156 DEBUG nova.virt.libvirt.host [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.610 187156 DEBUG nova.virt.libvirt.driver [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.611 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.611 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.612 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.612 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.612 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.612 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.613 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.613 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.613 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.614 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.614 187156 DEBUG nova.virt.hardware [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.614 187156 DEBUG nova.objects.instance [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.625 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.632 187156 DEBUG oslo_concurrency.processutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.651 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.688 187156 DEBUG oslo_concurrency.processutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.690 187156 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.690 187156 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.691 187156 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.693 187156 DEBUG nova.virt.libvirt.vif [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:08:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1325280827',display_name='tempest-ServerActionsTestJSON-server-1325280827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1325280827',id=87,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-4ixfsrgy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:09:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=23cc8968-d9b9-42dc-b458-0683a72a0194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.693 187156 DEBUG nova.network.os_vif_util [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.694 187156 DEBUG nova.network.os_vif_util [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.697 187156 DEBUG nova.virt.libvirt.driver [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <uuid>23cc8968-d9b9-42dc-b458-0683a72a0194</uuid>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <name>instance-00000057</name>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestJSON-server-1325280827</nova:name>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:10:04</nova:creationTime>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:        <nova:port uuid="18ad87ad-fee6-484b-81da-6889ed2a9af1">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <entry name="serial">23cc8968-d9b9-42dc-b458-0683a72a0194</entry>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <entry name="uuid">23cc8968-d9b9-42dc-b458-0683a72a0194</entry>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:77:06:41"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <target dev="tap18ad87ad-fe"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/console.log" append="off"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:10:04 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:10:04 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:10:04 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:10:04 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.708 187156 DEBUG nova.compute.manager [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Preparing to wait for external event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.708 187156 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.708 187156 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.709 187156 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.709 187156 DEBUG nova.virt.libvirt.vif [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:08:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1325280827',display_name='tempest-ServerActionsTestJSON-server-1325280827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1325280827',id=87,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-4ixfsrgy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:09:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=23cc8968-d9b9-42dc-b458-0683a72a0194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.710 187156 DEBUG nova.network.os_vif_util [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.710 187156 DEBUG nova.network.os_vif_util [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.711 187156 DEBUG os_vif [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.711 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.712 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.713 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.716 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.717 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18ad87ad-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.717 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap18ad87ad-fe, col_values=(('external_ids', {'iface-id': '18ad87ad-fee6-484b-81da-6889ed2a9af1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:06:41', 'vm-uuid': '23cc8968-d9b9-42dc-b458-0683a72a0194'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.719 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:04 np0005539504 NetworkManager[55210]: <info>  [1764400204.7201] manager: (tap18ad87ad-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.722 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.726 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:04 np0005539504 nova_compute[187152]: 2025-11-29 07:10:04.727 187156 INFO os_vif [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe')#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.024 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:05 np0005539504 kernel: tap18ad87ad-fe: entered promiscuous mode
Nov 29 02:10:05 np0005539504 NetworkManager[55210]: <info>  [1764400205.1770] manager: (tap18ad87ad-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/153)
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.178 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:05Z|00319|binding|INFO|Claiming lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 for this chassis.
Nov 29 02:10:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:05Z|00320|binding|INFO|18ad87ad-fee6-484b-81da-6889ed2a9af1: Claiming fa:16:3e:77:06:41 10.100.0.10
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.194 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:05Z|00321|binding|INFO|Setting lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 ovn-installed in OVS
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.198 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:05Z|00322|binding|INFO|Setting lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 up in Southbound
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.208 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:06:41 10.100.0.10'], port_security=['fa:16:3e:77:06:41 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '23cc8968-d9b9-42dc-b458-0683a72a0194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '10', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=18ad87ad-fee6-484b-81da-6889ed2a9af1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.210 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 18ad87ad-fee6-484b-81da-6889ed2a9af1 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.212 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:10:05 np0005539504 systemd-udevd[228621]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.215 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.215 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:05 np0005539504 systemd-machined[153423]: New machine qemu-46-instance-00000057.
Nov 29 02:10:05 np0005539504 NetworkManager[55210]: <info>  [1764400205.2345] device (tap18ad87ad-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:10:05 np0005539504 NetworkManager[55210]: <info>  [1764400205.2356] device (tap18ad87ad-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.236 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[857a2c4c-305d-4638-bfef-f2e0da435df6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.237 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.239 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.240 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9d0ca590-0657-4268-a4ab-965b2a5659af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.241 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a55b440a-c31f-4039-a314-6c77a70e9512]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 systemd[1]: Started Virtual Machine qemu-46-instance-00000057.
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.260 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[db6cd8dd-e949-490a-88d7-0a279a51bf98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.291 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[112a7bfd-b001-4f39-8e14-722f6b62af40]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.322 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe1edd4-1012-4672-a910-c5404be1c7a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 NetworkManager[55210]: <info>  [1764400205.3298] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/154)
Nov 29 02:10:05 np0005539504 systemd-udevd[228626]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.330 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cdabad71-9a13-4833-923b-a32aa98abd98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.376 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[7a008fd9-e577-4cc1-b7f1-fc1449d604ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.381 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1ace03c1-df6c-4443-89fb-d5dc65d618d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 NetworkManager[55210]: <info>  [1764400205.4239] device (tap9226dea3-60): carrier: link connected
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.433 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[8a264d5e-0378-46bb-b2d9-e49142be2b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.453 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8abfae-16a1-4e5a-adf4-828485f09284]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566830, 'reachable_time': 39402, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228661, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.471 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a149b4da-7815-4940-a221-380d8b277021]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 566830, 'tstamp': 566830}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228663, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.488 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[20cc3daa-d9e6-4027-8d89-e324845c6473]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566830, 'reachable_time': 39402, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228664, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.521 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[18d375ad-339f-445f-8046-d3b898f8cfb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.530 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400205.5295122, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.530 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Started (Lifecycle Event)#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.548 187156 DEBUG nova.compute.manager [req-fb6da061-80fe-404c-9856-01b92053cc82 req-023dfee5-4521-4967-9bc8-83581128c08e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Received event network-vif-plugged-0f21aa98-0955-47ce-8d57-57a53b967dbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.549 187156 DEBUG oslo_concurrency.lockutils [req-fb6da061-80fe-404c-9856-01b92053cc82 req-023dfee5-4521-4967-9bc8-83581128c08e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.549 187156 DEBUG oslo_concurrency.lockutils [req-fb6da061-80fe-404c-9856-01b92053cc82 req-023dfee5-4521-4967-9bc8-83581128c08e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.549 187156 DEBUG oslo_concurrency.lockutils [req-fb6da061-80fe-404c-9856-01b92053cc82 req-023dfee5-4521-4967-9bc8-83581128c08e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.550 187156 DEBUG nova.compute.manager [req-fb6da061-80fe-404c-9856-01b92053cc82 req-023dfee5-4521-4967-9bc8-83581128c08e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] No waiting events found dispatching network-vif-plugged-0f21aa98-0955-47ce-8d57-57a53b967dbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.550 187156 WARNING nova.compute.manager [req-fb6da061-80fe-404c-9856-01b92053cc82 req-023dfee5-4521-4967-9bc8-83581128c08e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Received unexpected event network-vif-plugged-0f21aa98-0955-47ce-8d57-57a53b967dbd for instance with vm_state active and task_state image_snapshot_pending.#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.580 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9affa86f-f51b-463e-8f89-e170b5601215]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.582 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.583 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.583 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:05 np0005539504 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:10:05 np0005539504 NetworkManager[55210]: <info>  [1764400205.5864] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.585 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.589 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:05Z|00323|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.590 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.592 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.592 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.593 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[19d4c34c-e6aa-4aa4-b849-6b3d881d8541]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.594 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:10:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:05.594 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.607 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.763 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.769 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400205.5297294, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.770 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.821 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.826 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:10:05 np0005539504 nova_compute[187152]: 2025-11-29 07:10:05.871 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 29 02:10:06 np0005539504 podman[228697]: 2025-11-29 07:10:06.011731826 +0000 UTC m=+0.032720123 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.187 187156 DEBUG nova.compute.manager [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.189 187156 DEBUG oslo_concurrency.lockutils [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.189 187156 DEBUG oslo_concurrency.lockutils [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.190 187156 DEBUG oslo_concurrency.lockutils [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.190 187156 DEBUG nova.compute.manager [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Processing event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.191 187156 DEBUG nova.compute.manager [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.197 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400206.197138, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.197 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.208 187156 INFO nova.virt.libvirt.driver [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance running successfully.#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.209 187156 DEBUG nova.virt.libvirt.driver [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.461 187156 DEBUG nova.compute.manager [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:06 np0005539504 podman[228697]: 2025-11-29 07:10:06.540604369 +0000 UTC m=+0.561592626 container create eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.560 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.563 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:10:06 np0005539504 systemd[1]: Started libpod-conmon-eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93.scope.
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.623 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 29 02:10:06 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:10:06 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a991c92593107c6a8f63f049556dabe68ebdb0a20165a87c03a79ac948ec9ba8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:10:06 np0005539504 podman[228697]: 2025-11-29 07:10:06.666752765 +0000 UTC m=+0.687741042 container init eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:10:06 np0005539504 podman[228697]: 2025-11-29 07:10:06.672771526 +0000 UTC m=+0.693759783 container start eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 02:10:06 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228713]: [NOTICE]   (228717) : New worker (228719) forked
Nov 29 02:10:06 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228713]: [NOTICE]   (228717) : Loading success.
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.740 187156 INFO nova.compute.manager [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance to original state: 'active'#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.744 187156 INFO nova.compute.manager [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] instance snapshotting#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.974 187156 DEBUG nova.network.neutron [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updated VIF entry in instance network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:10:06 np0005539504 nova_compute[187152]: 2025-11-29 07:10:06.974 187156 DEBUG nova.network.neutron [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:07 np0005539504 nova_compute[187152]: 2025-11-29 07:10:07.156 187156 INFO nova.virt.libvirt.driver [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Beginning live snapshot process#033[00m
Nov 29 02:10:07 np0005539504 nova_compute[187152]: 2025-11-29 07:10:07.272 187156 DEBUG oslo_concurrency.lockutils [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:07 np0005539504 podman[228729]: 2025-11-29 07:10:07.714504913 +0000 UTC m=+0.055930436 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:10:08 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 02:10:08 np0005539504 nova_compute[187152]: 2025-11-29 07:10:08.457 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:08 np0005539504 nova_compute[187152]: 2025-11-29 07:10:08.551 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json -f qcow2" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:08 np0005539504 nova_compute[187152]: 2025-11-29 07:10:08.553 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:08 np0005539504 nova_compute[187152]: 2025-11-29 07:10:08.609 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json -f qcow2" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:08 np0005539504 nova_compute[187152]: 2025-11-29 07:10:08.622 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:08 np0005539504 nova_compute[187152]: 2025-11-29 07:10:08.683 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:08 np0005539504 nova_compute[187152]: 2025-11-29 07:10:08.684 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp_6cp4d8c/ae72f1b8bf8e49ab9a3a0e45c0963d71.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.036 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp_6cp4d8c/ae72f1b8bf8e49ab9a3a0e45c0963d71.delta 1073741824" returned: 0 in 0.352s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.038 187156 INFO nova.virt.libvirt.driver [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.047 187156 DEBUG nova.compute.manager [req-49414202-9910-41d1-a0ff-44717f7032b0 req-02e5834a-4dae-4ffa-ba36-d239a7930a5f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.048 187156 DEBUG oslo_concurrency.lockutils [req-49414202-9910-41d1-a0ff-44717f7032b0 req-02e5834a-4dae-4ffa-ba36-d239a7930a5f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.049 187156 DEBUG oslo_concurrency.lockutils [req-49414202-9910-41d1-a0ff-44717f7032b0 req-02e5834a-4dae-4ffa-ba36-d239a7930a5f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.049 187156 DEBUG oslo_concurrency.lockutils [req-49414202-9910-41d1-a0ff-44717f7032b0 req-02e5834a-4dae-4ffa-ba36-d239a7930a5f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.049 187156 DEBUG nova.compute.manager [req-49414202-9910-41d1-a0ff-44717f7032b0 req-02e5834a-4dae-4ffa-ba36-d239a7930a5f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.050 187156 WARNING nova.compute.manager [req-49414202-9910-41d1-a0ff-44717f7032b0 req-02e5834a-4dae-4ffa-ba36-d239a7930a5f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.112 187156 DEBUG nova.virt.libvirt.guest [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.117 187156 INFO nova.virt.libvirt.driver [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.246 187156 DEBUG nova.privsep.utils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.246 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp_6cp4d8c/ae72f1b8bf8e49ab9a3a0e45c0963d71.delta /var/lib/nova/instances/snapshots/tmp_6cp4d8c/ae72f1b8bf8e49ab9a3a0e45c0963d71 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:09 np0005539504 nova_compute[187152]: 2025-11-29 07:10:09.721 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:10 np0005539504 nova_compute[187152]: 2025-11-29 07:10:10.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:10 np0005539504 nova_compute[187152]: 2025-11-29 07:10:10.440 187156 DEBUG oslo_concurrency.processutils [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp_6cp4d8c/ae72f1b8bf8e49ab9a3a0e45c0963d71.delta /var/lib/nova/instances/snapshots/tmp_6cp4d8c/ae72f1b8bf8e49ab9a3a0e45c0963d71" returned: 0 in 1.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:10 np0005539504 nova_compute[187152]: 2025-11-29 07:10:10.441 187156 INFO nova.virt.libvirt.driver [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:10:13 np0005539504 nova_compute[187152]: 2025-11-29 07:10:13.267 187156 DEBUG oslo_concurrency.lockutils [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:13 np0005539504 nova_compute[187152]: 2025-11-29 07:10:13.268 187156 DEBUG oslo_concurrency.lockutils [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:13 np0005539504 nova_compute[187152]: 2025-11-29 07:10:13.268 187156 DEBUG oslo_concurrency.lockutils [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:13 np0005539504 nova_compute[187152]: 2025-11-29 07:10:13.269 187156 DEBUG oslo_concurrency.lockutils [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:13 np0005539504 nova_compute[187152]: 2025-11-29 07:10:13.269 187156 DEBUG oslo_concurrency.lockutils [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:14 np0005539504 nova_compute[187152]: 2025-11-29 07:10:14.726 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:15 np0005539504 nova_compute[187152]: 2025-11-29 07:10:15.029 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:15 np0005539504 nova_compute[187152]: 2025-11-29 07:10:15.946 187156 INFO nova.compute.manager [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Terminating instance#033[00m
Nov 29 02:10:15 np0005539504 nova_compute[187152]: 2025-11-29 07:10:15.962 187156 DEBUG nova.compute.manager [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:10:15 np0005539504 kernel: tap18ad87ad-fe (unregistering): left promiscuous mode
Nov 29 02:10:15 np0005539504 NetworkManager[55210]: <info>  [1764400215.9939] device (tap18ad87ad-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:10:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:15Z|00324|binding|INFO|Releasing lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 from this chassis (sb_readonly=0)
Nov 29 02:10:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:15Z|00325|binding|INFO|Setting lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 down in Southbound
Nov 29 02:10:15 np0005539504 nova_compute[187152]: 2025-11-29 07:10:15.996 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:15Z|00326|binding|INFO|Removing iface tap18ad87ad-fe ovn-installed in OVS
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.000 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:16 np0005539504 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000057.scope: Deactivated successfully.
Nov 29 02:10:16 np0005539504 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000057.scope: Consumed 10.559s CPU time.
Nov 29 02:10:16 np0005539504 systemd-machined[153423]: Machine qemu-46-instance-00000057 terminated.
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.252 187156 INFO nova.virt.libvirt.driver [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance destroyed successfully.#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.252 187156 DEBUG nova.objects.instance [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:16.320 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:06:41 10.100.0.10'], port_security=['fa:16:3e:77:06:41 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '23cc8968-d9b9-42dc-b458-0683a72a0194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '12', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=18ad87ad-fee6-484b-81da-6889ed2a9af1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:10:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:16.322 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 18ad87ad-fee6-484b-81da-6889ed2a9af1 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:10:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:16.324 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:10:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:16.326 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0f44ec32-5d0c-4708-bff2-7df89f98f520]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:16.327 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.914 187156 DEBUG nova.virt.libvirt.vif [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:08:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1325280827',display_name='tempest-ServerActionsTestJSON-server-1325280827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1325280827',id=87,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-4ixfsrgy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:10:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=23cc8968-d9b9-42dc-b458-0683a72a0194,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.916 187156 DEBUG nova.network.os_vif_util [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.917 187156 DEBUG nova.network.os_vif_util [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.918 187156 DEBUG os_vif [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.922 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.922 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18ad87ad-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.924 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.925 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.930 187156 INFO os_vif [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe')#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.931 187156 INFO nova.virt.libvirt.driver [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Deleting instance files /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_del#033[00m
Nov 29 02:10:16 np0005539504 nova_compute[187152]: 2025-11-29 07:10:16.936 187156 INFO nova.virt.libvirt.driver [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Deletion of /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_del complete#033[00m
Nov 29 02:10:17 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228713]: [NOTICE]   (228717) : haproxy version is 2.8.14-c23fe91
Nov 29 02:10:17 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228713]: [NOTICE]   (228717) : path to executable is /usr/sbin/haproxy
Nov 29 02:10:17 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228713]: [WARNING]  (228717) : Exiting Master process...
Nov 29 02:10:17 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228713]: [WARNING]  (228717) : Exiting Master process...
Nov 29 02:10:17 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228713]: [ALERT]    (228717) : Current worker (228719) exited with code 143 (Terminated)
Nov 29 02:10:17 np0005539504 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228713]: [WARNING]  (228717) : All workers exited. Exiting... (0)
Nov 29 02:10:17 np0005539504 systemd[1]: libpod-eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93.scope: Deactivated successfully.
Nov 29 02:10:17 np0005539504 podman[228830]: 2025-11-29 07:10:17.08095586 +0000 UTC m=+0.621991573 container died eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:17 np0005539504 nova_compute[187152]: 2025-11-29 07:10:17.238 187156 INFO nova.virt.libvirt.driver [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Snapshot image upload complete#033[00m
Nov 29 02:10:17 np0005539504 nova_compute[187152]: 2025-11-29 07:10:17.239 187156 INFO nova.compute.manager [None req-02e2ab75-9228-435c-b7a1-44bfc6193a2a b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Took 10.42 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 02:10:20 np0005539504 nova_compute[187152]: 2025-11-29 07:10:20.031 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:20 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93-userdata-shm.mount: Deactivated successfully.
Nov 29 02:10:20 np0005539504 systemd[1]: var-lib-containers-storage-overlay-a991c92593107c6a8f63f049556dabe68ebdb0a20165a87c03a79ac948ec9ba8-merged.mount: Deactivated successfully.
Nov 29 02:10:20 np0005539504 podman[228875]: 2025-11-29 07:10:20.686154825 +0000 UTC m=+1.200184980 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:10:20 np0005539504 podman[228876]: 2025-11-29 07:10:20.697305589 +0000 UTC m=+1.209159178 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Nov 29 02:10:20 np0005539504 nova_compute[187152]: 2025-11-29 07:10:20.734 187156 DEBUG nova.compute.manager [req-e70222fe-39db-4a13-a988-0e6ff8baef71 req-130c1578-43c9-405c-b1db-4d25c1c49f2c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:20 np0005539504 nova_compute[187152]: 2025-11-29 07:10:20.734 187156 DEBUG oslo_concurrency.lockutils [req-e70222fe-39db-4a13-a988-0e6ff8baef71 req-130c1578-43c9-405c-b1db-4d25c1c49f2c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:20 np0005539504 nova_compute[187152]: 2025-11-29 07:10:20.735 187156 DEBUG oslo_concurrency.lockutils [req-e70222fe-39db-4a13-a988-0e6ff8baef71 req-130c1578-43c9-405c-b1db-4d25c1c49f2c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:20 np0005539504 nova_compute[187152]: 2025-11-29 07:10:20.735 187156 DEBUG oslo_concurrency.lockutils [req-e70222fe-39db-4a13-a988-0e6ff8baef71 req-130c1578-43c9-405c-b1db-4d25c1c49f2c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:20 np0005539504 nova_compute[187152]: 2025-11-29 07:10:20.735 187156 DEBUG nova.compute.manager [req-e70222fe-39db-4a13-a988-0e6ff8baef71 req-130c1578-43c9-405c-b1db-4d25c1c49f2c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:20 np0005539504 nova_compute[187152]: 2025-11-29 07:10:20.736 187156 DEBUG nova.compute.manager [req-e70222fe-39db-4a13-a988-0e6ff8baef71 req-130c1578-43c9-405c-b1db-4d25c1c49f2c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:10:21 np0005539504 podman[228830]: 2025-11-29 07:10:21.123508821 +0000 UTC m=+4.664544504 container cleanup eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:10:21 np0005539504 systemd[1]: libpod-conmon-eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93.scope: Deactivated successfully.
Nov 29 02:10:21 np0005539504 nova_compute[187152]: 2025-11-29 07:10:21.136 187156 INFO nova.compute.manager [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Took 5.17 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:10:21 np0005539504 nova_compute[187152]: 2025-11-29 07:10:21.137 187156 DEBUG oslo.service.loopingcall [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:10:21 np0005539504 nova_compute[187152]: 2025-11-29 07:10:21.138 187156 DEBUG nova.compute.manager [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:10:21 np0005539504 nova_compute[187152]: 2025-11-29 07:10:21.138 187156 DEBUG nova.network.neutron [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:10:21 np0005539504 podman[228846]: 2025-11-29 07:10:21.608546448 +0000 UTC m=+4.488575176 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:10:21 np0005539504 podman[228926]: 2025-11-29 07:10:21.837265066 +0000 UTC m=+0.689751156 container remove eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:10:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:21.844 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[04d8a240-e121-4f01-ae27-510f5c5939c9]: (4, ('Sat Nov 29 07:10:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93)\neb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93\nSat Nov 29 07:10:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (eb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93)\neb8158795e0dfbc8de110edff6211dafb5de4e8a46d47a0211ebec8afcd48f93\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:21.846 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e33eef58-51b7-4cb4-b0a4-37ed236adf23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:21.847 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:21 np0005539504 nova_compute[187152]: 2025-11-29 07:10:21.850 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:21 np0005539504 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:10:21 np0005539504 nova_compute[187152]: 2025-11-29 07:10:21.853 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:21.861 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[771124f1-990a-4797-9676-335679a3f387]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:21 np0005539504 nova_compute[187152]: 2025-11-29 07:10:21.866 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:21.877 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1e35a08c-96b1-4709-81fa-275f3204faae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:21.878 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[33822cba-13ec-40a5-bc3e-b5fb6c04a59c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:21.900 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a78fecab-adb1-4ce8-84ec-b5aeb3832018]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566819, 'reachable_time': 37042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228947, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:21.904 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:10:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:21.904 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[386347e9-9191-453d-b964-9605d87983e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:21 np0005539504 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:10:21 np0005539504 nova_compute[187152]: 2025-11-29 07:10:21.924 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:22 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:22Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:18:ae:b2 10.100.0.8
Nov 29 02:10:22 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:22Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:18:ae:b2 10.100.0.8
Nov 29 02:10:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:22.927 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:22.928 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:22.928 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.846 187156 DEBUG nova.network.neutron [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.874 187156 INFO nova.compute.manager [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Took 2.74 seconds to deallocate network for instance.#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.969 187156 DEBUG oslo_concurrency.lockutils [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.970 187156 DEBUG oslo_concurrency.lockutils [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.974 187156 DEBUG nova.compute.manager [req-7d0ea96c-d8ab-4307-b21b-3fd793314b5a req-c76c8e58-70b1-45b7-a0ed-fb286a44648a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.975 187156 DEBUG oslo_concurrency.lockutils [req-7d0ea96c-d8ab-4307-b21b-3fd793314b5a req-c76c8e58-70b1-45b7-a0ed-fb286a44648a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.975 187156 DEBUG oslo_concurrency.lockutils [req-7d0ea96c-d8ab-4307-b21b-3fd793314b5a req-c76c8e58-70b1-45b7-a0ed-fb286a44648a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.975 187156 DEBUG oslo_concurrency.lockutils [req-7d0ea96c-d8ab-4307-b21b-3fd793314b5a req-c76c8e58-70b1-45b7-a0ed-fb286a44648a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.975 187156 DEBUG nova.compute.manager [req-7d0ea96c-d8ab-4307-b21b-3fd793314b5a req-c76c8e58-70b1-45b7-a0ed-fb286a44648a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:23 np0005539504 nova_compute[187152]: 2025-11-29 07:10:23.976 187156 WARNING nova.compute.manager [req-7d0ea96c-d8ab-4307-b21b-3fd793314b5a req-c76c8e58-70b1-45b7-a0ed-fb286a44648a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:10:24 np0005539504 nova_compute[187152]: 2025-11-29 07:10:24.087 187156 DEBUG nova.compute.provider_tree [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:10:24 np0005539504 nova_compute[187152]: 2025-11-29 07:10:24.106 187156 DEBUG nova.scheduler.client.report [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:10:24 np0005539504 nova_compute[187152]: 2025-11-29 07:10:24.137 187156 DEBUG oslo_concurrency.lockutils [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:24 np0005539504 nova_compute[187152]: 2025-11-29 07:10:24.171 187156 INFO nova.scheduler.client.report [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Deleted allocations for instance 23cc8968-d9b9-42dc-b458-0683a72a0194#033[00m
Nov 29 02:10:24 np0005539504 nova_compute[187152]: 2025-11-29 07:10:24.192 187156 DEBUG nova.compute.manager [req-f563e11a-ad0b-4951-8ad9-375ee12b5af5 req-074c75bf-3082-4f89-b878-245d147b1ae2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-deleted-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:24 np0005539504 nova_compute[187152]: 2025-11-29 07:10:24.277 187156 DEBUG oslo_concurrency.lockutils [None req-f788799b-e6d0-4bc1-b27e-f30c1454ac32 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:24 np0005539504 podman[228952]: 2025-11-29 07:10:24.724562282 +0000 UTC m=+0.059784752 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:10:24 np0005539504 podman[228953]: 2025-11-29 07:10:24.757285584 +0000 UTC m=+0.086591823 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:25 np0005539504 nova_compute[187152]: 2025-11-29 07:10:25.036 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:26 np0005539504 nova_compute[187152]: 2025-11-29 07:10:26.723 187156 DEBUG nova.compute.manager [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:26 np0005539504 nova_compute[187152]: 2025-11-29 07:10:26.824 187156 INFO nova.compute.manager [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] instance snapshotting#033[00m
Nov 29 02:10:26 np0005539504 nova_compute[187152]: 2025-11-29 07:10:26.926 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.045 187156 INFO nova.virt.libvirt.driver [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Beginning live snapshot process#033[00m
Nov 29 02:10:27 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.292 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.371 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json -f qcow2" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.373 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.470 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c/disk --force-share --output=json -f qcow2" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.495 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.563 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.564 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp7cb8gnv2/1e3e40c1a21149f3859021bf21016da3.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.606 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp7cb8gnv2/1e3e40c1a21149f3859021bf21016da3.delta 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.607 187156 INFO nova.virt.libvirt.driver [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 29 02:10:27 np0005539504 nova_compute[187152]: 2025-11-29 07:10:27.659 187156 DEBUG nova.virt.libvirt.guest [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] COPY block job progress, current cursor: 0 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:10:28 np0005539504 nova_compute[187152]: 2025-11-29 07:10:28.165 187156 DEBUG nova.virt.libvirt.guest [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] COPY block job progress, current cursor: 75366400 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:10:28 np0005539504 nova_compute[187152]: 2025-11-29 07:10:28.170 187156 INFO nova.virt.libvirt.driver [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 29 02:10:28 np0005539504 nova_compute[187152]: 2025-11-29 07:10:28.223 187156 DEBUG nova.privsep.utils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:10:28 np0005539504 nova_compute[187152]: 2025-11-29 07:10:28.224 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp7cb8gnv2/1e3e40c1a21149f3859021bf21016da3.delta /var/lib/nova/instances/snapshots/tmp7cb8gnv2/1e3e40c1a21149f3859021bf21016da3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:28 np0005539504 nova_compute[187152]: 2025-11-29 07:10:28.840 187156 DEBUG oslo_concurrency.processutils [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp7cb8gnv2/1e3e40c1a21149f3859021bf21016da3.delta /var/lib/nova/instances/snapshots/tmp7cb8gnv2/1e3e40c1a21149f3859021bf21016da3" returned: 0 in 0.616s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:28 np0005539504 nova_compute[187152]: 2025-11-29 07:10:28.847 187156 INFO nova.virt.libvirt.driver [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:10:30 np0005539504 nova_compute[187152]: 2025-11-29 07:10:30.075 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:30 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:30Z|00327|binding|INFO|Releasing lport 23ed72a2-c029-4964-a378-5eb188f1396a from this chassis (sb_readonly=0)
Nov 29 02:10:30 np0005539504 nova_compute[187152]: 2025-11-29 07:10:30.904 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:31Z|00328|binding|INFO|Releasing lport 23ed72a2-c029-4964-a378-5eb188f1396a from this chassis (sb_readonly=0)
Nov 29 02:10:31 np0005539504 nova_compute[187152]: 2025-11-29 07:10:31.063 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:31 np0005539504 nova_compute[187152]: 2025-11-29 07:10:31.250 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400216.2481172, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:31 np0005539504 nova_compute[187152]: 2025-11-29 07:10:31.251 187156 INFO nova.compute.manager [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:10:31 np0005539504 podman[229035]: 2025-11-29 07:10:31.760436674 +0000 UTC m=+0.097016230 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 02:10:31 np0005539504 nova_compute[187152]: 2025-11-29 07:10:31.928 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:32 np0005539504 nova_compute[187152]: 2025-11-29 07:10:32.004 187156 DEBUG nova.compute.manager [None req-ac4a0331-6c0c-488b-a08d-22f1f8ce8972 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:32 np0005539504 nova_compute[187152]: 2025-11-29 07:10:32.404 187156 INFO nova.virt.libvirt.driver [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Snapshot image upload complete#033[00m
Nov 29 02:10:32 np0005539504 nova_compute[187152]: 2025-11-29 07:10:32.405 187156 INFO nova.compute.manager [None req-1b88aaee-694e-43d7-a8b7-31c773a0557f b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Took 5.57 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.428 187156 DEBUG oslo_concurrency.lockutils [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.429 187156 DEBUG oslo_concurrency.lockutils [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.429 187156 DEBUG oslo_concurrency.lockutils [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.429 187156 DEBUG oslo_concurrency.lockutils [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.429 187156 DEBUG oslo_concurrency.lockutils [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.440 187156 INFO nova.compute.manager [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Terminating instance#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.449 187156 DEBUG nova.compute.manager [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:10:35 np0005539504 kernel: tap0f21aa98-09 (unregistering): left promiscuous mode
Nov 29 02:10:35 np0005539504 NetworkManager[55210]: <info>  [1764400235.4691] device (tap0f21aa98-09): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:10:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:35Z|00329|binding|INFO|Releasing lport 0f21aa98-0955-47ce-8d57-57a53b967dbd from this chassis (sb_readonly=0)
Nov 29 02:10:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:35Z|00330|binding|INFO|Setting lport 0f21aa98-0955-47ce-8d57-57a53b967dbd down in Southbound
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.476 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:10:35Z|00331|binding|INFO|Removing iface tap0f21aa98-09 ovn-installed in OVS
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.485 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:ae:b2 10.100.0.8'], port_security=['fa:16:3e:18:ae:b2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '26775c5b-f8ee-4576-b5c1-49f7cefff38c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31c17588-ffc8-47d5-a31a-8aad693c067c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd35b30df459c452a9c28cede8ac3666b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9306e0a1-6a16-462f-82a5-9784e972844b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c650f9d7-64cc-4240-ae67-25d102dd42ea, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=0f21aa98-0955-47ce-8d57-57a53b967dbd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.486 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 0f21aa98-0955-47ce-8d57-57a53b967dbd in datapath 31c17588-ffc8-47d5-a31a-8aad693c067c unbound from our chassis#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.487 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31c17588-ffc8-47d5-a31a-8aad693c067c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.488 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[15431b06-c1be-49d4-9323-f8cdc40087af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.489 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c namespace which is not needed anymore#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.494 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:35 np0005539504 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Nov 29 02:10:35 np0005539504 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005c.scope: Consumed 14.184s CPU time.
Nov 29 02:10:35 np0005539504 systemd-machined[153423]: Machine qemu-45-instance-0000005c terminated.
Nov 29 02:10:35 np0005539504 neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c[228582]: [NOTICE]   (228586) : haproxy version is 2.8.14-c23fe91
Nov 29 02:10:35 np0005539504 neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c[228582]: [NOTICE]   (228586) : path to executable is /usr/sbin/haproxy
Nov 29 02:10:35 np0005539504 neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c[228582]: [WARNING]  (228586) : Exiting Master process...
Nov 29 02:10:35 np0005539504 neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c[228582]: [WARNING]  (228586) : Exiting Master process...
Nov 29 02:10:35 np0005539504 neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c[228582]: [ALERT]    (228586) : Current worker (228588) exited with code 143 (Terminated)
Nov 29 02:10:35 np0005539504 neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c[228582]: [WARNING]  (228586) : All workers exited. Exiting... (0)
Nov 29 02:10:35 np0005539504 systemd[1]: libpod-22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b.scope: Deactivated successfully.
Nov 29 02:10:35 np0005539504 podman[229080]: 2025-11-29 07:10:35.638560649 +0000 UTC m=+0.047727796 container died 22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:10:35 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b-userdata-shm.mount: Deactivated successfully.
Nov 29 02:10:35 np0005539504 systemd[1]: var-lib-containers-storage-overlay-e401dbe0f2ff0b8162cc3f4f85743f1eaf2fe68f2405e93b5b64e427056ded8a-merged.mount: Deactivated successfully.
Nov 29 02:10:35 np0005539504 podman[229080]: 2025-11-29 07:10:35.686122649 +0000 UTC m=+0.095289786 container cleanup 22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:10:35 np0005539504 systemd[1]: libpod-conmon-22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b.scope: Deactivated successfully.
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.721 187156 INFO nova.virt.libvirt.driver [-] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Instance destroyed successfully.#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.721 187156 DEBUG nova.objects.instance [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lazy-loading 'resources' on Instance uuid 26775c5b-f8ee-4576-b5c1-49f7cefff38c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.736 187156 DEBUG nova.virt.libvirt.vif [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:09:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1290812695',display_name='tempest-ImagesOneServerTestJSON-server-1290812695',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1290812695',id=92,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d35b30df459c452a9c28cede8ac3666b',ramdisk_id='',reservation_id='r-n212rgr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-1137583721',owner_user_name='tempest-ImagesOneServerTestJSON-1137583721-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:10:32Z,user_data=None,user_id='b3503486e3f043388733bf6cbb6c4521',uuid=26775c5b-f8ee-4576-b5c1-49f7cefff38c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.737 187156 DEBUG nova.network.os_vif_util [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Converting VIF {"id": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "address": "fa:16:3e:18:ae:b2", "network": {"id": "31c17588-ffc8-47d5-a31a-8aad693c067c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1879386847-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d35b30df459c452a9c28cede8ac3666b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f21aa98-09", "ovs_interfaceid": "0f21aa98-0955-47ce-8d57-57a53b967dbd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.738 187156 DEBUG nova.network.os_vif_util [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:ae:b2,bridge_name='br-int',has_traffic_filtering=True,id=0f21aa98-0955-47ce-8d57-57a53b967dbd,network=Network(31c17588-ffc8-47d5-a31a-8aad693c067c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f21aa98-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.739 187156 DEBUG os_vif [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:ae:b2,bridge_name='br-int',has_traffic_filtering=True,id=0f21aa98-0955-47ce-8d57-57a53b967dbd,network=Network(31c17588-ffc8-47d5-a31a-8aad693c067c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f21aa98-09') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.741 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.742 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f21aa98-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.743 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.745 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.748 187156 INFO os_vif [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:ae:b2,bridge_name='br-int',has_traffic_filtering=True,id=0f21aa98-0955-47ce-8d57-57a53b967dbd,network=Network(31c17588-ffc8-47d5-a31a-8aad693c067c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f21aa98-09')#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.748 187156 INFO nova.virt.libvirt.driver [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Deleting instance files /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c_del#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.749 187156 INFO nova.virt.libvirt.driver [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Deletion of /var/lib/nova/instances/26775c5b-f8ee-4576-b5c1-49f7cefff38c_del complete#033[00m
Nov 29 02:10:35 np0005539504 podman[229119]: 2025-11-29 07:10:35.897043394 +0000 UTC m=+0.185911940 container remove 22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.906 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f05b0cf8-2100-4302-b451-2b31afdb4f64]: (4, ('Sat Nov 29 07:10:35 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c (22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b)\n22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b\nSat Nov 29 07:10:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c (22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b)\n22bfcea26957eb080ae5ef9208e847c5bee66e3c944c2e5e5ee0eeb5ac006d7b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.908 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5c22ec84-7ca9-4584-abf8-f17448dba351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.910 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31c17588-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:35 np0005539504 kernel: tap31c17588-f0: left promiscuous mode
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.943 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.947 187156 INFO nova.compute.manager [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.948 187156 DEBUG oslo.service.loopingcall [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.948 187156 DEBUG nova.compute.manager [-] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.948 187156 DEBUG nova.network.neutron [-] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:10:35 np0005539504 nova_compute[187152]: 2025-11-29 07:10:35.958 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.961 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[db75d11b-a06e-413f-b35a-034e78b4052e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.981 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2205ba82-d166-45ec-aded-e67b293f479e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.983 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[48c07176-f75c-4616-b2df-8aed35d0f804]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:35.997 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[19937ef8-f4f9-4151-96dc-118a4d19223f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 566532, 'reachable_time': 23188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229138, 'error': None, 'target': 'ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:36.000 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-31c17588-ffc8-47d5-a31a-8aad693c067c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:10:36 np0005539504 systemd[1]: run-netns-ovnmeta\x2d31c17588\x2dffc8\x2d47d5\x2da31a\x2d8aad693c067c.mount: Deactivated successfully.
Nov 29 02:10:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:10:36.000 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fc3674-5f7e-4605-8cdf-7945394caa59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.179 187156 DEBUG nova.compute.manager [req-867f70f9-a873-4482-a6d9-967b4ad1bc37 req-d6597eeb-595b-4912-a885-9bfce4973942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Received event network-vif-unplugged-0f21aa98-0955-47ce-8d57-57a53b967dbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.180 187156 DEBUG oslo_concurrency.lockutils [req-867f70f9-a873-4482-a6d9-967b4ad1bc37 req-d6597eeb-595b-4912-a885-9bfce4973942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.180 187156 DEBUG oslo_concurrency.lockutils [req-867f70f9-a873-4482-a6d9-967b4ad1bc37 req-d6597eeb-595b-4912-a885-9bfce4973942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.180 187156 DEBUG oslo_concurrency.lockutils [req-867f70f9-a873-4482-a6d9-967b4ad1bc37 req-d6597eeb-595b-4912-a885-9bfce4973942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.180 187156 DEBUG nova.compute.manager [req-867f70f9-a873-4482-a6d9-967b4ad1bc37 req-d6597eeb-595b-4912-a885-9bfce4973942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] No waiting events found dispatching network-vif-unplugged-0f21aa98-0955-47ce-8d57-57a53b967dbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.180 187156 DEBUG nova.compute.manager [req-867f70f9-a873-4482-a6d9-967b4ad1bc37 req-d6597eeb-595b-4912-a885-9bfce4973942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Received event network-vif-unplugged-0f21aa98-0955-47ce-8d57-57a53b967dbd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.553 187156 DEBUG nova.network.neutron [-] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.572 187156 INFO nova.compute.manager [-] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Took 0.62 seconds to deallocate network for instance.#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.632 187156 DEBUG nova.compute.manager [req-92dcca3d-9bc9-49e7-82dd-545ab03023cb req-da59215b-056d-472e-a453-76eb7744f9b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Received event network-vif-deleted-0f21aa98-0955-47ce-8d57-57a53b967dbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.653 187156 DEBUG oslo_concurrency.lockutils [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.653 187156 DEBUG oslo_concurrency.lockutils [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.742 187156 DEBUG nova.compute.provider_tree [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.766 187156 DEBUG nova.scheduler.client.report [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.803 187156 DEBUG oslo_concurrency.lockutils [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.844 187156 INFO nova.scheduler.client.report [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Deleted allocations for instance 26775c5b-f8ee-4576-b5c1-49f7cefff38c#033[00m
Nov 29 02:10:36 np0005539504 nova_compute[187152]: 2025-11-29 07:10:36.937 187156 DEBUG oslo_concurrency.lockutils [None req-bd77ab61-1929-43ad-8498-c90a4af752a9 b3503486e3f043388733bf6cbb6c4521 d35b30df459c452a9c28cede8ac3666b - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:38 np0005539504 podman[229139]: 2025-11-29 07:10:38.731553396 +0000 UTC m=+0.072221669 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:39 np0005539504 nova_compute[187152]: 2025-11-29 07:10:39.826 187156 DEBUG nova.compute.manager [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Received event network-vif-plugged-0f21aa98-0955-47ce-8d57-57a53b967dbd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:39 np0005539504 nova_compute[187152]: 2025-11-29 07:10:39.826 187156 DEBUG oslo_concurrency.lockutils [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:39 np0005539504 nova_compute[187152]: 2025-11-29 07:10:39.826 187156 DEBUG oslo_concurrency.lockutils [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:39 np0005539504 nova_compute[187152]: 2025-11-29 07:10:39.826 187156 DEBUG oslo_concurrency.lockutils [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "26775c5b-f8ee-4576-b5c1-49f7cefff38c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:39 np0005539504 nova_compute[187152]: 2025-11-29 07:10:39.826 187156 DEBUG nova.compute.manager [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] No waiting events found dispatching network-vif-plugged-0f21aa98-0955-47ce-8d57-57a53b967dbd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:39 np0005539504 nova_compute[187152]: 2025-11-29 07:10:39.827 187156 WARNING nova.compute.manager [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Received unexpected event network-vif-plugged-0f21aa98-0955-47ce-8d57-57a53b967dbd for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:10:40 np0005539504 nova_compute[187152]: 2025-11-29 07:10:40.080 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:40 np0005539504 nova_compute[187152]: 2025-11-29 07:10:40.744 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:43 np0005539504 nova_compute[187152]: 2025-11-29 07:10:43.334 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:45 np0005539504 nova_compute[187152]: 2025-11-29 07:10:45.083 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:45 np0005539504 nova_compute[187152]: 2025-11-29 07:10:45.746 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:10:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539504 nova_compute[187152]: 2025-11-29 07:10:48.153 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:50 np0005539504 nova_compute[187152]: 2025-11-29 07:10:50.086 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:50 np0005539504 nova_compute[187152]: 2025-11-29 07:10:50.719 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400235.718616, 26775c5b-f8ee-4576-b5c1-49f7cefff38c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:50 np0005539504 nova_compute[187152]: 2025-11-29 07:10:50.720 187156 INFO nova.compute.manager [-] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:10:50 np0005539504 nova_compute[187152]: 2025-11-29 07:10:50.748 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:51 np0005539504 podman[229160]: 2025-11-29 07:10:51.721576803 +0000 UTC m=+0.053650886 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:10:51 np0005539504 podman[229161]: 2025-11-29 07:10:51.732322496 +0000 UTC m=+0.060111780 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal)
Nov 29 02:10:52 np0005539504 nova_compute[187152]: 2025-11-29 07:10:52.070 187156 DEBUG nova.compute.manager [None req-7cfe4d97-966e-4734-9ac7-d37fc61b4a16 - - - - - -] [instance: 26775c5b-f8ee-4576-b5c1-49f7cefff38c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:52 np0005539504 podman[229204]: 2025-11-29 07:10:52.728487555 +0000 UTC m=+0.063810653 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:54 np0005539504 nova_compute[187152]: 2025-11-29 07:10:54.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:54 np0005539504 nova_compute[187152]: 2025-11-29 07:10:54.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:55 np0005539504 nova_compute[187152]: 2025-11-29 07:10:55.088 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:55 np0005539504 podman[229223]: 2025-11-29 07:10:55.70855266 +0000 UTC m=+0.053123712 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:10:55 np0005539504 nova_compute[187152]: 2025-11-29 07:10:55.749 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:55 np0005539504 podman[229224]: 2025-11-29 07:10:55.770139547 +0000 UTC m=+0.099579364 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 02:10:55 np0005539504 nova_compute[187152]: 2025-11-29 07:10:55.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:55 np0005539504 nova_compute[187152]: 2025-11-29 07:10:55.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:57 np0005539504 nova_compute[187152]: 2025-11-29 07:10:57.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:57 np0005539504 nova_compute[187152]: 2025-11-29 07:10:57.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:10:59 np0005539504 nova_compute[187152]: 2025-11-29 07:10:59.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:59 np0005539504 nova_compute[187152]: 2025-11-29 07:10:59.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:10:59 np0005539504 nova_compute[187152]: 2025-11-29 07:10:59.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:11:00 np0005539504 nova_compute[187152]: 2025-11-29 07:11:00.089 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:00 np0005539504 nova_compute[187152]: 2025-11-29 07:11:00.750 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:02 np0005539504 podman[229272]: 2025-11-29 07:11:02.725767148 +0000 UTC m=+0.066735029 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.078 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.079 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.079 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.152 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.152 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.152 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.153 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.316 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.317 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5723MB free_disk=73.19313049316406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.318 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.318 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.509 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.510 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.551 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.576 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.612 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:11:04 np0005539504 nova_compute[187152]: 2025-11-29 07:11:04.612 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:11:05 np0005539504 nova_compute[187152]: 2025-11-29 07:11:05.091 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:11:05.742 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:11:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:11:05.743 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:11:05 np0005539504 nova_compute[187152]: 2025-11-29 07:11:05.743 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:05 np0005539504 nova_compute[187152]: 2025-11-29 07:11:05.752 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:09 np0005539504 podman[229293]: 2025-11-29 07:11:09.739871486 +0000 UTC m=+0.076329742 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:11:10 np0005539504 nova_compute[187152]: 2025-11-29 07:11:10.096 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:11:10.747 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:11:10 np0005539504 nova_compute[187152]: 2025-11-29 07:11:10.754 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:15 np0005539504 nova_compute[187152]: 2025-11-29 07:11:15.149 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:15 np0005539504 nova_compute[187152]: 2025-11-29 07:11:15.757 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:20 np0005539504 nova_compute[187152]: 2025-11-29 07:11:20.150 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:20 np0005539504 nova_compute[187152]: 2025-11-29 07:11:20.759 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:22 np0005539504 podman[229313]: 2025-11-29 07:11:22.733057253 +0000 UTC m=+0.064242485 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:11:22 np0005539504 podman[229314]: 2025-11-29 07:11:22.742039721 +0000 UTC m=+0.073222123 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Nov 29 02:11:22 np0005539504 podman[229358]: 2025-11-29 07:11:22.824501609 +0000 UTC m=+0.058552230 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:11:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:11:22.928 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:11:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:11:22.929 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:11:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:11:22.929 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:11:25 np0005539504 nova_compute[187152]: 2025-11-29 07:11:25.152 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:25 np0005539504 nova_compute[187152]: 2025-11-29 07:11:25.762 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:26 np0005539504 podman[229376]: 2025-11-29 07:11:26.700265387 +0000 UTC m=+0.045806106 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:11:26 np0005539504 podman[229377]: 2025-11-29 07:11:26.728894666 +0000 UTC m=+0.069048067 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:11:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:11:28Z|00332|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 29 02:11:30 np0005539504 nova_compute[187152]: 2025-11-29 07:11:30.153 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:30 np0005539504 nova_compute[187152]: 2025-11-29 07:11:30.764 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:33 np0005539504 podman[229427]: 2025-11-29 07:11:33.724178075 +0000 UTC m=+0.064698747 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:11:35 np0005539504 nova_compute[187152]: 2025-11-29 07:11:35.154 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:35 np0005539504 nova_compute[187152]: 2025-11-29 07:11:35.765 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:40 np0005539504 nova_compute[187152]: 2025-11-29 07:11:40.157 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:40 np0005539504 podman[229447]: 2025-11-29 07:11:40.730424673 +0000 UTC m=+0.065970459 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:11:40 np0005539504 nova_compute[187152]: 2025-11-29 07:11:40.767 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:42 np0005539504 nova_compute[187152]: 2025-11-29 07:11:42.779 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:11:42 np0005539504 nova_compute[187152]: 2025-11-29 07:11:42.780 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:11:44 np0005539504 nova_compute[187152]: 2025-11-29 07:11:44.644 187156 DEBUG nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 02:11:45 np0005539504 nova_compute[187152]: 2025-11-29 07:11:45.158 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:45 np0005539504 nova_compute[187152]: 2025-11-29 07:11:45.770 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:47 np0005539504 nova_compute[187152]: 2025-11-29 07:11:47.471 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:11:50 np0005539504 nova_compute[187152]: 2025-11-29 07:11:50.161 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:50 np0005539504 nova_compute[187152]: 2025-11-29 07:11:50.772 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:53 np0005539504 podman[229469]: 2025-11-29 07:11:53.738983082 +0000 UTC m=+0.062856380 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:11:53 np0005539504 podman[229468]: 2025-11-29 07:11:53.743607319 +0000 UTC m=+0.087349012 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:11:53 np0005539504 podman[229470]: 2025-11-29 07:11:53.746303088 +0000 UTC m=+0.058682284 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:11:55 np0005539504 nova_compute[187152]: 2025-11-29 07:11:55.164 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:55 np0005539504 nova_compute[187152]: 2025-11-29 07:11:55.788 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:11:55 np0005539504 nova_compute[187152]: 2025-11-29 07:11:55.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:11:55 np0005539504 nova_compute[187152]: 2025-11-29 07:11:55.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:11:56 np0005539504 nova_compute[187152]: 2025-11-29 07:11:56.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:11:57 np0005539504 podman[229529]: 2025-11-29 07:11:57.706474714 +0000 UTC m=+0.048547157 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:11:57 np0005539504 podman[229530]: 2025-11-29 07:11:57.773440976 +0000 UTC m=+0.106572731 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:11:57 np0005539504 nova_compute[187152]: 2025-11-29 07:11:57.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:11:57 np0005539504 nova_compute[187152]: 2025-11-29 07:11:57.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:11:57 np0005539504 nova_compute[187152]: 2025-11-29 07:11:57.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 02:11:59 np0005539504 nova_compute[187152]: 2025-11-29 07:11:59.028 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:11:59 np0005539504 nova_compute[187152]: 2025-11-29 07:11:59.029 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:11:59 np0005539504 nova_compute[187152]: 2025-11-29 07:11:59.046 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 02:11:59 np0005539504 nova_compute[187152]: 2025-11-29 07:11:59.046 187156 INFO nova.compute.claims [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Claim successful on node compute-1.ctlplane.example.com
Nov 29 02:12:00 np0005539504 nova_compute[187152]: 2025-11-29 07:12:00.165 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:12:00 np0005539504 nova_compute[187152]: 2025-11-29 07:12:00.519 187156 DEBUG nova.compute.provider_tree [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:12:00 np0005539504 nova_compute[187152]: 2025-11-29 07:12:00.580 187156 DEBUG nova.scheduler.client.report [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:12:00 np0005539504 nova_compute[187152]: 2025-11-29 07:12:00.791 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:12:00 np0005539504 nova_compute[187152]: 2025-11-29 07:12:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:12:01 np0005539504 nova_compute[187152]: 2025-11-29 07:12:01.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:12:01 np0005539504 nova_compute[187152]: 2025-11-29 07:12:01.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 02:12:01 np0005539504 nova_compute[187152]: 2025-11-29 07:12:01.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 02:12:04 np0005539504 podman[229579]: 2025-11-29 07:12:04.706624596 +0000 UTC m=+0.055007970 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm)
Nov 29 02:12:04 np0005539504 nova_compute[187152]: 2025-11-29 07:12:04.752 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:12:04 np0005539504 nova_compute[187152]: 2025-11-29 07:12:04.752 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:12:04 np0005539504 nova_compute[187152]: 2025-11-29 07:12:04.753 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:05 np0005539504 nova_compute[187152]: 2025-11-29 07:12:05.199 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:05 np0005539504 nova_compute[187152]: 2025-11-29 07:12:05.794 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:09 np0005539504 nova_compute[187152]: 2025-11-29 07:12:09.454 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 10.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:09 np0005539504 nova_compute[187152]: 2025-11-29 07:12:09.455 187156 DEBUG nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:12:10 np0005539504 nova_compute[187152]: 2025-11-29 07:12:10.200 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:10 np0005539504 nova_compute[187152]: 2025-11-29 07:12:10.795 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:11 np0005539504 podman[229600]: 2025-11-29 07:12:11.704121332 +0000 UTC m=+0.053785129 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:12:12 np0005539504 nova_compute[187152]: 2025-11-29 07:12:12.174 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:12 np0005539504 nova_compute[187152]: 2025-11-29 07:12:12.174 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:15 np0005539504 nova_compute[187152]: 2025-11-29 07:12:15.204 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:15 np0005539504 nova_compute[187152]: 2025-11-29 07:12:15.798 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:20 np0005539504 nova_compute[187152]: 2025-11-29 07:12:20.206 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:20 np0005539504 nova_compute[187152]: 2025-11-29 07:12:20.801 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:22.929 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:22.930 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:22.930 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:23 np0005539504 nova_compute[187152]: 2025-11-29 07:12:23.734 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:23 np0005539504 nova_compute[187152]: 2025-11-29 07:12:23.734 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:23 np0005539504 nova_compute[187152]: 2025-11-29 07:12:23.735 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:23 np0005539504 nova_compute[187152]: 2025-11-29 07:12:23.735 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:12:23 np0005539504 nova_compute[187152]: 2025-11-29 07:12:23.890 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:12:23 np0005539504 nova_compute[187152]: 2025-11-29 07:12:23.892 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5741MB free_disk=73.19313049316406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:12:23 np0005539504 nova_compute[187152]: 2025-11-29 07:12:23.892 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:23 np0005539504 nova_compute[187152]: 2025-11-29 07:12:23.892 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.054 187156 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 4.83 sec#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.467 187156 DEBUG nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.467 187156 DEBUG nova.network.neutron [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.538 187156 DEBUG nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.552 187156 INFO nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.600 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 5818027f-a5b1-465a-a6e2-f0c8f0de8154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.629 187156 DEBUG nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.634 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 084a0f8e-19b7-4b24-a503-c015b26addbc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.635 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.635 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:12:24 np0005539504 podman[229622]: 2025-11-29 07:12:24.716463153 +0000 UTC m=+0.059354140 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:12:24 np0005539504 podman[229624]: 2025-11-29 07:12:24.723521743 +0000 UTC m=+0.057341129 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 02:12:24 np0005539504 podman[229623]: 2025-11-29 07:12:24.723652237 +0000 UTC m=+0.062586963 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 29 02:12:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:24.762 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:12:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:24.764 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.764 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.841 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:24 np0005539504 nova_compute[187152]: 2025-11-29 07:12:24.935 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.207 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.248 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.400 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.400 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.401 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.401 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.405 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.405 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.409 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.409 187156 INFO nova.compute.claims [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.455 187156 DEBUG nova.policy [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.509 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.510 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.510 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.527 187156 DEBUG nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.528 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.528 187156 INFO nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Creating image(s)#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.529 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.529 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.530 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.543 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.568 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.608 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.609 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.610 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.624 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.685 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.686 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.736 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.737 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.737 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.801 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.803 187156 DEBUG nova.virt.disk.api [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.803 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.823 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.830 187156 DEBUG nova.compute.provider_tree [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.857 187156 DEBUG nova.scheduler.client.report [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.867 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.868 187156 DEBUG nova.virt.disk.api [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.868 187156 DEBUG nova.objects.instance [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.901 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.902 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Ensure instance console log exists: /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.902 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.902 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.903 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.908 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.507s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:25 np0005539504 nova_compute[187152]: 2025-11-29 07:12:25.909 187156 DEBUG nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:12:26 np0005539504 nova_compute[187152]: 2025-11-29 07:12:26.032 187156 DEBUG nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:12:26 np0005539504 nova_compute[187152]: 2025-11-29 07:12:26.034 187156 DEBUG nova.network.neutron [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:12:26 np0005539504 nova_compute[187152]: 2025-11-29 07:12:26.086 187156 INFO nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:12:26 np0005539504 nova_compute[187152]: 2025-11-29 07:12:26.121 187156 DEBUG nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:12:26 np0005539504 nova_compute[187152]: 2025-11-29 07:12:26.592 187156 DEBUG nova.policy [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:12:27 np0005539504 nova_compute[187152]: 2025-11-29 07:12:27.603 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:28 np0005539504 podman[229700]: 2025-11-29 07:12:28.720191687 +0000 UTC m=+0.066005490 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:12:28 np0005539504 podman[229701]: 2025-11-29 07:12:28.744160697 +0000 UTC m=+0.085679860 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.151 187156 DEBUG nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.152 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.152 187156 INFO nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Creating image(s)#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.153 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.153 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.154 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.170 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.235 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.236 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.236 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.251 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.317 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.319 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.381 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk 1073741824" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.382 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.383 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.440 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.441 187156 DEBUG nova.virt.disk.api [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Checking if we can resize image /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.442 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.505 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.506 187156 DEBUG nova.virt.disk.api [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Cannot resize image /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.506 187156 DEBUG nova.objects.instance [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'migration_context' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.675 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.676 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Ensure instance console log exists: /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.676 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.677 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:29 np0005539504 nova_compute[187152]: 2025-11-29 07:12:29.677 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:30 np0005539504 nova_compute[187152]: 2025-11-29 07:12:30.209 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:30.766 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:30 np0005539504 nova_compute[187152]: 2025-11-29 07:12:30.826 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:34 np0005539504 nova_compute[187152]: 2025-11-29 07:12:34.672 187156 DEBUG nova.network.neutron [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Successfully created port: 07a930ef-a036-4ddf-aa57-c5d56f77847c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:12:35 np0005539504 nova_compute[187152]: 2025-11-29 07:12:35.210 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:35 np0005539504 podman[229767]: 2025-11-29 07:12:35.725275477 +0000 UTC m=+0.068419732 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:12:35 np0005539504 nova_compute[187152]: 2025-11-29 07:12:35.849 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:37 np0005539504 nova_compute[187152]: 2025-11-29 07:12:37.156 187156 DEBUG nova.network.neutron [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Successfully created port: 60943dec-d420-449f-abc3-233df163ebed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:12:40 np0005539504 nova_compute[187152]: 2025-11-29 07:12:40.212 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:40 np0005539504 nova_compute[187152]: 2025-11-29 07:12:40.850 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:42 np0005539504 podman[229787]: 2025-11-29 07:12:42.711762302 +0000 UTC m=+0.059983527 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:12:43 np0005539504 nova_compute[187152]: 2025-11-29 07:12:43.654 187156 DEBUG nova.network.neutron [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Successfully updated port: 07a930ef-a036-4ddf-aa57-c5d56f77847c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.240 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.241 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.241 187156 DEBUG nova.network.neutron [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.290 187156 DEBUG nova.compute.manager [req-3654a7ee-cfaa-48ef-a0bb-46f39a319277 req-b31fbfa7-c633-4484-b340-39d2ff79621d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.291 187156 DEBUG nova.compute.manager [req-3654a7ee-cfaa-48ef-a0bb-46f39a319277 req-b31fbfa7-c633-4484-b340-39d2ff79621d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing instance network info cache due to event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.291 187156 DEBUG oslo_concurrency.lockutils [req-3654a7ee-cfaa-48ef-a0bb-46f39a319277 req-b31fbfa7-c633-4484-b340-39d2ff79621d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.462 187156 DEBUG nova.network.neutron [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Successfully updated port: 60943dec-d420-449f-abc3-233df163ebed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.670 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.670 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:12:44 np0005539504 nova_compute[187152]: 2025-11-29 07:12:44.671 187156 DEBUG nova.network.neutron [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:12:45 np0005539504 nova_compute[187152]: 2025-11-29 07:12:45.214 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:45 np0005539504 nova_compute[187152]: 2025-11-29 07:12:45.853 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:46 np0005539504 nova_compute[187152]: 2025-11-29 07:12:46.101 187156 DEBUG nova.compute.manager [req-94cbf4b8-eeda-46e1-8078-87c972dada76 req-6a000478-a662-40d6-9d6b-5f90b45c953d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-changed-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:12:46 np0005539504 nova_compute[187152]: 2025-11-29 07:12:46.102 187156 DEBUG nova.compute.manager [req-94cbf4b8-eeda-46e1-8078-87c972dada76 req-6a000478-a662-40d6-9d6b-5f90b45c953d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Refreshing instance network info cache due to event network-changed-60943dec-d420-449f-abc3-233df163ebed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:12:46 np0005539504 nova_compute[187152]: 2025-11-29 07:12:46.102 187156 DEBUG oslo_concurrency.lockutils [req-94cbf4b8-eeda-46e1-8078-87c972dada76 req-6a000478-a662-40d6-9d6b-5f90b45c953d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:12:47 np0005539504 nova_compute[187152]: 2025-11-29 07:12:47.097 187156 DEBUG nova.network.neutron [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:12:47 np0005539504 nova_compute[187152]: 2025-11-29 07:12:47.105 187156 DEBUG nova.network.neutron [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.965 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.966 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.967 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.968 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.968 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.968 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.968 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.968 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.968 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:12:47.969 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539504 nova_compute[187152]: 2025-11-29 07:12:48.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:50 np0005539504 nova_compute[187152]: 2025-11-29 07:12:50.216 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:50 np0005539504 nova_compute[187152]: 2025-11-29 07:12:50.884 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:51 np0005539504 nova_compute[187152]: 2025-11-29 07:12:51.640 187156 DEBUG nova.network.neutron [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:12:51 np0005539504 nova_compute[187152]: 2025-11-29 07:12:51.657 187156 DEBUG nova.network.neutron [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.412 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.413 187156 DEBUG nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance network_info: |[{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.414 187156 DEBUG oslo_concurrency.lockutils [req-94cbf4b8-eeda-46e1-8078-87c972dada76 req-6a000478-a662-40d6-9d6b-5f90b45c953d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.414 187156 DEBUG nova.network.neutron [req-94cbf4b8-eeda-46e1-8078-87c972dada76 req-6a000478-a662-40d6-9d6b-5f90b45c953d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Refreshing network info cache for port 60943dec-d420-449f-abc3-233df163ebed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.416 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Start _get_guest_xml network_info=[{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.420 187156 WARNING nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.433 187156 DEBUG nova.virt.libvirt.host [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.434 187156 DEBUG nova.virt.libvirt.host [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.437 187156 DEBUG nova.virt.libvirt.host [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.438 187156 DEBUG nova.virt.libvirt.host [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.439 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.439 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.440 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.440 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.440 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.440 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.441 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.441 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.441 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.441 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.442 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.442 187156 DEBUG nova.virt.hardware [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.446 187156 DEBUG nova.virt.libvirt.vif [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:12:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.446 187156 DEBUG nova.network.os_vif_util [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.447 187156 DEBUG nova.network.os_vif_util [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.448 187156 DEBUG nova.objects.instance [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'pci_devices' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.608 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.609 187156 DEBUG nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance network_info: |[{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.609 187156 DEBUG oslo_concurrency.lockutils [req-3654a7ee-cfaa-48ef-a0bb-46f39a319277 req-b31fbfa7-c633-4484-b340-39d2ff79621d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.610 187156 DEBUG nova.network.neutron [req-3654a7ee-cfaa-48ef-a0bb-46f39a319277 req-b31fbfa7-c633-4484-b340-39d2ff79621d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.612 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Start _get_guest_xml network_info=[{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.617 187156 WARNING nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.628 187156 DEBUG nova.virt.libvirt.host [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.629 187156 DEBUG nova.virt.libvirt.host [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.632 187156 DEBUG nova.virt.libvirt.host [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.632 187156 DEBUG nova.virt.libvirt.host [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.633 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.634 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.634 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.634 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.634 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.635 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.635 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.635 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.635 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.636 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.636 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.636 187156 DEBUG nova.virt.hardware [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.639 187156 DEBUG nova.virt.libvirt.vif [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-409239588',display_name='tempest-TestNetworkAdvancedServerOps-server-409239588',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-409239588',id=95,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGXK6HijxfcV9+fEMlQT2yR3VoX9Sz77Qk57Xkpwoye1FFlDLU8fY8cJvr+Q2fRauh1dlNIWCagiMxv7znT2NcZAvXyo+qqZudIr0NVBck3Lt9NyetTtYoJBqcrR4BWObg==',key_name='tempest-TestNetworkAdvancedServerOps-1900401721',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-6l59ck53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:12:24Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=5818027f-a5b1-465a-a6e2-f0c8f0de8154,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.640 187156 DEBUG nova.network.os_vif_util [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.640 187156 DEBUG nova.network.os_vif_util [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.641 187156 DEBUG nova.objects.instance [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.775 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <uuid>084a0f8e-19b7-4b24-a503-c015b26addbc</uuid>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <name>instance-00000060</name>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestOtherB-server-734207825</nova:name>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:12:52</nova:creationTime>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:user uuid="ee2d4931cb504b13b92a2f52c95c05ce">tempest-ServerActionsTestOtherB-1538648925-project-member</nova:user>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:project uuid="32e51e3a9a8f4a1ca6e022735ebf5f7b">tempest-ServerActionsTestOtherB-1538648925</nova:project>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:port uuid="60943dec-d420-449f-abc3-233df163ebed">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="serial">084a0f8e-19b7-4b24-a503-c015b26addbc</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="uuid">084a0f8e-19b7-4b24-a503-c015b26addbc</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:04:06:9e"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <target dev="tap60943dec-d4"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/console.log" append="off"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:12:52 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:12:52 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.784 187156 DEBUG nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Preparing to wait for external event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.785 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.786 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.786 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.788 187156 DEBUG nova.virt.libvirt.vif [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:12:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.788 187156 DEBUG nova.network.os_vif_util [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.790 187156 DEBUG nova.network.os_vif_util [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.791 187156 DEBUG os_vif [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.792 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.793 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.794 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.799 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.800 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60943dec-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.801 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60943dec-d4, col_values=(('external_ids', {'iface-id': '60943dec-d420-449f-abc3-233df163ebed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:06:9e', 'vm-uuid': '084a0f8e-19b7-4b24-a503-c015b26addbc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.804 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:52 np0005539504 NetworkManager[55210]: <info>  [1764400372.8069] manager: (tap60943dec-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.808 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.812 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.813 187156 INFO os_vif [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4')#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.908 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <uuid>5818027f-a5b1-465a-a6e2-f0c8f0de8154</uuid>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <name>instance-0000005f</name>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-409239588</nova:name>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:12:52</nova:creationTime>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        <nova:port uuid="07a930ef-a036-4ddf-aa57-c5d56f77847c">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="serial">5818027f-a5b1-465a-a6e2-f0c8f0de8154</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="uuid">5818027f-a5b1-465a-a6e2-f0c8f0de8154</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:c4:6c:2d"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <target dev="tap07a930ef-a0"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/console.log" append="off"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:12:52 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:12:52 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:12:52 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:12:52 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.909 187156 DEBUG nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Preparing to wait for external event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.909 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.910 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.910 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.910 187156 DEBUG nova.virt.libvirt.vif [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-409239588',display_name='tempest-TestNetworkAdvancedServerOps-server-409239588',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-409239588',id=95,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGXK6HijxfcV9+fEMlQT2yR3VoX9Sz77Qk57Xkpwoye1FFlDLU8fY8cJvr+Q2fRauh1dlNIWCagiMxv7znT2NcZAvXyo+qqZudIr0NVBck3Lt9NyetTtYoJBqcrR4BWObg==',key_name='tempest-TestNetworkAdvancedServerOps-1900401721',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-6l59ck53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:12:24Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=5818027f-a5b1-465a-a6e2-f0c8f0de8154,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.911 187156 DEBUG nova.network.os_vif_util [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.912 187156 DEBUG nova.network.os_vif_util [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.912 187156 DEBUG os_vif [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.912 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.913 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.913 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.918 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.918 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap07a930ef-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.919 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap07a930ef-a0, col_values=(('external_ids', {'iface-id': '07a930ef-a036-4ddf-aa57-c5d56f77847c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:6c:2d', 'vm-uuid': '5818027f-a5b1-465a-a6e2-f0c8f0de8154'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.920 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:52 np0005539504 NetworkManager[55210]: <info>  [1764400372.9214] manager: (tap07a930ef-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.922 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.928 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.929 187156 INFO os_vif [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0')#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.988 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.989 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.989 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No VIF found with MAC fa:16:3e:04:06:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:12:52 np0005539504 nova_compute[187152]: 2025-11-29 07:12:52.991 187156 INFO nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Using config drive#033[00m
Nov 29 02:12:53 np0005539504 nova_compute[187152]: 2025-11-29 07:12:53.210 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:12:53 np0005539504 nova_compute[187152]: 2025-11-29 07:12:53.211 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:12:53 np0005539504 nova_compute[187152]: 2025-11-29 07:12:53.212 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:c4:6c:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:12:53 np0005539504 nova_compute[187152]: 2025-11-29 07:12:53.213 187156 INFO nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Using config drive#033[00m
Nov 29 02:12:54 np0005539504 nova_compute[187152]: 2025-11-29 07:12:54.630 187156 INFO nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Creating config drive at /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config#033[00m
Nov 29 02:12:54 np0005539504 nova_compute[187152]: 2025-11-29 07:12:54.644 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpafqikcwc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:54 np0005539504 nova_compute[187152]: 2025-11-29 07:12:54.776 187156 DEBUG oslo_concurrency.processutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpafqikcwc" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:54 np0005539504 kernel: tap60943dec-d4: entered promiscuous mode
Nov 29 02:12:54 np0005539504 NetworkManager[55210]: <info>  [1764400374.8757] manager: (tap60943dec-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Nov 29 02:12:54 np0005539504 nova_compute[187152]: 2025-11-29 07:12:54.879 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:54Z|00333|binding|INFO|Claiming lport 60943dec-d420-449f-abc3-233df163ebed for this chassis.
Nov 29 02:12:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:54Z|00334|binding|INFO|60943dec-d420-449f-abc3-233df163ebed: Claiming fa:16:3e:04:06:9e 10.100.0.9
Nov 29 02:12:54 np0005539504 nova_compute[187152]: 2025-11-29 07:12:54.888 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:54 np0005539504 nova_compute[187152]: 2025-11-29 07:12:54.894 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:54 np0005539504 systemd-udevd[229868]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:12:54 np0005539504 podman[229823]: 2025-11-29 07:12:54.931751092 +0000 UTC m=+0.066126884 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Nov 29 02:12:54 np0005539504 NetworkManager[55210]: <info>  [1764400374.9338] device (tap60943dec-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:12:54 np0005539504 NetworkManager[55210]: <info>  [1764400374.9349] device (tap60943dec-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:12:54 np0005539504 systemd-machined[153423]: New machine qemu-47-instance-00000060.
Nov 29 02:12:54 np0005539504 nova_compute[187152]: 2025-11-29 07:12:54.948 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:54 np0005539504 podman[229822]: 2025-11-29 07:12:54.950071398 +0000 UTC m=+0.086628255 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:12:54 np0005539504 podman[229824]: 2025-11-29 07:12:54.9501723 +0000 UTC m=+0.076714932 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:12:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:54Z|00335|binding|INFO|Setting lport 60943dec-d420-449f-abc3-233df163ebed ovn-installed in OVS
Nov 29 02:12:54 np0005539504 systemd[1]: Started Virtual Machine qemu-47-instance-00000060.
Nov 29 02:12:54 np0005539504 nova_compute[187152]: 2025-11-29 07:12:54.953 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.217 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:55Z|00336|binding|INFO|Setting lport 60943dec-d420-449f-abc3-233df163ebed up in Southbound
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.237 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:06:9e 10.100.0.9'], port_security=['fa:16:3e:04:06:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '490b426d-026a-4a21-8c41-f013fe0c1458', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=60943dec-d420-449f-abc3-233df163ebed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.238 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 60943dec-d420-449f-abc3-233df163ebed in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 bound to our chassis#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.240 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df7cfc35-3f76-45b2-b70c-e4525d38f410#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.255 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2d12a779-f2d1-4985-995b-4a2c2a0a2819]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.257 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdf7cfc35-31 in ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.259 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdf7cfc35-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.260 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[114ba6c8-0d09-4fdc-9fd5-62fae77b8737]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.261 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf26c68-c1f6-4303-9671-a629f405c83b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.273 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[6ecdc0f6-2ab7-4bc2-ba14-a98d5869f1c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.286 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2f27d06e-e1b7-48f7-aaa0-80aa35651eb7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.306 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400375.305336, 084a0f8e-19b7-4b24-a503-c015b26addbc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.306 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] VM Started (Lifecycle Event)#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.319 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d7e3b5-0911-498a-8cbf-792003bdeb7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.325 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[23dac8a0-8a8b-42ac-aec5-09a16ffc5c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 NetworkManager[55210]: <info>  [1764400375.3266] manager: (tapdf7cfc35-30): new Veth device (/org/freedesktop/NetworkManager/Devices/159)
Nov 29 02:12:55 np0005539504 systemd-udevd[229884]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.355 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e9ed2ccb-ffe4-4ff4-9e8d-4af3362ba803]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.358 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1f878a28-6253-4fad-890c-bef9f6490705]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 NetworkManager[55210]: <info>  [1764400375.3818] device (tapdf7cfc35-30): carrier: link connected
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.385 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef85515-0bf6-481f-8190-61852104e8cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.403 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dff79808-ae8b-425b-94e7-f6857482c650]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583826, 'reachable_time': 28400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229939, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.404 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.408 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400375.3055193, 084a0f8e-19b7-4b24-a503-c015b26addbc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.409 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.422 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[db3dce00-49e1-4086-b158-f121b360ee5f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:aeb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583826, 'tstamp': 583826}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229940, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.433 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.436 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.441 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[da53f45e-2158-4afb-b228-1dce7640881c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583826, 'reachable_time': 28400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229941, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.473 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0d03113c-85d6-4a9f-bdab-4920538bdda3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.485 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.531 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[14f01cb8-878a-461c-af42-ffe81d46612b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.533 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.533 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.533 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf7cfc35-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.535 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 NetworkManager[55210]: <info>  [1764400375.5359] manager: (tapdf7cfc35-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Nov 29 02:12:55 np0005539504 kernel: tapdf7cfc35-30: entered promiscuous mode
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.540 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.541 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf7cfc35-30, col_values=(('external_ids', {'iface-id': 'cab31803-36dd-4107-bb9e-3d36862142c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.542 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:55Z|00337|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.554 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.558 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.558 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.559 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb1a272-fd4d-48c2-ba8c-caea7dc32182]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.561 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.563 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'env', 'PROCESS_TAG=haproxy-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/df7cfc35-3f76-45b2-b70c-e4525d38f410.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.632 187156 INFO nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Creating config drive at /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.637 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps9awl3ez execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.763 187156 DEBUG oslo_concurrency.processutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps9awl3ez" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:12:55 np0005539504 kernel: tap07a930ef-a0: entered promiscuous mode
Nov 29 02:12:55 np0005539504 systemd-udevd[229924]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:12:55 np0005539504 NetworkManager[55210]: <info>  [1764400375.8932] manager: (tap07a930ef-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/161)
Nov 29 02:12:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:55Z|00338|binding|INFO|Claiming lport 07a930ef-a036-4ddf-aa57-c5d56f77847c for this chassis.
Nov 29 02:12:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:55Z|00339|binding|INFO|07a930ef-a036-4ddf-aa57-c5d56f77847c: Claiming fa:16:3e:c4:6c:2d 10.100.0.13
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.892 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.894 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.897 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 NetworkManager[55210]: <info>  [1764400375.9021] device (tap07a930ef-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:12:55 np0005539504 NetworkManager[55210]: <info>  [1764400375.9030] device (tap07a930ef-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:12:55 np0005539504 systemd-machined[153423]: New machine qemu-48-instance-0000005f.
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:55.935 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:6c:2d 10.100.0.13'], port_security=['fa:16:3e:c4:6c:2d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c188a1f4-7511-4259-992e-c9127e6a414b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ede51bf8-0086-4a77-b4a9-badf8936b8c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aab533cd-f26a-47b5-9334-c93bf39572b9, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=07a930ef-a036-4ddf-aa57-c5d56f77847c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:55Z|00340|binding|INFO|Setting lport 07a930ef-a036-4ddf-aa57-c5d56f77847c ovn-installed in OVS
Nov 29 02:12:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:55Z|00341|binding|INFO|Setting lport 07a930ef-a036-4ddf-aa57-c5d56f77847c up in Southbound
Nov 29 02:12:55 np0005539504 nova_compute[187152]: 2025-11-29 07:12:55.951 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539504 systemd[1]: Started Virtual Machine qemu-48-instance-0000005f.
Nov 29 02:12:56 np0005539504 podman[229992]: 2025-11-29 07:12:56.046640521 +0000 UTC m=+0.064131532 container create 6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:12:56 np0005539504 systemd[1]: Started libpod-conmon-6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41.scope.
Nov 29 02:12:56 np0005539504 podman[229992]: 2025-11-29 07:12:56.00609855 +0000 UTC m=+0.023589591 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:12:56 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:12:56 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6694a130c68b4b21cb579e9875e88879f81cc789d53f83d70260cbdc6cce4cca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:12:56 np0005539504 podman[229992]: 2025-11-29 07:12:56.148085101 +0000 UTC m=+0.165576132 container init 6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:12:56 np0005539504 podman[229992]: 2025-11-29 07:12:56.153905849 +0000 UTC m=+0.171396860 container start 6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 02:12:56 np0005539504 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230013]: [NOTICE]   (230017) : New worker (230019) forked
Nov 29 02:12:56 np0005539504 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230013]: [NOTICE]   (230017) : Loading success.
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.224 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 07a930ef-a036-4ddf-aa57-c5d56f77847c in datapath c188a1f4-7511-4259-992e-c9127e6a414b unbound from our chassis#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.228 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c188a1f4-7511-4259-992e-c9127e6a414b#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.239 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2a5321b8-e4bd-47df-a2e3-7648d19718fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.241 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc188a1f4-71 in ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.244 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc188a1f4-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.244 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cff3d9d3-9d4f-495d-b872-6c9e3326a5c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.245 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fff62083-c083-438b-a266-b12fd4f18524]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.260 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[2d2a797d-5a56-4366-b52c-4d66f9a931fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.277 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dfadaac0-9a51-439c-9ba6-3029d283c448]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.311 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[8a21f0cf-ce1e-4c8a-9b06-03bf4ff5fde2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 NetworkManager[55210]: <info>  [1764400376.3213] manager: (tapc188a1f4-70): new Veth device (/org/freedesktop/NetworkManager/Devices/162)
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.320 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4f02c628-aad4-4709-987e-c2f7baab9187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.357 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c2971443-013d-468e-8e03-face7b7800db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.362 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[98c7f6d6-b686-4afd-8431-a24f4103e5f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.365 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400376.364603, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.365 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Started (Lifecycle Event)#033[00m
Nov 29 02:12:56 np0005539504 NetworkManager[55210]: <info>  [1764400376.3850] device (tapc188a1f4-70): carrier: link connected
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.389 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4da3b79b-34be-401e-b34c-b8757899977e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.402 187156 DEBUG nova.compute.manager [req-6e287f23-517d-4c4a-abb3-2c3fdf8f680d req-3705fb14-658e-436b-a9e6-64e5d842bc55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.403 187156 DEBUG oslo_concurrency.lockutils [req-6e287f23-517d-4c4a-abb3-2c3fdf8f680d req-3705fb14-658e-436b-a9e6-64e5d842bc55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.403 187156 DEBUG oslo_concurrency.lockutils [req-6e287f23-517d-4c4a-abb3-2c3fdf8f680d req-3705fb14-658e-436b-a9e6-64e5d842bc55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.403 187156 DEBUG oslo_concurrency.lockutils [req-6e287f23-517d-4c4a-abb3-2c3fdf8f680d req-3705fb14-658e-436b-a9e6-64e5d842bc55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.404 187156 DEBUG nova.compute.manager [req-6e287f23-517d-4c4a-abb3-2c3fdf8f680d req-3705fb14-658e-436b-a9e6-64e5d842bc55 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Processing event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.406 187156 DEBUG nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.408 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7c541674-ba44-4a39-aa4d-f3df4ec22570]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc188a1f4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:0a:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583926, 'reachable_time': 35308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230045, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.411 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.414 187156 INFO nova.virt.libvirt.driver [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance spawned successfully.#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.414 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.427 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[43c9d837-cf9b-4efd-814a-65d70b0e2a25]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:a87'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583926, 'tstamp': 583926}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230046, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.443 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e194534d-2b90-48fd-8e4b-f3696cdb4d5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc188a1f4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:0a:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583926, 'reachable_time': 35308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230047, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.463 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.471 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400376.3648417, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.471 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.477 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[696b0424-a59e-451f-b6f7-e33539d773de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.493 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.497 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.522 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.522 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.523 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.523 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.523 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.524 187156 DEBUG nova.virt.libvirt.driver [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.546 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[76313ea1-53fe-409e-9499-ca8e8c060f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.548 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc188a1f4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.548 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.549 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc188a1f4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.560 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:56 np0005539504 kernel: tapc188a1f4-70: entered promiscuous mode
Nov 29 02:12:56 np0005539504 NetworkManager[55210]: <info>  [1764400376.5671] manager: (tapc188a1f4-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.567 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.568 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc188a1f4-70, col_values=(('external_ids', {'iface-id': 'a383047a-7ad7-4f43-a653-f18a79d8acb1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.570 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:56 np0005539504 ovn_controller[95182]: 2025-11-29T07:12:56Z|00342|binding|INFO|Releasing lport a383047a-7ad7-4f43-a653-f18a79d8acb1 from this chassis (sb_readonly=0)
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.572 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c188a1f4-7511-4259-992e-c9127e6a414b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c188a1f4-7511-4259-992e-c9127e6a414b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.573 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8d47f9-11c2-4253-99bf-fc3e10df7179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.574 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-c188a1f4-7511-4259-992e-c9127e6a414b
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/c188a1f4-7511-4259-992e-c9127e6a414b.pid.haproxy
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID c188a1f4-7511-4259-992e-c9127e6a414b
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 02:12:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:12:56.576 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'env', 'PROCESS_TAG=haproxy-c188a1f4-7511-4259-992e-c9127e6a414b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c188a1f4-7511-4259-992e-c9127e6a414b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.584 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.623 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.624 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400376.4107854, 084a0f8e-19b7-4b24-a503-c015b26addbc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.624 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] VM Resumed (Lifecycle Event)
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.758 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.763 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.795 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.875 187156 INFO nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Took 27.72 seconds to spawn the instance on the hypervisor.
Nov 29 02:12:56 np0005539504 nova_compute[187152]: 2025-11-29 07:12:56.876 187156 DEBUG nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:12:57 np0005539504 podman[230079]: 2025-11-29 07:12:56.958244449 +0000 UTC m=+0.025999232 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:12:57 np0005539504 nova_compute[187152]: 2025-11-29 07:12:57.552 187156 DEBUG nova.network.neutron [req-3654a7ee-cfaa-48ef-a0bb-46f39a319277 req-b31fbfa7-c633-4484-b340-39d2ff79621d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updated VIF entry in instance network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:12:57 np0005539504 nova_compute[187152]: 2025-11-29 07:12:57.553 187156 DEBUG nova.network.neutron [req-3654a7ee-cfaa-48ef-a0bb-46f39a319277 req-b31fbfa7-c633-4484-b340-39d2ff79621d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:12:57 np0005539504 nova_compute[187152]: 2025-11-29 07:12:57.622 187156 DEBUG nova.network.neutron [req-94cbf4b8-eeda-46e1-8078-87c972dada76 req-6a000478-a662-40d6-9d6b-5f90b45c953d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updated VIF entry in instance network info cache for port 60943dec-d420-449f-abc3-233df163ebed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:12:57 np0005539504 nova_compute[187152]: 2025-11-29 07:12:57.622 187156 DEBUG nova.network.neutron [req-94cbf4b8-eeda-46e1-8078-87c972dada76 req-6a000478-a662-40d6-9d6b-5f90b45c953d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:12:57 np0005539504 podman[230079]: 2025-11-29 07:12:57.745517985 +0000 UTC m=+0.813272748 container create 86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:12:57 np0005539504 nova_compute[187152]: 2025-11-29 07:12:57.801 187156 DEBUG oslo_concurrency.lockutils [req-3654a7ee-cfaa-48ef-a0bb-46f39a319277 req-b31fbfa7-c633-4484-b340-39d2ff79621d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:12:57 np0005539504 nova_compute[187152]: 2025-11-29 07:12:57.920 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:12:57 np0005539504 nova_compute[187152]: 2025-11-29 07:12:57.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:12:57 np0005539504 nova_compute[187152]: 2025-11-29 07:12:57.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:12:57 np0005539504 systemd[1]: Started libpod-conmon-86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46.scope.
Nov 29 02:12:58 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:12:58 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07ab13aebc6b15a9b1ef25f29e5eb98e69b8a59774f670632a1b62514907932e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:12:58 np0005539504 podman[230079]: 2025-11-29 07:12:58.332400323 +0000 UTC m=+1.400155086 container init 86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 02:12:58 np0005539504 podman[230079]: 2025-11-29 07:12:58.337668578 +0000 UTC m=+1.405423351 container start 86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:12:58 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230095]: [NOTICE]   (230099) : New worker (230101) forked
Nov 29 02:12:58 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230095]: [NOTICE]   (230099) : Loading success.
Nov 29 02:12:58 np0005539504 nova_compute[187152]: 2025-11-29 07:12:58.437 187156 DEBUG oslo_concurrency.lockutils [req-94cbf4b8-eeda-46e1-8078-87c972dada76 req-6a000478-a662-40d6-9d6b-5f90b45c953d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:12:58 np0005539504 nova_compute[187152]: 2025-11-29 07:12:58.716 187156 INFO nova.compute.manager [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Took 33.96 seconds to build instance.
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.364 187156 DEBUG nova.compute.manager [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.365 187156 DEBUG oslo_concurrency.lockutils [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.365 187156 DEBUG oslo_concurrency.lockutils [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.365 187156 DEBUG oslo_concurrency.lockutils [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.366 187156 DEBUG nova.compute.manager [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] No waiting events found dispatching network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.366 187156 WARNING nova.compute.manager [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received unexpected event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed for instance with vm_state active and task_state None.
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.366 187156 DEBUG nova.compute.manager [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.367 187156 DEBUG oslo_concurrency.lockutils [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.367 187156 DEBUG oslo_concurrency.lockutils [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.367 187156 DEBUG oslo_concurrency.lockutils [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.368 187156 DEBUG nova.compute.manager [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Processing event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.368 187156 DEBUG nova.compute.manager [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.368 187156 DEBUG oslo_concurrency.lockutils [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.369 187156 DEBUG oslo_concurrency.lockutils [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.369 187156 DEBUG oslo_concurrency.lockutils [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.369 187156 DEBUG nova.compute.manager [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.370 187156 WARNING nova.compute.manager [req-98424c78-7cad-4e8f-abb5-2089d1ec2a13 req-1da954c9-9d23-4fa3-bcae-5f085db37fcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state building and task_state spawning.
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.370 187156 DEBUG nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.375 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400379.3755515, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.376 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Resumed (Lifecycle Event)
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.378 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.382 187156 INFO nova.virt.libvirt.driver [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance spawned successfully.
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.382 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 02:12:59 np0005539504 podman[230110]: 2025-11-29 07:12:59.739599238 +0000 UTC m=+0.087116766 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:12:59 np0005539504 podman[230111]: 2025-11-29 07:12:59.787801445 +0000 UTC m=+0.126322544 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.855 187156 DEBUG oslo_concurrency.lockutils [None req-644556ac-2b11-409a-b91b-e93a819c35a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 47.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.882 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.890 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.896 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.897 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.898 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.898 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.899 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.900 187156 DEBUG nova.virt.libvirt.driver [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:12:59 np0005539504 nova_compute[187152]: 2025-11-29 07:12:59.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 02:13:00 np0005539504 nova_compute[187152]: 2025-11-29 07:13:00.032 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:13:00 np0005539504 nova_compute[187152]: 2025-11-29 07:13:00.219 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:00 np0005539504 nova_compute[187152]: 2025-11-29 07:13:00.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:13:01 np0005539504 nova_compute[187152]: 2025-11-29 07:13:01.081 187156 INFO nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Took 35.55 seconds to spawn the instance on the hypervisor.
Nov 29 02:13:01 np0005539504 nova_compute[187152]: 2025-11-29 07:13:01.081 187156 DEBUG nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:13:02 np0005539504 nova_compute[187152]: 2025-11-29 07:13:02.923 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:03 np0005539504 nova_compute[187152]: 2025-11-29 07:13:03.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:13:03 np0005539504 nova_compute[187152]: 2025-11-29 07:13:03.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 02:13:03 np0005539504 nova_compute[187152]: 2025-11-29 07:13:03.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:13:05 np0005539504 nova_compute[187152]: 2025-11-29 07:13:05.222 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:06 np0005539504 podman[230160]: 2025-11-29 07:13:06.724668738 +0000 UTC m=+0.064761418 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:13:07 np0005539504 nova_compute[187152]: 2025-11-29 07:13:07.927 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:09 np0005539504 nova_compute[187152]: 2025-11-29 07:13:09.873 187156 INFO nova.compute.manager [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Took 71.02 seconds to build instance.#033[00m
Nov 29 02:13:10 np0005539504 nova_compute[187152]: 2025-11-29 07:13:10.223 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:12 np0005539504 nova_compute[187152]: 2025-11-29 07:13:12.931 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:13 np0005539504 podman[230191]: 2025-11-29 07:13:13.067915672 +0000 UTC m=+0.098892517 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:13:15 np0005539504 nova_compute[187152]: 2025-11-29 07:13:15.226 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:15 np0005539504 nova_compute[187152]: 2025-11-29 07:13:15.322 187156 DEBUG oslo_concurrency.lockutils [None req-bf2a2bdf-596a-4611-ac4e-3fc2238042b5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 92.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:17 np0005539504 nova_compute[187152]: 2025-11-29 07:13:17.214 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:13:17 np0005539504 nova_compute[187152]: 2025-11-29 07:13:17.214 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:13:17 np0005539504 nova_compute[187152]: 2025-11-29 07:13:17.214 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:13:17 np0005539504 nova_compute[187152]: 2025-11-29 07:13:17.215 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:13:17 np0005539504 nova_compute[187152]: 2025-11-29 07:13:17.934 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:13:20Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:06:9e 10.100.0.9
Nov 29 02:13:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:13:20Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:06:9e 10.100.0.9
Nov 29 02:13:20 np0005539504 nova_compute[187152]: 2025-11-29 07:13:20.228 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:20 np0005539504 NetworkManager[55210]: <info>  [1764400400.4530] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Nov 29 02:13:20 np0005539504 nova_compute[187152]: 2025-11-29 07:13:20.452 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:20 np0005539504 NetworkManager[55210]: <info>  [1764400400.4544] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Nov 29 02:13:20 np0005539504 nova_compute[187152]: 2025-11-29 07:13:20.566 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:13:20Z|00343|binding|INFO|Releasing lport a383047a-7ad7-4f43-a653-f18a79d8acb1 from this chassis (sb_readonly=0)
Nov 29 02:13:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:13:20Z|00344|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:13:20 np0005539504 nova_compute[187152]: 2025-11-29 07:13:20.582 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:21 np0005539504 nova_compute[187152]: 2025-11-29 07:13:21.366 187156 DEBUG nova.compute.manager [req-75bed24d-f0b1-413f-a3d8-594f3464b16c req-7643c9b6-0080-4475-87a7-9c7e39562da8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-changed-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:13:21 np0005539504 nova_compute[187152]: 2025-11-29 07:13:21.367 187156 DEBUG nova.compute.manager [req-75bed24d-f0b1-413f-a3d8-594f3464b16c req-7643c9b6-0080-4475-87a7-9c7e39562da8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Refreshing instance network info cache due to event network-changed-60943dec-d420-449f-abc3-233df163ebed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:13:21 np0005539504 nova_compute[187152]: 2025-11-29 07:13:21.367 187156 DEBUG oslo_concurrency.lockutils [req-75bed24d-f0b1-413f-a3d8-594f3464b16c req-7643c9b6-0080-4475-87a7-9c7e39562da8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:13:21 np0005539504 nova_compute[187152]: 2025-11-29 07:13:21.367 187156 DEBUG oslo_concurrency.lockutils [req-75bed24d-f0b1-413f-a3d8-594f3464b16c req-7643c9b6-0080-4475-87a7-9c7e39562da8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:13:21 np0005539504 nova_compute[187152]: 2025-11-29 07:13:21.368 187156 DEBUG nova.network.neutron [req-75bed24d-f0b1-413f-a3d8-594f3464b16c req-7643c9b6-0080-4475-87a7-9c7e39562da8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Refreshing network info cache for port 60943dec-d420-449f-abc3-233df163ebed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:13:22 np0005539504 ovn_controller[95182]: 2025-11-29T07:13:22Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c4:6c:2d 10.100.0.13
Nov 29 02:13:22 np0005539504 ovn_controller[95182]: 2025-11-29T07:13:22Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c4:6c:2d 10.100.0.13
Nov 29 02:13:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:13:22.930 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:13:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:13:22.932 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:13:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:13:22.932 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:22 np0005539504 nova_compute[187152]: 2025-11-29 07:13:22.938 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:23 np0005539504 nova_compute[187152]: 2025-11-29 07:13:23.853 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:13:23 np0005539504 nova_compute[187152]: 2025-11-29 07:13:23.967 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:13:23 np0005539504 nova_compute[187152]: 2025-11-29 07:13:23.967 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:13:23 np0005539504 nova_compute[187152]: 2025-11-29 07:13:23.968 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.092 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Triggering sync for uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.092 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Triggering sync for uuid 084a0f8e-19b7-4b24-a503-c015b26addbc _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.092 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.093 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.093 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.093 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.094 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.286 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.286 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.287 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.287 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.302 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.322 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.443 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.507 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.508 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.580 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.590 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.656 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.657 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.722 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.922 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.924 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5344MB free_disk=73.13521194458008GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.924 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:13:24 np0005539504 nova_compute[187152]: 2025-11-29 07:13:24.924 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.162 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 5818027f-a5b1-465a-a6e2-f0c8f0de8154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.163 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 084a0f8e-19b7-4b24-a503-c015b26addbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.163 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.163 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.230 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.352 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.382 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.472 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.473 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:25 np0005539504 nova_compute[187152]: 2025-11-29 07:13:25.713 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:13:25.714 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:13:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:13:25.715 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:13:25 np0005539504 podman[230251]: 2025-11-29 07:13:25.71800726 +0000 UTC m=+0.060966130 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:13:25 np0005539504 podman[230252]: 2025-11-29 07:13:25.725328546 +0000 UTC m=+0.067818864 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 29 02:13:25 np0005539504 podman[230253]: 2025-11-29 07:13:25.759070783 +0000 UTC m=+0.092823355 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:13:26 np0005539504 nova_compute[187152]: 2025-11-29 07:13:26.713 187156 DEBUG nova.network.neutron [req-75bed24d-f0b1-413f-a3d8-594f3464b16c req-7643c9b6-0080-4475-87a7-9c7e39562da8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updated VIF entry in instance network info cache for port 60943dec-d420-449f-abc3-233df163ebed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:13:26 np0005539504 nova_compute[187152]: 2025-11-29 07:13:26.714 187156 DEBUG nova.network.neutron [req-75bed24d-f0b1-413f-a3d8-594f3464b16c req-7643c9b6-0080-4475-87a7-9c7e39562da8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:13:26 np0005539504 nova_compute[187152]: 2025-11-29 07:13:26.773 187156 DEBUG oslo_concurrency.lockutils [req-75bed24d-f0b1-413f-a3d8-594f3464b16c req-7643c9b6-0080-4475-87a7-9c7e39562da8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:13:27 np0005539504 nova_compute[187152]: 2025-11-29 07:13:27.363 187156 DEBUG nova.compute.manager [req-876695ce-2852-4c25-aaaa-fc527e37c145 req-90204bba-d6d4-4f90-a806-5fd9ccefb1bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:13:27 np0005539504 nova_compute[187152]: 2025-11-29 07:13:27.363 187156 DEBUG nova.compute.manager [req-876695ce-2852-4c25-aaaa-fc527e37c145 req-90204bba-d6d4-4f90-a806-5fd9ccefb1bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing instance network info cache due to event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:13:27 np0005539504 nova_compute[187152]: 2025-11-29 07:13:27.364 187156 DEBUG oslo_concurrency.lockutils [req-876695ce-2852-4c25-aaaa-fc527e37c145 req-90204bba-d6d4-4f90-a806-5fd9ccefb1bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:13:27 np0005539504 nova_compute[187152]: 2025-11-29 07:13:27.364 187156 DEBUG oslo_concurrency.lockutils [req-876695ce-2852-4c25-aaaa-fc527e37c145 req-90204bba-d6d4-4f90-a806-5fd9ccefb1bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:13:27 np0005539504 nova_compute[187152]: 2025-11-29 07:13:27.364 187156 DEBUG nova.network.neutron [req-876695ce-2852-4c25-aaaa-fc527e37c145 req-90204bba-d6d4-4f90-a806-5fd9ccefb1bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:13:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:13:27.718 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:13:27 np0005539504 nova_compute[187152]: 2025-11-29 07:13:27.940 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:30 np0005539504 nova_compute[187152]: 2025-11-29 07:13:30.233 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:30 np0005539504 podman[230314]: 2025-11-29 07:13:30.734736851 +0000 UTC m=+0.071432011 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:13:30 np0005539504 podman[230315]: 2025-11-29 07:13:30.763980936 +0000 UTC m=+0.095819566 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:13:32 np0005539504 nova_compute[187152]: 2025-11-29 07:13:32.379 187156 INFO nova.compute.manager [None req-6a9482e5-d603-40ea-b4d2-fee4c4549a2c bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Get console output#033[00m
Nov 29 02:13:32 np0005539504 nova_compute[187152]: 2025-11-29 07:13:32.387 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:13:32 np0005539504 nova_compute[187152]: 2025-11-29 07:13:32.943 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:35 np0005539504 nova_compute[187152]: 2025-11-29 07:13:35.235 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:36 np0005539504 nova_compute[187152]: 2025-11-29 07:13:36.717 187156 DEBUG nova.network.neutron [req-876695ce-2852-4c25-aaaa-fc527e37c145 req-90204bba-d6d4-4f90-a806-5fd9ccefb1bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updated VIF entry in instance network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:13:36 np0005539504 nova_compute[187152]: 2025-11-29 07:13:36.718 187156 DEBUG nova.network.neutron [req-876695ce-2852-4c25-aaaa-fc527e37c145 req-90204bba-d6d4-4f90-a806-5fd9ccefb1bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:13:36 np0005539504 nova_compute[187152]: 2025-11-29 07:13:36.939 187156 DEBUG oslo_concurrency.lockutils [req-876695ce-2852-4c25-aaaa-fc527e37c145 req-90204bba-d6d4-4f90-a806-5fd9ccefb1bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:13:37 np0005539504 podman[230364]: 2025-11-29 07:13:37.720279332 +0000 UTC m=+0.067591598 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm)
Nov 29 02:13:37 np0005539504 nova_compute[187152]: 2025-11-29 07:13:37.741 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:37 np0005539504 nova_compute[187152]: 2025-11-29 07:13:37.944 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:39 np0005539504 nova_compute[187152]: 2025-11-29 07:13:39.114 187156 INFO nova.compute.manager [None req-c9e608df-5091-4417-9aeb-efcbbcda5745 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Get console output#033[00m
Nov 29 02:13:39 np0005539504 nova_compute[187152]: 2025-11-29 07:13:39.123 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:13:39 np0005539504 nova_compute[187152]: 2025-11-29 07:13:39.536 187156 DEBUG nova.compute.manager [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:13:40 np0005539504 nova_compute[187152]: 2025-11-29 07:13:40.048 187156 INFO nova.compute.manager [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] instance snapshotting#033[00m
Nov 29 02:13:40 np0005539504 nova_compute[187152]: 2025-11-29 07:13:40.050 187156 DEBUG nova.objects.instance [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'flavor' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:13:40 np0005539504 nova_compute[187152]: 2025-11-29 07:13:40.237 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:40 np0005539504 nova_compute[187152]: 2025-11-29 07:13:40.879 187156 INFO nova.virt.libvirt.driver [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Beginning live snapshot process#033[00m
Nov 29 02:13:41 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.376 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.438 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.439 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.508 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.524 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.582 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.583 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpfrqwnis8/6cdc013a748d4a8b9553f89f15eb2f66.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.622 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpfrqwnis8/6cdc013a748d4a8b9553f89f15eb2f66.delta 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.623 187156 INFO nova.virt.libvirt.driver [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 29 02:13:41 np0005539504 nova_compute[187152]: 2025-11-29 07:13:41.699 187156 DEBUG nova.virt.libvirt.guest [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 29 02:13:42 np0005539504 nova_compute[187152]: 2025-11-29 07:13:42.203 187156 DEBUG nova.virt.libvirt.guest [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 29 02:13:42 np0005539504 nova_compute[187152]: 2025-11-29 07:13:42.206 187156 INFO nova.virt.libvirt.driver [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 29 02:13:42 np0005539504 nova_compute[187152]: 2025-11-29 07:13:42.243 187156 DEBUG nova.privsep.utils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 02:13:42 np0005539504 nova_compute[187152]: 2025-11-29 07:13:42.244 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpfrqwnis8/6cdc013a748d4a8b9553f89f15eb2f66.delta /var/lib/nova/instances/snapshots/tmpfrqwnis8/6cdc013a748d4a8b9553f89f15eb2f66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:13:42 np0005539504 nova_compute[187152]: 2025-11-29 07:13:42.946 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:42 np0005539504 nova_compute[187152]: 2025-11-29 07:13:42.996 187156 DEBUG oslo_concurrency.processutils [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpfrqwnis8/6cdc013a748d4a8b9553f89f15eb2f66.delta /var/lib/nova/instances/snapshots/tmpfrqwnis8/6cdc013a748d4a8b9553f89f15eb2f66" returned: 0 in 0.752s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:13:43 np0005539504 nova_compute[187152]: 2025-11-29 07:13:43.003 187156 INFO nova.virt.libvirt.driver [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Snapshot extracted, beginning image upload
Nov 29 02:13:43 np0005539504 podman[230420]: 2025-11-29 07:13:43.733608965 +0000 UTC m=+0.070008983 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:13:45 np0005539504 nova_compute[187152]: 2025-11-29 07:13:45.243 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:47 np0005539504 nova_compute[187152]: 2025-11-29 07:13:47.950 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:13:48Z|00345|binding|INFO|Releasing lport a383047a-7ad7-4f43-a653-f18a79d8acb1 from this chassis (sb_readonly=0)
Nov 29 02:13:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:13:48Z|00346|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:13:48 np0005539504 nova_compute[187152]: 2025-11-29 07:13:48.905 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:50 np0005539504 nova_compute[187152]: 2025-11-29 07:13:50.247 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:50 np0005539504 nova_compute[187152]: 2025-11-29 07:13:50.445 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:13:50 np0005539504 nova_compute[187152]: 2025-11-29 07:13:50.700 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:51 np0005539504 nova_compute[187152]: 2025-11-29 07:13:51.636 187156 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:13:51 np0005539504 nova_compute[187152]: 2025-11-29 07:13:51.637 187156 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:13:51 np0005539504 nova_compute[187152]: 2025-11-29 07:13:51.637 187156 DEBUG nova.network.neutron [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 02:13:51 np0005539504 nova_compute[187152]: 2025-11-29 07:13:51.738 187156 INFO nova.virt.libvirt.driver [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Snapshot image upload complete
Nov 29 02:13:51 np0005539504 nova_compute[187152]: 2025-11-29 07:13:51.739 187156 INFO nova.compute.manager [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Took 11.64 seconds to snapshot the instance on the hypervisor.
Nov 29 02:13:52 np0005539504 nova_compute[187152]: 2025-11-29 07:13:52.580 187156 DEBUG nova.compute.manager [None req-79b4fc76-dd55-4518-9690-d61bdaacf220 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Found 1 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Nov 29 02:13:52 np0005539504 nova_compute[187152]: 2025-11-29 07:13:52.952 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:55 np0005539504 nova_compute[187152]: 2025-11-29 07:13:55.252 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:56 np0005539504 podman[230442]: 2025-11-29 07:13:56.727568912 +0000 UTC m=+0.050456557 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:13:56 np0005539504 podman[230440]: 2025-11-29 07:13:56.728349962 +0000 UTC m=+0.062503360 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:13:56 np0005539504 nova_compute[187152]: 2025-11-29 07:13:56.767 187156 DEBUG nova.network.neutron [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:13:56 np0005539504 podman[230441]: 2025-11-29 07:13:56.770706601 +0000 UTC m=+0.102523315 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 02:13:56 np0005539504 nova_compute[187152]: 2025-11-29 07:13:56.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:13:56 np0005539504 nova_compute[187152]: 2025-11-29 07:13:56.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:13:56 np0005539504 nova_compute[187152]: 2025-11-29 07:13:56.937 187156 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.326 187156 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.327 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Creating file /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/e0e9b1b430674284b302e1b6b9e33436.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.327 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/e0e9b1b430674284b302e1b6b9e33436.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.727 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/e0e9b1b430674284b302e1b6b9e33436.tmp" returned: 1 in 0.399s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.728 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/e0e9b1b430674284b302e1b6b9e33436.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.729 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Creating directory /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.729 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.940 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.941 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:13:57 np0005539504 nova_compute[187152]: 2025-11-29 07:13:57.945 187156 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 02:13:58 np0005539504 nova_compute[187152]: 2025-11-29 07:13:58.015 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:13:58 np0005539504 nova_compute[187152]: 2025-11-29 07:13:58.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:13:59 np0005539504 nova_compute[187152]: 2025-11-29 07:13:59.058 187156 DEBUG nova.compute.manager [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:13:59 np0005539504 nova_compute[187152]: 2025-11-29 07:13:59.697 187156 INFO nova.compute.manager [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] instance snapshotting
Nov 29 02:13:59 np0005539504 nova_compute[187152]: 2025-11-29 07:13:59.698 187156 DEBUG nova.objects.instance [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'flavor' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:14:00 np0005539504 nova_compute[187152]: 2025-11-29 07:14:00.254 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:14:00 np0005539504 nova_compute[187152]: 2025-11-29 07:14:00.703 187156 INFO nova.virt.libvirt.driver [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Beginning live snapshot process
Nov 29 02:14:00 np0005539504 kernel: tap07a930ef-a0 (unregistering): left promiscuous mode
Nov 29 02:14:00 np0005539504 NetworkManager[55210]: <info>  [1764400440.7621] device (tap07a930ef-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:14:00 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:00Z|00347|binding|INFO|Releasing lport 07a930ef-a036-4ddf-aa57-c5d56f77847c from this chassis (sb_readonly=0)
Nov 29 02:14:00 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:00Z|00348|binding|INFO|Setting lport 07a930ef-a036-4ddf-aa57-c5d56f77847c down in Southbound
Nov 29 02:14:00 np0005539504 nova_compute[187152]: 2025-11-29 07:14:00.769 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:14:00 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:00Z|00349|binding|INFO|Removing iface tap07a930ef-a0 ovn-installed in OVS
Nov 29 02:14:00 np0005539504 nova_compute[187152]: 2025-11-29 07:14:00.772 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:14:00 np0005539504 nova_compute[187152]: 2025-11-29 07:14:00.783 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:14:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:00.821 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:6c:2d 10.100.0.13'], port_security=['fa:16:3e:c4:6c:2d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c188a1f4-7511-4259-992e-c9127e6a414b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ede51bf8-0086-4a77-b4a9-badf8936b8c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.197'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aab533cd-f26a-47b5-9334-c93bf39572b9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=07a930ef-a036-4ddf-aa57-c5d56f77847c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:14:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:00.823 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 07a930ef-a036-4ddf-aa57-c5d56f77847c in datapath c188a1f4-7511-4259-992e-c9127e6a414b unbound from our chassis
Nov 29 02:14:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:00.824 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c188a1f4-7511-4259-992e-c9127e6a414b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 02:14:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:00.827 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[24a0acf0-4850-4647-88e0-e59cf332c190]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:14:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:00.828 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b namespace which is not needed anymore
Nov 29 02:14:00 np0005539504 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Nov 29 02:14:00 np0005539504 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005f.scope: Consumed 16.166s CPU time.
Nov 29 02:14:00 np0005539504 systemd-machined[153423]: Machine qemu-48-instance-0000005f terminated.
Nov 29 02:14:00 np0005539504 podman[230507]: 2025-11-29 07:14:00.846950829 +0000 UTC m=+0.060959580 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:14:00 np0005539504 podman[230510]: 2025-11-29 07:14:00.885124874 +0000 UTC m=+0.094893100 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 02:14:00 np0005539504 nova_compute[187152]: 2025-11-29 07:14:00.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:00 np0005539504 nova_compute[187152]: 2025-11-29 07:14:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:00 np0005539504 nova_compute[187152]: 2025-11-29 07:14:00.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:14:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230095]: [NOTICE]   (230099) : haproxy version is 2.8.14-c23fe91
Nov 29 02:14:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230095]: [NOTICE]   (230099) : path to executable is /usr/sbin/haproxy
Nov 29 02:14:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230095]: [WARNING]  (230099) : Exiting Master process...
Nov 29 02:14:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230095]: [WARNING]  (230099) : Exiting Master process...
Nov 29 02:14:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230095]: [ALERT]    (230099) : Current worker (230101) exited with code 143 (Terminated)
Nov 29 02:14:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230095]: [WARNING]  (230099) : All workers exited. Exiting... (0)
Nov 29 02:14:01 np0005539504 systemd[1]: libpod-86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46.scope: Deactivated successfully.
Nov 29 02:14:01 np0005539504 podman[230577]: 2025-11-29 07:14:01.029797553 +0000 UTC m=+0.114844618 container died 86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.055 187156 INFO nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.060 187156 INFO nova.virt.libvirt.driver [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance destroyed successfully.#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.061 187156 DEBUG nova.virt.libvirt.vif [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-409239588',display_name='tempest-TestNetworkAdvancedServerOps-server-409239588',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-409239588',id=95,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGXK6HijxfcV9+fEMlQT2yR3VoX9Sz77Qk57Xkpwoye1FFlDLU8fY8cJvr+Q2fRauh1dlNIWCagiMxv7znT2NcZAvXyo+qqZudIr0NVBck3Lt9NyetTtYoJBqcrR4BWObg==',key_name='tempest-TestNetworkAdvancedServerOps-1900401721',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:13:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-6l59ck53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:13:50Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=5818027f-a5b1-465a-a6e2-f0c8f0de8154,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1883832764", "vif_mac": "fa:16:3e:c4:6c:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.061 187156 DEBUG nova.network.os_vif_util [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converting VIF {"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1883832764", "vif_mac": "fa:16:3e:c4:6c:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.062 187156 DEBUG nova.network.os_vif_util [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.063 187156 DEBUG os_vif [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.065 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.065 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07a930ef-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.067 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.068 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.073 187156 INFO os_vif [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0')#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.077 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:01 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46-userdata-shm.mount: Deactivated successfully.
Nov 29 02:14:01 np0005539504 systemd[1]: var-lib-containers-storage-overlay-07ab13aebc6b15a9b1ef25f29e5eb98e69b8a59774f670632a1b62514907932e-merged.mount: Deactivated successfully.
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.138 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.140 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:01 np0005539504 podman[230577]: 2025-11-29 07:14:01.184556441 +0000 UTC m=+0.269603536 container cleanup 86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:14:01 np0005539504 systemd[1]: libpod-conmon-86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46.scope: Deactivated successfully.
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.195 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.197 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Copying file /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_resize/disk to 192.168.122.102:/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.197 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_resize/disk 192.168.122.102:/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:01 np0005539504 podman[230628]: 2025-11-29 07:14:01.330726829 +0000 UTC m=+0.123856859 container remove 86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:14:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:01.336 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f63306d9-46b7-4437-84d4-5e78a7e9454a]: (4, ('Sat Nov 29 07:14:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b (86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46)\n86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46\nSat Nov 29 07:14:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b (86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46)\n86aa734a569a5e1a8d04c29d42a5bc4345bfbde56ee8e56c59ad334b569e4a46\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:01.337 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bb93769e-0941-40cd-9fb1-478803ce80fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:01.339 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc188a1f4-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.341 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:01 np0005539504 kernel: tapc188a1f4-70: left promiscuous mode
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.352 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:01.356 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a02486a6-d4a6-4921-b97c-e2f22cdc4ce4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:01.369 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[70899a9f-cbcb-417b-b3f0-9aa0ab6918c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:01.370 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b89d4dad-f664-45db-b13c-019f5cd81bc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:01.386 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[875fd466-5b0d-42e0-a193-04b9a9fd54f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583918, 'reachable_time': 25606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230645, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:01.389 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:14:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:01.390 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[a521d944-5f6e-4b78-a69c-058d2a0409ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:01 np0005539504 systemd[1]: run-netns-ovnmeta\x2dc188a1f4\x2d7511\x2d4259\x2d992e\x2dc9127e6a414b.mount: Deactivated successfully.
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.503 187156 DEBUG nova.compute.manager [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.504 187156 DEBUG oslo_concurrency.lockutils [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.504 187156 DEBUG oslo_concurrency.lockutils [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.504 187156 DEBUG oslo_concurrency.lockutils [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.505 187156 DEBUG nova.compute.manager [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.505 187156 WARNING nova.compute.manager [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:14:01 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.661 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.722 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.723 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.742 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "scp -r /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_resize/disk 192.168.122.102:/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.744 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Copying file /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.744 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_resize/disk.config 192.168.122.102:/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.786 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.798 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.857 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.858 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpwqh27i8f/a85ccff6fed9495cb2366c37713403d1.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.964 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "scp -C -r /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_resize/disk.config 192.168.122.102:/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.966 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Copying file /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:14:01 np0005539504 nova_compute[187152]: 2025-11-29 07:14:01.967 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_resize/disk.info 192.168.122.102:/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.100 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpwqh27i8f/a85ccff6fed9495cb2366c37713403d1.delta 1073741824" returned: 0 in 0.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.102 187156 INFO nova.virt.libvirt.driver [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.151 187156 DEBUG nova.virt.libvirt.guest [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.190 187156 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "scp -C -r /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_resize/disk.info 192.168.122.102:/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.469 187156 DEBUG neutronclient.v2_0.client [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 07a930ef-a036-4ddf-aa57-c5d56f77847c for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.647 187156 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.647 187156 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.647 187156 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.654 187156 DEBUG nova.virt.libvirt.guest [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.658 187156 INFO nova.virt.libvirt.driver [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.712 187156 DEBUG nova.privsep.utils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:14:02 np0005539504 nova_compute[187152]: 2025-11-29 07:14:02.713 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpwqh27i8f/a85ccff6fed9495cb2366c37713403d1.delta /var/lib/nova/instances/snapshots/tmpwqh27i8f/a85ccff6fed9495cb2366c37713403d1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:03 np0005539504 nova_compute[187152]: 2025-11-29 07:14:03.239 187156 DEBUG oslo_concurrency.processutils [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpwqh27i8f/a85ccff6fed9495cb2366c37713403d1.delta /var/lib/nova/instances/snapshots/tmpwqh27i8f/a85ccff6fed9495cb2366c37713403d1" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:03 np0005539504 nova_compute[187152]: 2025-11-29 07:14:03.244 187156 INFO nova.virt.libvirt.driver [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:14:03 np0005539504 nova_compute[187152]: 2025-11-29 07:14:03.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.027 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.027 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.028 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.028 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.198 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000005f, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.204 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.239 187156 DEBUG nova.compute.manager [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.239 187156 DEBUG oslo_concurrency.lockutils [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.240 187156 DEBUG oslo_concurrency.lockutils [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.240 187156 DEBUG oslo_concurrency.lockutils [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.240 187156 DEBUG nova.compute.manager [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.241 187156 WARNING nova.compute.manager [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.257 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.258 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.317 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.507 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.509 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5531MB free_disk=73.08561325073242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.509 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.510 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.646 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Migration for instance 5818027f-a5b1-465a-a6e2-f0c8f0de8154 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.724 187156 INFO nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating resource usage from migration 7d62bbd7-9748-439d-af70-ef4b5d8cefb1#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.724 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Starting to track outgoing migration 7d62bbd7-9748-439d-af70-ef4b5d8cefb1 with flavor 1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.761 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 084a0f8e-19b7-4b24-a503-c015b26addbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.762 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Migration 7d62bbd7-9748-439d-af70-ef4b5d8cefb1 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.762 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.762 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.785 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.853 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:14:04 np0005539504 nova_compute[187152]: 2025-11-29 07:14:04.854 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:14:05 np0005539504 nova_compute[187152]: 2025-11-29 07:14:05.168 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:14:05 np0005539504 nova_compute[187152]: 2025-11-29 07:14:05.220 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:14:05 np0005539504 nova_compute[187152]: 2025-11-29 07:14:05.256 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:05 np0005539504 nova_compute[187152]: 2025-11-29 07:14:05.337 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:14:05 np0005539504 nova_compute[187152]: 2025-11-29 07:14:05.384 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:14:05 np0005539504 nova_compute[187152]: 2025-11-29 07:14:05.442 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:14:05 np0005539504 nova_compute[187152]: 2025-11-29 07:14:05.442 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:05.472 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:05 np0005539504 nova_compute[187152]: 2025-11-29 07:14:05.473 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:05.473 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:14:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:05.474 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:06 np0005539504 nova_compute[187152]: 2025-11-29 07:14:06.068 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.257 187156 DEBUG nova.compute.manager [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.257 187156 DEBUG nova.compute.manager [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing instance network info cache due to event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.258 187156 DEBUG oslo_concurrency.lockutils [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.258 187156 DEBUG oslo_concurrency.lockutils [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.258 187156 DEBUG nova.network.neutron [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.442 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.443 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.476 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.477 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:07 np0005539504 nova_compute[187152]: 2025-11-29 07:14:07.477 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:14:08 np0005539504 podman[230700]: 2025-11-29 07:14:08.735199437 +0000 UTC m=+0.073189778 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:14:08 np0005539504 nova_compute[187152]: 2025-11-29 07:14:08.994 187156 INFO nova.virt.libvirt.driver [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Snapshot image upload complete#033[00m
Nov 29 02:14:08 np0005539504 nova_compute[187152]: 2025-11-29 07:14:08.998 187156 INFO nova.compute.manager [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Took 9.20 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 02:14:09 np0005539504 nova_compute[187152]: 2025-11-29 07:14:09.689 187156 DEBUG nova.compute.manager [None req-540d9e0f-9244-4894-9d5a-d0616ae09a86 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Found 2 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Nov 29 02:14:09 np0005539504 nova_compute[187152]: 2025-11-29 07:14:09.819 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:10 np0005539504 nova_compute[187152]: 2025-11-29 07:14:10.258 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:10 np0005539504 nova_compute[187152]: 2025-11-29 07:14:10.329 187156 DEBUG nova.network.neutron [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updated VIF entry in instance network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:14:10 np0005539504 nova_compute[187152]: 2025-11-29 07:14:10.329 187156 DEBUG nova.network.neutron [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:10 np0005539504 nova_compute[187152]: 2025-11-29 07:14:10.434 187156 DEBUG oslo_concurrency.lockutils [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:10 np0005539504 nova_compute[187152]: 2025-11-29 07:14:10.437 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:10 np0005539504 nova_compute[187152]: 2025-11-29 07:14:10.465 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:10 np0005539504 nova_compute[187152]: 2025-11-29 07:14:10.466 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:14:10 np0005539504 nova_compute[187152]: 2025-11-29 07:14:10.955 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:11 np0005539504 nova_compute[187152]: 2025-11-29 07:14:11.069 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539504 nova_compute[187152]: 2025-11-29 07:14:11.792 187156 DEBUG nova.compute.manager [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:11 np0005539504 nova_compute[187152]: 2025-11-29 07:14:11.944 187156 INFO nova.compute.manager [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] instance snapshotting#033[00m
Nov 29 02:14:11 np0005539504 nova_compute[187152]: 2025-11-29 07:14:11.946 187156 DEBUG nova.objects.instance [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'flavor' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.263 187156 INFO nova.virt.libvirt.driver [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Beginning live snapshot process#033[00m
Nov 29 02:14:12 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.631 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.699 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.700 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.760 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json -f qcow2" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.772 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.835 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.836 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp8bskxo5g/2cb249693d6b4234b445f5ed964aa6be.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.872 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp8bskxo5g/2cb249693d6b4234b445f5ed964aa6be.delta 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.873 187156 INFO nova.virt.libvirt.driver [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 29 02:14:12 np0005539504 nova_compute[187152]: 2025-11-29 07:14:12.923 187156 DEBUG nova.virt.libvirt.guest [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.252 187156 DEBUG nova.compute.manager [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.253 187156 DEBUG oslo_concurrency.lockutils [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.253 187156 DEBUG oslo_concurrency.lockutils [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.254 187156 DEBUG oslo_concurrency.lockutils [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.254 187156 DEBUG nova.compute.manager [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.254 187156 WARNING nova.compute.manager [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.427 187156 DEBUG nova.virt.libvirt.guest [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.431 187156 INFO nova.virt.libvirt.driver [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.466 187156 DEBUG nova.privsep.utils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.467 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp8bskxo5g/2cb249693d6b4234b445f5ed964aa6be.delta /var/lib/nova/instances/snapshots/tmp8bskxo5g/2cb249693d6b4234b445f5ed964aa6be execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.985 187156 DEBUG oslo_concurrency.processutils [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp8bskxo5g/2cb249693d6b4234b445f5ed964aa6be.delta /var/lib/nova/instances/snapshots/tmp8bskxo5g/2cb249693d6b4234b445f5ed964aa6be" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:13 np0005539504 nova_compute[187152]: 2025-11-29 07:14:13.992 187156 INFO nova.virt.libvirt.driver [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:14:14 np0005539504 podman[230760]: 2025-11-29 07:14:14.744294566 +0000 UTC m=+0.087219476 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:14:15 np0005539504 nova_compute[187152]: 2025-11-29 07:14:15.269 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:15 np0005539504 nova_compute[187152]: 2025-11-29 07:14:15.600 187156 DEBUG nova.compute.manager [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:15 np0005539504 nova_compute[187152]: 2025-11-29 07:14:15.600 187156 DEBUG oslo_concurrency.lockutils [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:15 np0005539504 nova_compute[187152]: 2025-11-29 07:14:15.600 187156 DEBUG oslo_concurrency.lockutils [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:15 np0005539504 nova_compute[187152]: 2025-11-29 07:14:15.601 187156 DEBUG oslo_concurrency.lockutils [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:15 np0005539504 nova_compute[187152]: 2025-11-29 07:14:15.601 187156 DEBUG nova.compute.manager [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:15 np0005539504 nova_compute[187152]: 2025-11-29 07:14:15.601 187156 WARNING nova.compute.manager [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:14:16 np0005539504 nova_compute[187152]: 2025-11-29 07:14:16.055 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400441.053946, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:16 np0005539504 nova_compute[187152]: 2025-11-29 07:14:16.056 187156 INFO nova.compute.manager [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:14:16 np0005539504 nova_compute[187152]: 2025-11-29 07:14:16.071 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:16 np0005539504 nova_compute[187152]: 2025-11-29 07:14:16.262 187156 DEBUG nova.compute.manager [None req-4dc93e70-287e-4dc6-9373-0c01f641e0aa - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:16 np0005539504 nova_compute[187152]: 2025-11-29 07:14:16.267 187156 DEBUG nova.compute.manager [None req-4dc93e70-287e-4dc6-9373-0c01f641e0aa - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:14:16 np0005539504 nova_compute[187152]: 2025-11-29 07:14:16.424 187156 INFO nova.compute.manager [None req-4dc93e70-287e-4dc6-9373-0c01f641e0aa - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 29 02:14:19 np0005539504 nova_compute[187152]: 2025-11-29 07:14:19.463 187156 INFO nova.virt.libvirt.driver [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Snapshot image upload complete#033[00m
Nov 29 02:14:19 np0005539504 nova_compute[187152]: 2025-11-29 07:14:19.464 187156 INFO nova.compute.manager [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Took 7.46 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 02:14:19 np0005539504 nova_compute[187152]: 2025-11-29 07:14:19.899 187156 DEBUG nova.compute.manager [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Found 3 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Nov 29 02:14:19 np0005539504 nova_compute[187152]: 2025-11-29 07:14:19.900 187156 DEBUG nova.compute.manager [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Rotating out 1 backups _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4458#033[00m
Nov 29 02:14:19 np0005539504 nova_compute[187152]: 2025-11-29 07:14:19.901 187156 DEBUG nova.compute.manager [None req-8f04f31a-33c0-4ca7-871e-30bc5577b297 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Deleting image d4c1eba7-3313-43f2-b9f8-2891153fa578 _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4463#033[00m
Nov 29 02:14:20 np0005539504 nova_compute[187152]: 2025-11-29 07:14:20.270 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:21 np0005539504 nova_compute[187152]: 2025-11-29 07:14:21.074 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:22.931 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:22.931 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:22.932 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:23 np0005539504 nova_compute[187152]: 2025-11-29 07:14:23.532 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:25 np0005539504 nova_compute[187152]: 2025-11-29 07:14:25.272 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:25 np0005539504 nova_compute[187152]: 2025-11-29 07:14:25.879 187156 DEBUG nova.compute.manager [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:25 np0005539504 nova_compute[187152]: 2025-11-29 07:14:25.880 187156 DEBUG oslo_concurrency.lockutils [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:25 np0005539504 nova_compute[187152]: 2025-11-29 07:14:25.880 187156 DEBUG oslo_concurrency.lockutils [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:25 np0005539504 nova_compute[187152]: 2025-11-29 07:14:25.880 187156 DEBUG oslo_concurrency.lockutils [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:25 np0005539504 nova_compute[187152]: 2025-11-29 07:14:25.880 187156 DEBUG nova.compute.manager [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:25 np0005539504 nova_compute[187152]: 2025-11-29 07:14:25.881 187156 WARNING nova.compute.manager [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:14:26 np0005539504 nova_compute[187152]: 2025-11-29 07:14:26.081 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:26 np0005539504 nova_compute[187152]: 2025-11-29 07:14:26.732 187156 INFO nova.compute.manager [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Swapping old allocation on dict_keys(['1c526389-06f6-4ffd-8e90-a84c6c39f0bc']) held by migration 7d62bbd7-9748-439d-af70-ef4b5d8cefb1 for instance#033[00m
Nov 29 02:14:26 np0005539504 nova_compute[187152]: 2025-11-29 07:14:26.797 187156 DEBUG nova.scheduler.client.report [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Overwriting current allocation {'allocations': {'2d55ea77-8118-4f48-9bb5-d62d10fd53c0': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 61}}, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'consumer_generation': 1} on consumer 5818027f-a5b1-465a-a6e2-f0c8f0de8154 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Nov 29 02:14:27 np0005539504 nova_compute[187152]: 2025-11-29 07:14:27.254 187156 INFO nova.network.neutron [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating port 07a930ef-a036-4ddf-aa57-c5d56f77847c with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:14:27 np0005539504 podman[230780]: 2025-11-29 07:14:27.729855121 +0000 UTC m=+0.070666439 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:14:27 np0005539504 podman[230781]: 2025-11-29 07:14:27.746473638 +0000 UTC m=+0.080594067 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Nov 29 02:14:27 np0005539504 podman[230782]: 2025-11-29 07:14:27.763457554 +0000 UTC m=+0.091899861 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:14:28 np0005539504 nova_compute[187152]: 2025-11-29 07:14:28.533 187156 DEBUG nova.compute.manager [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:28 np0005539504 nova_compute[187152]: 2025-11-29 07:14:28.533 187156 DEBUG oslo_concurrency.lockutils [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:28 np0005539504 nova_compute[187152]: 2025-11-29 07:14:28.533 187156 DEBUG oslo_concurrency.lockutils [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:28 np0005539504 nova_compute[187152]: 2025-11-29 07:14:28.534 187156 DEBUG oslo_concurrency.lockutils [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:28 np0005539504 nova_compute[187152]: 2025-11-29 07:14:28.534 187156 DEBUG nova.compute.manager [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:28 np0005539504 nova_compute[187152]: 2025-11-29 07:14:28.534 187156 WARNING nova.compute.manager [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:14:29 np0005539504 nova_compute[187152]: 2025-11-29 07:14:29.501 187156 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:29 np0005539504 nova_compute[187152]: 2025-11-29 07:14:29.502 187156 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:29 np0005539504 nova_compute[187152]: 2025-11-29 07:14:29.502 187156 DEBUG nova.network.neutron [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:14:29 np0005539504 nova_compute[187152]: 2025-11-29 07:14:29.721 187156 DEBUG nova.compute.manager [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:29 np0005539504 nova_compute[187152]: 2025-11-29 07:14:29.722 187156 DEBUG nova.compute.manager [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing instance network info cache due to event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:14:29 np0005539504 nova_compute[187152]: 2025-11-29 07:14:29.722 187156 DEBUG oslo_concurrency.lockutils [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:30 np0005539504 nova_compute[187152]: 2025-11-29 07:14:30.314 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:31 np0005539504 nova_compute[187152]: 2025-11-29 07:14:31.084 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:31 np0005539504 podman[230841]: 2025-11-29 07:14:31.731227583 +0000 UTC m=+0.059068824 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:14:31 np0005539504 podman[230842]: 2025-11-29 07:14:31.79344957 +0000 UTC m=+0.117009768 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:14:33 np0005539504 nova_compute[187152]: 2025-11-29 07:14:33.850 187156 DEBUG nova.network.neutron [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.822 187156 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.823 187156 DEBUG nova.virt.libvirt.driver [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.826 187156 DEBUG oslo_concurrency.lockutils [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.827 187156 DEBUG nova.network.neutron [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.837 187156 DEBUG nova.virt.libvirt.driver [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Start _get_guest_xml network_info=[{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.844 187156 WARNING nova.virt.libvirt.driver [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.854 187156 DEBUG nova.virt.libvirt.host [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.855 187156 DEBUG nova.virt.libvirt.host [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.858 187156 DEBUG nova.virt.libvirt.host [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.859 187156 DEBUG nova.virt.libvirt.host [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.860 187156 DEBUG nova.virt.libvirt.driver [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.860 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.861 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.861 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.861 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.861 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.862 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.862 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.862 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.862 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.863 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.863 187156 DEBUG nova.virt.hardware [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.863 187156 DEBUG nova.objects.instance [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.892 187156 DEBUG oslo_concurrency.processutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.953 187156 DEBUG oslo_concurrency.processutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.954 187156 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.955 187156 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.956 187156 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.957 187156 DEBUG nova.virt.libvirt.vif [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-409239588',display_name='tempest-TestNetworkAdvancedServerOps-server-409239588',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-409239588',id=95,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGXK6HijxfcV9+fEMlQT2yR3VoX9Sz77Qk57Xkpwoye1FFlDLU8fY8cJvr+Q2fRauh1dlNIWCagiMxv7znT2NcZAvXyo+qqZudIr0NVBck3Lt9NyetTtYoJBqcrR4BWObg==',key_name='tempest-TestNetworkAdvancedServerOps-1900401721',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-6l59ck53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:24Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=5818027f-a5b1-465a-a6e2-f0c8f0de8154,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.958 187156 DEBUG nova.network.os_vif_util [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.959 187156 DEBUG nova.network.os_vif_util [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.962 187156 DEBUG nova.virt.libvirt.driver [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <uuid>5818027f-a5b1-465a-a6e2-f0c8f0de8154</uuid>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <name>instance-0000005f</name>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-409239588</nova:name>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:14:34</nova:creationTime>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:        <nova:port uuid="07a930ef-a036-4ddf-aa57-c5d56f77847c">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <entry name="serial">5818027f-a5b1-465a-a6e2-f0c8f0de8154</entry>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <entry name="uuid">5818027f-a5b1-465a-a6e2-f0c8f0de8154</entry>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:c4:6c:2d"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <target dev="tap07a930ef-a0"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/console.log" append="off"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:14:34 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:14:34 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:14:34 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:14:34 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.963 187156 DEBUG nova.compute.manager [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Preparing to wait for external event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.963 187156 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.964 187156 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.964 187156 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.965 187156 DEBUG nova.virt.libvirt.vif [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-409239588',display_name='tempest-TestNetworkAdvancedServerOps-server-409239588',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-409239588',id=95,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGXK6HijxfcV9+fEMlQT2yR3VoX9Sz77Qk57Xkpwoye1FFlDLU8fY8cJvr+Q2fRauh1dlNIWCagiMxv7znT2NcZAvXyo+qqZudIr0NVBck3Lt9NyetTtYoJBqcrR4BWObg==',key_name='tempest-TestNetworkAdvancedServerOps-1900401721',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-6l59ck53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:24Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=5818027f-a5b1-465a-a6e2-f0c8f0de8154,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.965 187156 DEBUG nova.network.os_vif_util [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.966 187156 DEBUG nova.network.os_vif_util [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.966 187156 DEBUG os_vif [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.967 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.968 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.968 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.970 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.971 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap07a930ef-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.971 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap07a930ef-a0, col_values=(('external_ids', {'iface-id': '07a930ef-a036-4ddf-aa57-c5d56f77847c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:6c:2d', 'vm-uuid': '5818027f-a5b1-465a-a6e2-f0c8f0de8154'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.973 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:34 np0005539504 NetworkManager[55210]: <info>  [1764400474.9758] manager: (tap07a930ef-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.976 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.980 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:34 np0005539504 nova_compute[187152]: 2025-11-29 07:14:34.981 187156 INFO os_vif [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0')#033[00m
Nov 29 02:14:35 np0005539504 kernel: tap07a930ef-a0: entered promiscuous mode
Nov 29 02:14:35 np0005539504 NetworkManager[55210]: <info>  [1764400475.0622] manager: (tap07a930ef-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Nov 29 02:14:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:35Z|00350|binding|INFO|Claiming lport 07a930ef-a036-4ddf-aa57-c5d56f77847c for this chassis.
Nov 29 02:14:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:35Z|00351|binding|INFO|07a930ef-a036-4ddf-aa57-c5d56f77847c: Claiming fa:16:3e:c4:6c:2d 10.100.0.13
Nov 29 02:14:35 np0005539504 nova_compute[187152]: 2025-11-29 07:14:35.063 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.074 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:6c:2d 10.100.0.13'], port_security=['fa:16:3e:c4:6c:2d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c188a1f4-7511-4259-992e-c9127e6a414b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ede51bf8-0086-4a77-b4a9-badf8936b8c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.197'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aab533cd-f26a-47b5-9334-c93bf39572b9, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=07a930ef-a036-4ddf-aa57-c5d56f77847c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:35 np0005539504 nova_compute[187152]: 2025-11-29 07:14:35.075 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:35Z|00352|binding|INFO|Setting lport 07a930ef-a036-4ddf-aa57-c5d56f77847c ovn-installed in OVS
Nov 29 02:14:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:35Z|00353|binding|INFO|Setting lport 07a930ef-a036-4ddf-aa57-c5d56f77847c up in Southbound
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.077 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 07a930ef-a036-4ddf-aa57-c5d56f77847c in datapath c188a1f4-7511-4259-992e-c9127e6a414b bound to our chassis#033[00m
Nov 29 02:14:35 np0005539504 nova_compute[187152]: 2025-11-29 07:14:35.076 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.078 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c188a1f4-7511-4259-992e-c9127e6a414b#033[00m
Nov 29 02:14:35 np0005539504 nova_compute[187152]: 2025-11-29 07:14:35.082 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.090 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a13febdb-27ac-4759-8c92-b163f2895f3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.091 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc188a1f4-71 in ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.093 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc188a1f4-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.094 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f34a561b-4457-442a-ae91-9e5d3564d07a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.094 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4c2c5f-93a9-4878-99bb-2e63f6bbc1f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 systemd-udevd[230908]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.107 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[e6141901-79b0-4edc-8286-bc05a8dd2dae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 systemd-machined[153423]: New machine qemu-49-instance-0000005f.
Nov 29 02:14:35 np0005539504 NetworkManager[55210]: <info>  [1764400475.1158] device (tap07a930ef-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:14:35 np0005539504 NetworkManager[55210]: <info>  [1764400475.1167] device (tap07a930ef-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.121 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[efab0011-a2a5-4e59-b6e2-e17696b3b60f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 systemd[1]: Started Virtual Machine qemu-49-instance-0000005f.
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.153 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[04fcfd56-0947-46b1-b7c6-ddc4074cd143]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 systemd-udevd[230913]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:14:35 np0005539504 NetworkManager[55210]: <info>  [1764400475.1603] manager: (tapc188a1f4-70): new Veth device (/org/freedesktop/NetworkManager/Devices/168)
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.159 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ec54cc4a-c22f-4db1-bc5c-b0df3b463b80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.191 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[87f7000d-8bb3-4918-a2b4-0cbb3c809752]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.195 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[43523d77-d774-4592-95c5-e6ff84838b88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 NetworkManager[55210]: <info>  [1764400475.2158] device (tapc188a1f4-70): carrier: link connected
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.224 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e5cb296a-e477-4b3f-a91b-9c9556ea6d9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.240 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[abcda11d-6eae-4101-8580-177834df33b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc188a1f4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:0a:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593809, 'reachable_time': 35085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230941, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.254 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[84b4b7d6-0a91-4371-b5c2-399ea2f367b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:a87'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593809, 'tstamp': 593809}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230942, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.272 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[51e94c67-7eab-4870-afdf-b6e927ab2ce8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc188a1f4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:0a:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593809, 'reachable_time': 35085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230943, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.309 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ad2221-2ca2-441e-aef3-8de52c7c59cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 nova_compute[187152]: 2025-11-29 07:14:35.315 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.365 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[97c1b052-7479-4919-9152-5abd2a90bd5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.367 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc188a1f4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.367 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.367 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc188a1f4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:35 np0005539504 kernel: tapc188a1f4-70: entered promiscuous mode
Nov 29 02:14:35 np0005539504 NetworkManager[55210]: <info>  [1764400475.3703] manager: (tapc188a1f4-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Nov 29 02:14:35 np0005539504 nova_compute[187152]: 2025-11-29 07:14:35.369 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539504 nova_compute[187152]: 2025-11-29 07:14:35.371 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.376 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc188a1f4-70, col_values=(('external_ids', {'iface-id': 'a383047a-7ad7-4f43-a653-f18a79d8acb1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:35Z|00354|binding|INFO|Releasing lport a383047a-7ad7-4f43-a653-f18a79d8acb1 from this chassis (sb_readonly=0)
Nov 29 02:14:35 np0005539504 nova_compute[187152]: 2025-11-29 07:14:35.378 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.382 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c188a1f4-7511-4259-992e-c9127e6a414b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c188a1f4-7511-4259-992e-c9127e6a414b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.383 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f0bb5458-06e7-44af-8485-e35ee42625d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.384 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-c188a1f4-7511-4259-992e-c9127e6a414b
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/c188a1f4-7511-4259-992e-c9127e6a414b.pid.haproxy
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID c188a1f4-7511-4259-992e-c9127e6a414b
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:14:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:35.385 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'env', 'PROCESS_TAG=haproxy-c188a1f4-7511-4259-992e-c9127e6a414b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c188a1f4-7511-4259-992e-c9127e6a414b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:14:35 np0005539504 nova_compute[187152]: 2025-11-29 07:14:35.392 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539504 podman[230974]: 2025-11-29 07:14:35.826227696 +0000 UTC m=+0.117368517 container create bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:14:35 np0005539504 podman[230974]: 2025-11-29 07:14:35.732735141 +0000 UTC m=+0.023875952 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:14:35 np0005539504 systemd[1]: Started libpod-conmon-bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72.scope.
Nov 29 02:14:35 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:14:35 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/146419e0be4054d3227eae3bbdfbbac128bc6119b6bf66a23cb2a91514369e0c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:14:35 np0005539504 podman[230974]: 2025-11-29 07:14:35.943038678 +0000 UTC m=+0.234179489 container init bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:14:35 np0005539504 podman[230974]: 2025-11-29 07:14:35.94993265 +0000 UTC m=+0.241073441 container start bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:14:35 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230990]: [NOTICE]   (230994) : New worker (230996) forked
Nov 29 02:14:35 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230990]: [NOTICE]   (230994) : Loading success.
Nov 29 02:14:36 np0005539504 nova_compute[187152]: 2025-11-29 07:14:36.342 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400476.3412576, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:36 np0005539504 nova_compute[187152]: 2025-11-29 07:14:36.342 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Started (Lifecycle Event)#033[00m
Nov 29 02:14:36 np0005539504 nova_compute[187152]: 2025-11-29 07:14:36.371 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:36 np0005539504 nova_compute[187152]: 2025-11-29 07:14:36.375 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400476.3418782, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:36 np0005539504 nova_compute[187152]: 2025-11-29 07:14:36.375 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:14:36 np0005539504 nova_compute[187152]: 2025-11-29 07:14:36.399 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:36 np0005539504 nova_compute[187152]: 2025-11-29 07:14:36.404 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:14:36 np0005539504 nova_compute[187152]: 2025-11-29 07:14:36.431 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.640 187156 DEBUG nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.641 187156 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.641 187156 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.641 187156 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.641 187156 DEBUG nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Processing event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.642 187156 DEBUG nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.642 187156 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.642 187156 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.643 187156 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.643 187156 DEBUG nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.643 187156 WARNING nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.644 187156 DEBUG nova.compute.manager [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.648 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400477.6483567, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.649 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.653 187156 INFO nova.virt.libvirt.driver [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance running successfully.#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.654 187156 DEBUG nova.virt.libvirt.driver [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.698 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.702 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.731 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Nov 29 02:14:37 np0005539504 nova_compute[187152]: 2025-11-29 07:14:37.840 187156 INFO nova.compute.manager [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance to original state: 'active'#033[00m
Nov 29 02:14:38 np0005539504 nova_compute[187152]: 2025-11-29 07:14:38.180 187156 DEBUG nova.network.neutron [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updated VIF entry in instance network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:14:38 np0005539504 nova_compute[187152]: 2025-11-29 07:14:38.181 187156 DEBUG nova.network.neutron [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:38 np0005539504 nova_compute[187152]: 2025-11-29 07:14:38.206 187156 DEBUG oslo_concurrency.lockutils [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:39 np0005539504 podman[231013]: 2025-11-29 07:14:39.713473501 +0000 UTC m=+0.061701215 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:14:39 np0005539504 nova_compute[187152]: 2025-11-29 07:14:39.974 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:40 np0005539504 nova_compute[187152]: 2025-11-29 07:14:40.317 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.238 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.239 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.266 187156 DEBUG nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.390 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.391 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.413 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.414 187156 INFO nova.compute.claims [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.578 187156 DEBUG nova.compute.provider_tree [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.600 187156 DEBUG nova.scheduler.client.report [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.634 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.635 187156 DEBUG nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.707 187156 DEBUG nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.707 187156 DEBUG nova.network.neutron [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.730 187156 INFO nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.747 187156 DEBUG nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.927 187156 DEBUG nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.929 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.930 187156 INFO nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Creating image(s)#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.931 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "/var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.931 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.932 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.948 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:43 np0005539504 nova_compute[187152]: 2025-11-29 07:14:43.967 187156 DEBUG nova.policy [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.004 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.005 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.006 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.023 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.091 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.093 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.127 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.128 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.129 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.186 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.188 187156 DEBUG nova.virt.disk.api [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Checking if we can resize image /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.188 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.244 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.246 187156 DEBUG nova.virt.disk.api [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Cannot resize image /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.247 187156 DEBUG nova.objects.instance [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'migration_context' on Instance uuid 531c3d01-115b-479d-bbdc-11e38bc8b0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.266 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.267 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Ensure instance console log exists: /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.267 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.268 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:44 np0005539504 nova_compute[187152]: 2025-11-29 07:14:44.268 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:45 np0005539504 nova_compute[187152]: 2025-11-29 07:14:45.021 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:45 np0005539504 nova_compute[187152]: 2025-11-29 07:14:45.319 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:45 np0005539504 nova_compute[187152]: 2025-11-29 07:14:45.539 187156 DEBUG nova.network.neutron [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Successfully created port: ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:14:45 np0005539504 podman[231048]: 2025-11-29 07:14:45.728006664 +0000 UTC m=+0.059917488 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.971 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c231e63624d44fc19e0989abfb1afb22', 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'hostId': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.974 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'name': 'tempest-ServerActionsTestOtherB-server-734207825', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000060', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'hostId': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.974 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.974 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.974 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-409239588>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-734207825>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-409239588>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-734207825>]
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.978 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5818027f-a5b1-465a-a6e2-f0c8f0de8154 / tap07a930ef-a0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.978 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.982 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 084a0f8e-19b7-4b24-a503-c015b26addbc / tap60943dec-d4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.982 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64862bd2-cf5b-498a-8c47-e82d65d07976', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:47.975438', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16c7b924-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': 'f7319111d65c9b5a30ffb0fca09823fb4c460134474e32ce9556ed31f24cfec9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 
'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:47.975438', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16c83ff2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': 'c512731f668599522c00a429e8a29849bd063318cb873c48d60368ee3e4df584'}]}, 'timestamp': '2025-11-29 07:14:47.982937', '_unique_id': 'afa641663b114908aa11b26235cc71d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.985 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:47.987 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.016 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.017 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.047 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.read.bytes volume: 30763520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.047 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8deb7e0-936f-4bac-b09c-2b9a009eaade', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-vda', 'timestamp': '2025-11-29T07:14:47.987539', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16cd7936-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': '789f01af59f328bb1c2ab422071e9f2c44d49ab73d1a9b03822d390b8a4479de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 
'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-sda', 'timestamp': '2025-11-29T07:14:47.987539', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16cd8976-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': 'eaff040057036e913d23ed9433194b80ee3cf2968563aeeec83a66b92a0b7aa8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30763520, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-vda', 'timestamp': '2025-11-29T07:14:47.987539', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16d22dc8-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': '323ca1ea224a5534d676823102994fbcb1d5431552971f969310afe122a209f9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-sda', 'timestamp': '2025-11-29T07:14:47.987539', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16d238ea-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': '86b2229fdff2a81abdf6008e80a54a86b24e59e1afb141eeaf59c3771178009a'}]}, 'timestamp': '2025-11-29 07:14:48.048165', '_unique_id': 'edc9ae1fb6ea4997aa4b8fed39056576'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.070 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/cpu volume: 10040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.085 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/cpu volume: 12890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e006e91b-c2be-4791-bf2f-afa4801f864d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10040000000, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'timestamp': '2025-11-29T07:14:48.050314', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '16d5abe2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.004531552, 'message_signature': '3afeb992285fa45b4fedcdd3bf7fba243bcf2b53b74a8aa61c0e8babe364ad25'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12890000000, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 
'084a0f8e-19b7-4b24-a503-c015b26addbc', 'timestamp': '2025-11-29T07:14:48.050314', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '16d8117a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.020379352, 'message_signature': '64123c1c4d6e7042b5d269b9cfd1dc10f3d68cb3c7f9293f9a7f6ec3a964738f'}]}, 'timestamp': '2025-11-29 07:14:48.086609', '_unique_id': '7a28a32be80b4f21bde4d3a42e160509'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.089 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.089 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.incoming.packets volume: 25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75258550-e0f1-4cab-9bb5-364dbd60d7d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:48.089521', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16d896b8-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': '7e3f91f27849613a6a5cccd90afa642cecc94efe4ef95f007a1965b71089d41c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 25, 
'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:48.089521', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16d8a590-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': '173c9187baecf3a1277028eb4b6a34de0174d4c9a4880143bb248ed45165e95a'}]}, 'timestamp': '2025-11-29 07:14:48.090300', '_unique_id': '788c817a02b44545ae37708d5e6ad43d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.092 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.092 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b186a5f6-2083-41ad-a004-3f6888bd8301', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:48.092577', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16d90c10-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': '794db8605ef9a91250fc955f384acbb21938eab3f27a4a3b3ba15b622acb6ca6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:48.092577', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16d916ce-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': '6d22077a7c039173a01e95da9f02a436c3c64527504e8b9a4929272e1a381637'}]}, 'timestamp': '2025-11-29 07:14:48.093161', '_unique_id': 'e3e9fca76ddb4bd19fcddf5ffc3fda6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.094 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.095 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/memory.usage volume: 42.34765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '055d530e-4e9b-4dcc-a0d6-d12a318fadd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'timestamp': '2025-11-29T07:14:48.094958', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '16d96840-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.004531552, 'message_signature': '8f935ba9adb4e46f6e81fa57b4b400e20f7152f1ba9c7c3669bf723bbfa89294'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.34765625, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 
'084a0f8e-19b7-4b24-a503-c015b26addbc', 'timestamp': '2025-11-29T07:14:48.094958', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '16d972c2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.020379352, 'message_signature': '8c8d39917dcac6fbf0b16bfbe9787da078088bb8bf33591aa50b99001180a8ee'}]}, 'timestamp': '2025-11-29 07:14:48.095542', '_unique_id': '7d11a04c27d84c1a9a0a506341660b92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.097 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.097 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-409239588>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-734207825>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-409239588>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-734207825>]
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.097 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.098 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.098 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.write.latency volume: 186251665786 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.098 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c326e357-90d2-4784-8bac-14f7e56bb484', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-vda', 'timestamp': '2025-11-29T07:14:48.097967', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16d9dfa0-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': '87cba1b21991a2f55ca8577e53cc4da4d89dd718ef95ef1d46fdae484913b554'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-sda', 'timestamp': '2025-11-29T07:14:48.097967', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16d9edf6-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': '58d6a55fc8cb9055fa750be25aa4394295adc8c0e21600bf6a8cf5b557c25de4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 186251665786, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-vda', 'timestamp': '2025-11-29T07:14:48.097967', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16d9f9e0-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': '6cd3b8545478f2c7c004eaa2ab15dafa19cd8ea6e504cb763ea42ad983a8c148'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-sda', 'timestamp': '2025-11-29T07:14:48.097967', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16da0368-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': 'd5555e0d0bcd66a35eecdfaaa43ae94e33b850cadbd6a77bfb471c9ac269a229'}]}, 'timestamp': '2025-11-29 07:14:48.099210', '_unique_id': 'e535001c25894a5e84afa5b6e3b89b99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.101 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.101 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66f51c4e-fe04-4e01-9cc1-b8d9288d5e6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:48.101225', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16da5e3a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': 'c32ae156c92a4046b48605a5f23c30959bcfa7716f8af19aed0c7ceb5fa8e77d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:48.101225', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16da68bc-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': '68b91bb1186f44f3c8db1db21d2179575ba0996f8d4b3df020abeb6b08a53394'}]}, 'timestamp': '2025-11-29 07:14:48.101808', '_unique_id': 'fa64c77eb2ce4e6f82a3f737593b9bb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.103 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.103 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.incoming.bytes volume: 4121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31493b7a-13e6-4ac7-8b11-8b2026321c77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:48.103558', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16dab7d6-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': '0f76cdb533bd4cbcb798d9b7e94c4c5e4325ce82082d4924146a73c85b1004a4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4121, 'user_id': 
'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:48.103558', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16dac7e4-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': '2424d70941286d2dd4a55a559fc37bcb322d242603dc2d87df03dbc5410c6cc3'}]}, 'timestamp': '2025-11-29 07:14:48.104247', '_unique_id': 'd96706de120f42ce8bbb1ba40afd9cec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.105 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.106 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd107a5c-bb9f-436b-a138-678e1f8b3113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:48.105889', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16db14a6-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': '88589282ea7416c0696876d5c12b9866d83deb67432d7f20db2786c9c8e7e392'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:48.105889', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16db2004-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': '4179feaddd888c8675eebbdbab029183b60a700e7349e70366bf9069821e9854'}]}, 'timestamp': '2025-11-29 07:14:48.106503', '_unique_id': '31b09c7fe08342089a798f84bb33e8d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.108 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.108 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c779ad9-fcb6-452c-a5d1-e0365778a965', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:48.108106', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16db6924-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': '78886e1fb1e8ff634773284baca9e3b60a601d5e4ae659fbea7dc2b964ac04b3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:48.108106', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16db74aa-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': '745374490ccd13f45a1c0582ef5083c9a97beba2e3604ee5b5a1a2da623f713d'}]}, 'timestamp': '2025-11-29 07:14:48.108671', '_unique_id': '8de073d7de7c4c419db8c2ff43701465'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.110 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.110 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-409239588>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-734207825>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-409239588>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-734207825>]
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.111 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.111 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5008e04a-59df-414e-a532-d7d200b4d0aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:48.111163', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16dbe11a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': 'd4cf3b2c352e7bcc6758c64fb1db1d08c57ac2e4f6b8a25bd7d6f8b02e793386'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:48.111163', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16dbecbe-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': 'c434c1edf9e48108fe214a2b680ef53cdb13750b7907c1a5cd62c17c927b93fb'}]}, 'timestamp': '2025-11-29 07:14:48.111743', '_unique_id': 'b0cf11cb94ef450a9db320f4eb6a6201'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.113 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.113 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24f22397-4232-4b33-8d96-77069cadcc25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:48.113504', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16dc3c50-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': '60772d60f3d450ce5b239d5007ad13bdede9dde7259bcc13651ab5182cc7168c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:48.113504', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16dc465a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': '5d2613e08906e6fecfe1892ea612c3acd60e4899ef6b4266aa36bc98702f3582'}]}, 'timestamp': '2025-11-29 07:14:48.114037', '_unique_id': '32c7738592ae4b66a82c0040841e580a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.115 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.read.latency volume: 141411554 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.116 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.read.latency volume: 349009 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.116 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.read.latency volume: 260521458 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.116 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.read.latency volume: 119075687 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb5b7f27-0ec2-4c81-b4f6-7617e43650bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 141411554, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-vda', 'timestamp': '2025-11-29T07:14:48.115889', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16dc99a2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': 'b6c675228aa3226f65cdb6f816e4371e1cb696d0f002b88be6a59794b4e5dcd1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 349009, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 
'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-sda', 'timestamp': '2025-11-29T07:14:48.115889', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16dca370-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': '5dc29af06e76e6920398e62fc29abdd648fac981bcdb05986d6423477113ed31'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260521458, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-vda', 'timestamp': '2025-11-29T07:14:48.115889', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16dcada2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': '3f855fa22faa2aa01a540fe549274416c9abec9de29164b5b8d75702bda1c86b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 119075687, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-sda', 'timestamp': '2025-11-29T07:14:48.115889', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16dcb69e-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': '15032364fe2af1c0e959a8389b90844055b577729b1f35594f035edd1beb6a3f'}]}, 'timestamp': '2025-11-29 07:14:48.116900', '_unique_id': '8be7af97154846e4b9b2c310dc4d2c6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.118 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.119 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.119 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.write.requests volume: 328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.119 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0694606d-34ba-41db-9150-36db7799df36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-vda', 'timestamp': '2025-11-29T07:14:48.118761', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16dd0a90-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': '4cff806d9d19dc1ab9ef69baee1bcc235802e4cb5789a60e987e3ccbc14bf0de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 
'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-sda', 'timestamp': '2025-11-29T07:14:48.118761', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16dd144a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': 'fb12c50523bf0f9c65dadb152536e22c1b3aea369b5331c3a058f3e4f3355394'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 328, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-vda', 'timestamp': '2025-11-29T07:14:48.118761', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16dd1e7c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': 'c17e9ed8b20ababd0e30e2c8160517dd388f7d3f078aec4513a96f6c53950a22'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-sda', 'timestamp': '2025-11-29T07:14:48.118761', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16dd27be-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': '5507330e6c75b54d97407bbac27fccb2371c977917134bfca664b50373a2413d'}]}, 'timestamp': '2025-11-29 07:14:48.119820', '_unique_id': '274c62ccafa64bb7bd8be25f3c7c7254'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.121 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.121 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '474452e5-1c6a-40d8-ac9e-51ba7608c786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000005f-5818027f-a5b1-465a-a6e2-f0c8f0de8154-tap07a930ef-a0', 'timestamp': '2025-11-29T07:14:48.121677', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'tap07a930ef-a0', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:6c:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap07a930ef-a0'}, 'message_id': '16dd7b7e-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.910122154, 'message_signature': '4aa3c1f9795174e5080e9c157de5993c328d948dc3646f57889bd0de50a0a98f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000060-084a0f8e-19b7-4b24-a503-c015b26addbc-tap60943dec-d4', 'timestamp': '2025-11-29T07:14:48.121677', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'tap60943dec-d4', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:06:9e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60943dec-d4'}, 'message_id': '16dd868c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.914297314, 'message_signature': '7b246a641a7b4ef58aa39d8a75a78d8bbbcf1a6133c7350ca536c35c9de700e2'}]}, 'timestamp': '2025-11-29 07:14:48.122235', '_unique_id': '8cdf71b96d784c37837917c1cd9fd63f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.123 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.124 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.124 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.write.bytes volume: 73072640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.124 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f14a94ed-cd7e-478c-9d6d-dde5dc50f71b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-vda', 'timestamp': '2025-11-29T07:14:48.123829', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16ddd038-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': '1804caef2c6d52d1ed932552576687f2dd182303574eb3fae8024ab65aa76d3c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 
'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-sda', 'timestamp': '2025-11-29T07:14:48.123829', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16ddd9de-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': '4ff5ff71d918210437de3519e0510fccd5a138490f19309978e2000980db6bdf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73072640, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-vda', 'timestamp': '2025-11-29T07:14:48.123829', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16dde3fc-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': '7537ad4ab336bc6db6cbe9f0421acdf2c8833b845527bae7689de3df6d8d767c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-sda', 'timestamp': '2025-11-29T07:14:48.123829', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16dded52-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': '35f72d9cc57a1641d94009a797c4f2f7761f7abfe23712470965d4b646b17b03'}]}, 'timestamp': '2025-11-29 07:14:48.124886', '_unique_id': '261eff88507b48d8b1a673f3a979780f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.138 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.139 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.149 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.150 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd407bdb2-694e-483f-9cd5-f7c5041ceffd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-vda', 'timestamp': '2025-11-29T07:14:48.126716', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e01f3c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.061408398, 'message_signature': '80b3059b89ceccfd8cfcdf3963fbe1edbd7140c00df68a7d31e2df950910a846'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 
'5818027f-a5b1-465a-a6e2-f0c8f0de8154-sda', 'timestamp': '2025-11-29T07:14:48.126716', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e02d56-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.061408398, 'message_signature': '09cd6b839d89b6d21cbaa96645bdf40ecc3dedae8d22e48e4ed0a0a5e1a03239'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-vda', 'timestamp': '2025-11-29T07:14:48.126716', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e1ce86-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.074290949, 'message_signature': '88cc4f013730642ee06ab1d6c7395b7f181530078daa7eec4a4d9604870443fd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-sda', 'timestamp': '2025-11-29T07:14:48.126716', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e1df66-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.074290949, 'message_signature': '1f8eb1c9f5d8f780651a3bcef22a259424bb0a4d96a44d67e9868b77cb3d1abc'}]}, 'timestamp': '2025-11-29 07:14:48.150761', '_unique_id': '891534785aa54ed0906302bfa2c47134'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.153 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.153 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.153 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.154 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30ead471-0820-4cff-8476-1827d394d109', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-vda', 'timestamp': '2025-11-29T07:14:48.153160', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e249ce-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.061408398, 'message_signature': 'bacaae80569608e97f3b1b31c69e42ae79d5e1c63860606727b00f08138c1ebb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-sda', 'timestamp': '2025-11-29T07:14:48.153160', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e2550e-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.061408398, 'message_signature': 'dd670b3c63746329ceaf3ceef09e45aecbd33f0c296cc52494a1ddf78a601ff2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-vda', 'timestamp': '2025-11-29T07:14:48.153160', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e26012-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.074290949, 'message_signature': 'dce08e23432183f735b6a94a82e069863be067afcf90c1020c2049d929081186'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-sda', 'timestamp': '2025-11-29T07:14:48.153160', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e26b16-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.074290949, 'message_signature': '483140313a2e53663ed49956be16603fd915aa60238a50396c83af66bd681e8a'}]}, 'timestamp': '2025-11-29 07:14:48.154295', '_unique_id': '48802cac0de0483db8ffe96370ac06cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.156 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.156 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-409239588>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-734207825>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-409239588>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-734207825>]
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.157 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.157 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.157 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.158 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '993d9178-df2b-4ba0-ae50-22461f42e351', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-vda', 'timestamp': '2025-11-29T07:14:48.157084', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e2e438-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.061408398, 'message_signature': 'f65332c376dc0be4ff246af1645f71c2b77f8d5155283b4c3e01d1bdeadb889e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-sda', 'timestamp': '2025-11-29T07:14:48.157084', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e2f20c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.061408398, 'message_signature': 'bde0c07e5a3728ec6df2cb3a85fb994c59432064fd0d8bafbad16a90b38b20d6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-vda', 'timestamp': '2025-11-29T07:14:48.157084', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e2fe14-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.074290949, 'message_signature': '547c4185f0935ffee5f4ff48e13a94af1e01b4300d93d42e4c445bb279a443d7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-sda', 'timestamp': '2025-11-29T07:14:48.157084', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e30ac6-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5951.074290949, 'message_signature': '37636d119d240855e55500a27ed0180a6111878af899e41703c213e57868b312'}]}, 'timestamp': '2025-11-29 07:14:48.158448', '_unique_id': '65cb5a16c80d4aa8acb740c5bf72227f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.160 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.161 12 DEBUG ceilometer.compute.pollsters [-] 5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.161 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.read.requests volume: 1112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.161 12 DEBUG ceilometer.compute.pollsters [-] 084a0f8e-19b7-4b24-a503-c015b26addbc/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65e8220d-27b1-4eb7-900d-b5ed1273c359', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-vda', 'timestamp': '2025-11-29T07:14:48.160743', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e37268-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': '9a3f506ecd4912f40d9ca2092163498acc13484a1f28080f0db718ec288ec5aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 
'project_name': None, 'resource_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154-sda', 'timestamp': '2025-11-29T07:14:48.160743', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-409239588', 'name': 'instance-0000005f', 'instance_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e37e48-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.922266414, 'message_signature': '9c00ad874afc9100d66acacd376842e025bc3f52e51fcb59d1a1333f7e6e7766'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1112, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-vda', 'timestamp': '2025-11-29T07:14:48.160743', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e38a3c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': '817a7b93ba23e8f69c9dcfead9d48cfb9c4660463464f75992dfe5b849f3e926'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '084a0f8e-19b7-4b24-a503-c015b26addbc-sda', 'timestamp': '2025-11-29T07:14:48.160743', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-734207825', 'name': 'instance-00000060', 'instance_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e395d6-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 5950.952209168, 'message_signature': 'c3832485158a95f1d4685bd0aede82b22471f4a5410b6216c4995c42985f22c0'}]}, 'timestamp': '2025-11-29 07:14:48.161972', '_unique_id': '54301dd02d7f4db4a0b12c3bacba8c0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:14:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539504 nova_compute[187152]: 2025-11-29 07:14:48.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:49 np0005539504 nova_compute[187152]: 2025-11-29 07:14:49.310 187156 DEBUG nova.network.neutron [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Successfully updated port: ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:14:49 np0005539504 nova_compute[187152]: 2025-11-29 07:14:49.342 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:49 np0005539504 nova_compute[187152]: 2025-11-29 07:14:49.343 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:49 np0005539504 nova_compute[187152]: 2025-11-29 07:14:49.343 187156 DEBUG nova.network.neutron [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:14:49 np0005539504 nova_compute[187152]: 2025-11-29 07:14:49.490 187156 DEBUG nova.compute.manager [req-4009caeb-aa9d-4d3b-828e-87e46f8377f6 req-b8adc036-5d6b-41e4-bd54-bacb1a98573b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Received event network-changed-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:49 np0005539504 nova_compute[187152]: 2025-11-29 07:14:49.491 187156 DEBUG nova.compute.manager [req-4009caeb-aa9d-4d3b-828e-87e46f8377f6 req-b8adc036-5d6b-41e4-bd54-bacb1a98573b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Refreshing instance network info cache due to event network-changed-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:14:49 np0005539504 nova_compute[187152]: 2025-11-29 07:14:49.492 187156 DEBUG oslo_concurrency.lockutils [req-4009caeb-aa9d-4d3b-828e-87e46f8377f6 req-b8adc036-5d6b-41e4-bd54-bacb1a98573b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:49 np0005539504 nova_compute[187152]: 2025-11-29 07:14:49.631 187156 DEBUG nova.network.neutron [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.024 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.320 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.750 187156 DEBUG nova.network.neutron [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updating instance_info_cache with network_info: [{"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.910 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.911 187156 DEBUG nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Instance network_info: |[{"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.911 187156 DEBUG oslo_concurrency.lockutils [req-4009caeb-aa9d-4d3b-828e-87e46f8377f6 req-b8adc036-5d6b-41e4-bd54-bacb1a98573b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.912 187156 DEBUG nova.network.neutron [req-4009caeb-aa9d-4d3b-828e-87e46f8377f6 req-b8adc036-5d6b-41e4-bd54-bacb1a98573b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Refreshing network info cache for port ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.914 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Start _get_guest_xml network_info=[{"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.919 187156 WARNING nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.924 187156 DEBUG nova.virt.libvirt.host [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.925 187156 DEBUG nova.virt.libvirt.host [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.928 187156 DEBUG nova.virt.libvirt.host [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.929 187156 DEBUG nova.virt.libvirt.host [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.930 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.930 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.931 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.931 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.931 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.931 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.931 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.932 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.932 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.932 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.932 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.932 187156 DEBUG nova.virt.hardware [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.937 187156 DEBUG nova.virt.libvirt.vif [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:14:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1942369880',display_name='tempest-ServerActionsTestOtherB-server-1942369880',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1942369880',id=99,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-1fd30l7s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:14:43Z,user_data=None,user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=531c3d01-115b-479d-bbdc-11e38bc8b0b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.937 187156 DEBUG nova.network.os_vif_util [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.938 187156 DEBUG nova.network.os_vif_util [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:f8:4b,bridge_name='br-int',has_traffic_filtering=True,id=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceb086b6-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:50 np0005539504 nova_compute[187152]: 2025-11-29 07:14:50.939 187156 DEBUG nova.objects.instance [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'pci_devices' on Instance uuid 531c3d01-115b-479d-bbdc-11e38bc8b0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:51Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c4:6c:2d 10.100.0.13
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.469 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <uuid>531c3d01-115b-479d-bbdc-11e38bc8b0b1</uuid>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <name>instance-00000063</name>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestOtherB-server-1942369880</nova:name>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:14:50</nova:creationTime>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:        <nova:user uuid="ee2d4931cb504b13b92a2f52c95c05ce">tempest-ServerActionsTestOtherB-1538648925-project-member</nova:user>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:        <nova:project uuid="32e51e3a9a8f4a1ca6e022735ebf5f7b">tempest-ServerActionsTestOtherB-1538648925</nova:project>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:        <nova:port uuid="ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <entry name="serial">531c3d01-115b-479d-bbdc-11e38bc8b0b1</entry>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <entry name="uuid">531c3d01-115b-479d-bbdc-11e38bc8b0b1</entry>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.config"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:19:f8:4b"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <target dev="tapceb086b6-5a"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/console.log" append="off"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:14:51 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:14:51 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:14:51 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:14:51 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.470 187156 DEBUG nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Preparing to wait for external event network-vif-plugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.470 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.470 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.471 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.472 187156 DEBUG nova.virt.libvirt.vif [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:14:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1942369880',display_name='tempest-ServerActionsTestOtherB-server-1942369880',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1942369880',id=99,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-1fd30l7s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-S
erverActionsTestOtherB-1538648925-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:14:43Z,user_data=None,user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=531c3d01-115b-479d-bbdc-11e38bc8b0b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.472 187156 DEBUG nova.network.os_vif_util [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.473 187156 DEBUG nova.network.os_vif_util [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:f8:4b,bridge_name='br-int',has_traffic_filtering=True,id=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceb086b6-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.474 187156 DEBUG os_vif [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:f8:4b,bridge_name='br-int',has_traffic_filtering=True,id=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceb086b6-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.474 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.475 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.476 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.478 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.479 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapceb086b6-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.479 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapceb086b6-5a, col_values=(('external_ids', {'iface-id': 'ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:f8:4b', 'vm-uuid': '531c3d01-115b-479d-bbdc-11e38bc8b0b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.506 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:51 np0005539504 NetworkManager[55210]: <info>  [1764400491.5079] manager: (tapceb086b6-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.509 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.514 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.515 187156 INFO os_vif [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:f8:4b,bridge_name='br-int',has_traffic_filtering=True,id=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceb086b6-5a')#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.717 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.717 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.717 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No VIF found with MAC fa:16:3e:19:f8:4b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:14:51 np0005539504 nova_compute[187152]: 2025-11-29 07:14:51.718 187156 INFO nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Using config drive#033[00m
Nov 29 02:14:54 np0005539504 nova_compute[187152]: 2025-11-29 07:14:54.894 187156 INFO nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Creating config drive at /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.config#033[00m
Nov 29 02:14:54 np0005539504 nova_compute[187152]: 2025-11-29 07:14:54.900 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp86rfjnjb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.026 187156 DEBUG oslo_concurrency.processutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp86rfjnjb" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:55 np0005539504 kernel: tapceb086b6-5a: entered promiscuous mode
Nov 29 02:14:55 np0005539504 NetworkManager[55210]: <info>  [1764400495.0888] manager: (tapceb086b6-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/171)
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.112 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:55Z|00355|binding|INFO|Claiming lport ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 for this chassis.
Nov 29 02:14:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:55Z|00356|binding|INFO|ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3: Claiming fa:16:3e:19:f8:4b 10.100.0.3
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.126 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:f8:4b 10.100.0.3'], port_security=['fa:16:3e:19:f8:4b 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8547e4c2-e200-4173-9eba-476619f06150', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:55 np0005539504 systemd-udevd[231092]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.128 104164 INFO neutron.agent.ovn.metadata.agent [-] Port ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 bound to our chassis#033[00m
Nov 29 02:14:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:55Z|00357|binding|INFO|Setting lport ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 ovn-installed in OVS
Nov 29 02:14:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:14:55Z|00358|binding|INFO|Setting lport ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 up in Southbound
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.130 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df7cfc35-3f76-45b2-b70c-e4525d38f410#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.131 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.145 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b741fd-2d23-4f64-9d0d-9ed234b46240]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:55 np0005539504 NetworkManager[55210]: <info>  [1764400495.1472] device (tapceb086b6-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:14:55 np0005539504 NetworkManager[55210]: <info>  [1764400495.1480] device (tapceb086b6-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:14:55 np0005539504 systemd-machined[153423]: New machine qemu-50-instance-00000063.
Nov 29 02:14:55 np0005539504 systemd[1]: Started Virtual Machine qemu-50-instance-00000063.
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.177 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[d8930f21-1292-4675-aa63-2889e1fe3b96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.183 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e03643bc-e7f6-4efc-8917-532fa9574024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.217 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[740dadfd-8ca5-4bd2-9b74-6e96bb725545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.238 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0d791ca2-b26f-4fe8-ac8d-182bdea7ffff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583826, 'reachable_time': 28400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231107, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.261 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f6789364-a881-438f-9b37-715d2e8b08f6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdf7cfc35-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583837, 'tstamp': 583837}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231109, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdf7cfc35-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583840, 'tstamp': 583840}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231109, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.264 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.265 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.267 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.268 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf7cfc35-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.268 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.269 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf7cfc35-30, col_values=(('external_ids', {'iface-id': 'cab31803-36dd-4107-bb9e-3d36862142c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.269 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.323 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.442 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400495.4414406, 531c3d01-115b-479d-bbdc-11e38bc8b0b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.443 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] VM Started (Lifecycle Event)#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.465 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.470 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400495.44163, 531c3d01-115b-479d-bbdc-11e38bc8b0b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.470 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.497 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.502 187156 DEBUG nova.compute.manager [req-c9c2dd17-21e0-4e68-8dd1-7d377063a57f req-cf2626c7-34bb-4a60-a444-1c57f86ba243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Received event network-vif-plugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.503 187156 DEBUG oslo_concurrency.lockutils [req-c9c2dd17-21e0-4e68-8dd1-7d377063a57f req-cf2626c7-34bb-4a60-a444-1c57f86ba243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.503 187156 DEBUG oslo_concurrency.lockutils [req-c9c2dd17-21e0-4e68-8dd1-7d377063a57f req-cf2626c7-34bb-4a60-a444-1c57f86ba243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.503 187156 DEBUG oslo_concurrency.lockutils [req-c9c2dd17-21e0-4e68-8dd1-7d377063a57f req-cf2626c7-34bb-4a60-a444-1c57f86ba243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.504 187156 DEBUG nova.compute.manager [req-c9c2dd17-21e0-4e68-8dd1-7d377063a57f req-cf2626c7-34bb-4a60-a444-1c57f86ba243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Processing event network-vif-plugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.504 187156 DEBUG nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.505 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.508 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.512 187156 INFO nova.virt.libvirt.driver [-] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Instance spawned successfully.#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.513 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.534 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.535 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400495.5082967, 531c3d01-115b-479d-bbdc-11e38bc8b0b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.535 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.547 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.548 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.549 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.549 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.551 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.551 187156 DEBUG nova.virt.libvirt.driver [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.556 187156 DEBUG nova.network.neutron [req-4009caeb-aa9d-4d3b-828e-87e46f8377f6 req-b8adc036-5d6b-41e4-bd54-bacb1a98573b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updated VIF entry in instance network info cache for port ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.556 187156 DEBUG nova.network.neutron [req-4009caeb-aa9d-4d3b-828e-87e46f8377f6 req-b8adc036-5d6b-41e4-bd54-bacb1a98573b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updating instance_info_cache with network_info: [{"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.561 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.565 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.604 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.618 187156 DEBUG oslo_concurrency.lockutils [req-4009caeb-aa9d-4d3b-828e-87e46f8377f6 req-b8adc036-5d6b-41e4-bd54-bacb1a98573b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.632 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:14:55.633 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.633 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.681 187156 INFO nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Took 11.75 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:14:55 np0005539504 nova_compute[187152]: 2025-11-29 07:14:55.682 187156 DEBUG nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:56 np0005539504 nova_compute[187152]: 2025-11-29 07:14:56.593 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.701 187156 INFO nova.compute.manager [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Took 14.36 seconds to build instance.#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.716 187156 INFO nova.compute.manager [None req-9f4aabbd-bc29-432b-a873-b20414b00e9c bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Get console output#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.725 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.734 187156 DEBUG oslo_concurrency.lockutils [None req-735ce30e-851a-44de-acd0-21fb427327cf ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.494s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.806 187156 DEBUG nova.compute.manager [req-33db547c-6869-461a-923d-802264b37036 req-d87e7c08-3df3-4933-8760-7cb28e17f5ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Received event network-vif-plugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.807 187156 DEBUG oslo_concurrency.lockutils [req-33db547c-6869-461a-923d-802264b37036 req-d87e7c08-3df3-4933-8760-7cb28e17f5ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.807 187156 DEBUG oslo_concurrency.lockutils [req-33db547c-6869-461a-923d-802264b37036 req-d87e7c08-3df3-4933-8760-7cb28e17f5ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.808 187156 DEBUG oslo_concurrency.lockutils [req-33db547c-6869-461a-923d-802264b37036 req-d87e7c08-3df3-4933-8760-7cb28e17f5ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.808 187156 DEBUG nova.compute.manager [req-33db547c-6869-461a-923d-802264b37036 req-d87e7c08-3df3-4933-8760-7cb28e17f5ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] No waiting events found dispatching network-vif-plugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.808 187156 WARNING nova.compute.manager [req-33db547c-6869-461a-923d-802264b37036 req-d87e7c08-3df3-4933-8760-7cb28e17f5ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Received unexpected event network-vif-plugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:14:57 np0005539504 nova_compute[187152]: 2025-11-29 07:14:57.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:58 np0005539504 nova_compute[187152]: 2025-11-29 07:14:58.214 187156 INFO nova.compute.manager [None req-f0c31a58-33ac-408d-8bcd-495918eb953d ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Get console output#033[00m
Nov 29 02:14:58 np0005539504 podman[231118]: 2025-11-29 07:14:58.72312196 +0000 UTC m=+0.060716568 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:14:58 np0005539504 podman[231120]: 2025-11-29 07:14:58.72312145 +0000 UTC m=+0.055521101 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:14:58 np0005539504 podman[231119]: 2025-11-29 07:14:58.735356014 +0000 UTC m=+0.069034359 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:14:58 np0005539504 nova_compute[187152]: 2025-11-29 07:14:58.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:59 np0005539504 nova_compute[187152]: 2025-11-29 07:14:59.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:00 np0005539504 nova_compute[187152]: 2025-11-29 07:15:00.326 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:00 np0005539504 nova_compute[187152]: 2025-11-29 07:15:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:00 np0005539504 nova_compute[187152]: 2025-11-29 07:15:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:00 np0005539504 nova_compute[187152]: 2025-11-29 07:15:00.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.428 187156 DEBUG oslo_concurrency.lockutils [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.429 187156 DEBUG oslo_concurrency.lockutils [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.429 187156 DEBUG oslo_concurrency.lockutils [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.429 187156 DEBUG oslo_concurrency.lockutils [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.430 187156 DEBUG oslo_concurrency.lockutils [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.462 187156 INFO nova.compute.manager [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Terminating instance#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.492 187156 DEBUG nova.compute.manager [req-2cd1865b-39d2-4bb7-9a10-3bcd8694829a req-c2b1be6b-32af-476a-b19b-88bc6db5c9b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.494 187156 DEBUG nova.compute.manager [req-2cd1865b-39d2-4bb7-9a10-3bcd8694829a req-c2b1be6b-32af-476a-b19b-88bc6db5c9b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing instance network info cache due to event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.495 187156 DEBUG oslo_concurrency.lockutils [req-2cd1865b-39d2-4bb7-9a10-3bcd8694829a req-c2b1be6b-32af-476a-b19b-88bc6db5c9b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.495 187156 DEBUG oslo_concurrency.lockutils [req-2cd1865b-39d2-4bb7-9a10-3bcd8694829a req-c2b1be6b-32af-476a-b19b-88bc6db5c9b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.496 187156 DEBUG nova.network.neutron [req-2cd1865b-39d2-4bb7-9a10-3bcd8694829a req-c2b1be6b-32af-476a-b19b-88bc6db5c9b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.525 187156 DEBUG nova.compute.manager [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:15:01 np0005539504 kernel: tap07a930ef-a0 (unregistering): left promiscuous mode
Nov 29 02:15:01 np0005539504 NetworkManager[55210]: <info>  [1764400501.5586] device (tap07a930ef-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.563 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:01 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:01Z|00359|binding|INFO|Releasing lport 07a930ef-a036-4ddf-aa57-c5d56f77847c from this chassis (sb_readonly=0)
Nov 29 02:15:01 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:01Z|00360|binding|INFO|Setting lport 07a930ef-a036-4ddf-aa57-c5d56f77847c down in Southbound
Nov 29 02:15:01 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:01Z|00361|binding|INFO|Removing iface tap07a930ef-a0 ovn-installed in OVS
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.574 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.595 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:01.592 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:6c:2d 10.100.0.13'], port_security=['fa:16:3e:c4:6c:2d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c188a1f4-7511-4259-992e-c9127e6a414b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'ede51bf8-0086-4a77-b4a9-badf8936b8c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aab533cd-f26a-47b5-9334-c93bf39572b9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=07a930ef-a036-4ddf-aa57-c5d56f77847c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:15:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:01.594 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 07a930ef-a036-4ddf-aa57-c5d56f77847c in datapath c188a1f4-7511-4259-992e-c9127e6a414b unbound from our chassis#033[00m
Nov 29 02:15:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:01.596 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c188a1f4-7511-4259-992e-c9127e6a414b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:15:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:01.597 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[47ed0ab4-5a83-4263-b7c3-fe31bf6a85fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:01.598 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b namespace which is not needed anymore#033[00m
Nov 29 02:15:01 np0005539504 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Nov 29 02:15:01 np0005539504 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005f.scope: Consumed 14.179s CPU time.
Nov 29 02:15:01 np0005539504 systemd-machined[153423]: Machine qemu-49-instance-0000005f terminated.
Nov 29 02:15:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230990]: [NOTICE]   (230994) : haproxy version is 2.8.14-c23fe91
Nov 29 02:15:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230990]: [NOTICE]   (230994) : path to executable is /usr/sbin/haproxy
Nov 29 02:15:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230990]: [WARNING]  (230994) : Exiting Master process...
Nov 29 02:15:01 np0005539504 systemd[1]: libpod-bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72.scope: Deactivated successfully.
Nov 29 02:15:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230990]: [ALERT]    (230994) : Current worker (230996) exited with code 143 (Terminated)
Nov 29 02:15:01 np0005539504 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[230990]: [WARNING]  (230994) : All workers exited. Exiting... (0)
Nov 29 02:15:01 np0005539504 podman[231206]: 2025-11-29 07:15:01.779489971 +0000 UTC m=+0.081293023 container died bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.806 187156 INFO nova.virt.libvirt.driver [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance destroyed successfully.#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.807 187156 DEBUG nova.objects.instance [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.833 187156 DEBUG nova.virt.libvirt.vif [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-409239588',display_name='tempest-TestNetworkAdvancedServerOps-server-409239588',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-409239588',id=95,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGXK6HijxfcV9+fEMlQT2yR3VoX9Sz77Qk57Xkpwoye1FFlDLU8fY8cJvr+Q2fRauh1dlNIWCagiMxv7znT2NcZAvXyo+qqZudIr0NVBck3Lt9NyetTtYoJBqcrR4BWObg==',key_name='tempest-TestNetworkAdvancedServerOps-1900401721',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-6l59ck53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:38Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=5818027f-a5b1-465a-a6e2-f0c8f0de8154,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.834 187156 DEBUG nova.network.os_vif_util [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.835 187156 DEBUG nova.network.os_vif_util [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.835 187156 DEBUG os_vif [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.837 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.838 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07a930ef-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.840 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.842 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:01 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72-userdata-shm.mount: Deactivated successfully.
Nov 29 02:15:01 np0005539504 systemd[1]: var-lib-containers-storage-overlay-146419e0be4054d3227eae3bbdfbbac128bc6119b6bf66a23cb2a91514369e0c-merged.mount: Deactivated successfully.
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.849 187156 INFO os_vif [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0')#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.850 187156 INFO nova.virt.libvirt.driver [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Deleting instance files /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_del#033[00m
Nov 29 02:15:01 np0005539504 nova_compute[187152]: 2025-11-29 07:15:01.856 187156 INFO nova.virt.libvirt.driver [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Deletion of /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_del complete#033[00m
Nov 29 02:15:01 np0005539504 podman[231206]: 2025-11-29 07:15:01.904059888 +0000 UTC m=+0.205862930 container cleanup bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:15:01 np0005539504 systemd[1]: libpod-conmon-bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72.scope: Deactivated successfully.
Nov 29 02:15:01 np0005539504 podman[231237]: 2025-11-29 07:15:01.917565225 +0000 UTC m=+0.117784408 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:15:01 np0005539504 podman[231259]: 2025-11-29 07:15:01.991873472 +0000 UTC m=+0.149556419 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:15:02 np0005539504 nova_compute[187152]: 2025-11-29 07:15:02.002 187156 INFO nova.compute.manager [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:15:02 np0005539504 nova_compute[187152]: 2025-11-29 07:15:02.003 187156 DEBUG oslo.service.loopingcall [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:15:02 np0005539504 nova_compute[187152]: 2025-11-29 07:15:02.003 187156 DEBUG nova.compute.manager [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:15:02 np0005539504 nova_compute[187152]: 2025-11-29 07:15:02.003 187156 DEBUG nova.network.neutron [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:15:02 np0005539504 podman[231284]: 2025-11-29 07:15:02.021288331 +0000 UTC m=+0.089614253 container remove bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:15:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:02.031 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d081f2-afac-47f3-99fe-291f1b40cb3e]: (4, ('Sat Nov 29 07:15:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b (bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72)\nbdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72\nSat Nov 29 07:15:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b (bdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72)\nbdf91a179e16f7ac385484da3dbc5c275d2977182625c4bfd9f16dce3ae12a72\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:02.033 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0adc45-2a2f-4d46-b2e2-89acdb2e30a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:02.034 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc188a1f4-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:02 np0005539504 kernel: tapc188a1f4-70: left promiscuous mode
Nov 29 02:15:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:02.042 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bdac552e-f269-4853-9e7b-1786e3302866]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:02 np0005539504 nova_compute[187152]: 2025-11-29 07:15:02.036 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:02 np0005539504 nova_compute[187152]: 2025-11-29 07:15:02.058 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:02.063 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[74db7ef6-2863-48b3-9ba6-3472e831a523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:02.064 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d717d165-b217-4e77-83a1-2663678e9ac3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:02.086 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[98ab1aba-5974-4dc8-9bfe-930df02e13db]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593802, 'reachable_time': 34659, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231314, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:02.089 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:15:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:02.089 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[457b1043-92a9-477e-8685-b77f1dd678ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:02 np0005539504 systemd[1]: run-netns-ovnmeta\x2dc188a1f4\x2d7511\x2d4259\x2d992e\x2dc9127e6a414b.mount: Deactivated successfully.
Nov 29 02:15:02 np0005539504 nova_compute[187152]: 2025-11-29 07:15:02.217 187156 DEBUG nova.compute.manager [None req-275d3cee-b4e1-4367-a83a-86fadf021c3c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Nov 29 02:15:02 np0005539504 nova_compute[187152]: 2025-11-29 07:15:02.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.285 187156 DEBUG nova.compute.manager [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.285 187156 DEBUG oslo_concurrency.lockutils [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.286 187156 DEBUG oslo_concurrency.lockutils [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.286 187156 DEBUG oslo_concurrency.lockutils [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.286 187156 DEBUG nova.compute.manager [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.287 187156 DEBUG nova.compute.manager [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.287 187156 DEBUG nova.compute.manager [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.287 187156 DEBUG oslo_concurrency.lockutils [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.287 187156 DEBUG oslo_concurrency.lockutils [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.288 187156 DEBUG oslo_concurrency.lockutils [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.288 187156 DEBUG nova.compute.manager [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.288 187156 WARNING nova.compute.manager [req-7d45281c-e779-4ea8-884d-2cf964f4688e req-f9a6359e-81bb-4216-b275-cae33971f0ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.625 187156 DEBUG nova.compute.manager [req-2e017bfc-5d54-4292-a4fe-74bca2f539cf req-ca3cd9df-71a2-4158-ac70-3812cf6956a5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-deleted-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.626 187156 INFO nova.compute.manager [req-2e017bfc-5d54-4292-a4fe-74bca2f539cf req-ca3cd9df-71a2-4158-ac70-3812cf6956a5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Neutron deleted interface 07a930ef-a036-4ddf-aa57-c5d56f77847c; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.626 187156 DEBUG nova.network.neutron [req-2e017bfc-5d54-4292-a4fe-74bca2f539cf req-ca3cd9df-71a2-4158-ac70-3812cf6956a5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.629 187156 DEBUG nova.network.neutron [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:03Z|00362|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.725 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.731 187156 INFO nova.compute.manager [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Took 1.73 seconds to deallocate network for instance.#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.735 187156 DEBUG nova.compute.manager [req-2e017bfc-5d54-4292-a4fe-74bca2f539cf req-ca3cd9df-71a2-4158-ac70-3812cf6956a5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Detach interface failed, port_id=07a930ef-a036-4ddf-aa57-c5d56f77847c, reason: Instance 5818027f-a5b1-465a-a6e2-f0c8f0de8154 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.781 187156 DEBUG nova.network.neutron [req-2cd1865b-39d2-4bb7-9a10-3bcd8694829a req-c2b1be6b-32af-476a-b19b-88bc6db5c9b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updated VIF entry in instance network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.782 187156 DEBUG nova.network.neutron [req-2cd1865b-39d2-4bb7-9a10-3bcd8694829a req-c2b1be6b-32af-476a-b19b-88bc6db5c9b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.845 187156 DEBUG oslo_concurrency.lockutils [req-2cd1865b-39d2-4bb7-9a10-3bcd8694829a req-c2b1be6b-32af-476a-b19b-88bc6db5c9b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.890 187156 DEBUG oslo_concurrency.lockutils [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.891 187156 DEBUG oslo_concurrency.lockutils [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.900 187156 DEBUG oslo_concurrency.lockutils [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:03 np0005539504 nova_compute[187152]: 2025-11-29 07:15:03.955 187156 INFO nova.scheduler.client.report [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocations for instance 5818027f-a5b1-465a-a6e2-f0c8f0de8154#033[00m
Nov 29 02:15:04 np0005539504 nova_compute[187152]: 2025-11-29 07:15:04.073 187156 DEBUG oslo_concurrency.lockutils [None req-83824011-b2a9-4149-ac3b-b01435f3e910 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:04.635 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:05 np0005539504 nova_compute[187152]: 2025-11-29 07:15:05.327 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:05 np0005539504 nova_compute[187152]: 2025-11-29 07:15:05.516 187156 DEBUG oslo_concurrency.lockutils [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:05 np0005539504 nova_compute[187152]: 2025-11-29 07:15:05.517 187156 DEBUG oslo_concurrency.lockutils [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:05 np0005539504 nova_compute[187152]: 2025-11-29 07:15:05.517 187156 DEBUG nova.compute.manager [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:05 np0005539504 nova_compute[187152]: 2025-11-29 07:15:05.521 187156 DEBUG nova.compute.manager [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 29 02:15:05 np0005539504 nova_compute[187152]: 2025-11-29 07:15:05.522 187156 DEBUG nova.objects.instance [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'flavor' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:05 np0005539504 nova_compute[187152]: 2025-11-29 07:15:05.550 187156 DEBUG nova.objects.instance [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'info_cache' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:05 np0005539504 nova_compute[187152]: 2025-11-29 07:15:05.634 187156 DEBUG nova.virt.libvirt.driver [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:15:05 np0005539504 nova_compute[187152]: 2025-11-29 07:15:05.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.011 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.012 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.012 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.012 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.259 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.323 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.325 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.380 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.387 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.457 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.458 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.545 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.712 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.713 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5396MB free_disk=73.16316986083984GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.714 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.714 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:06 np0005539504 nova_compute[187152]: 2025-11-29 07:15:06.845 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:07 np0005539504 nova_compute[187152]: 2025-11-29 07:15:07.173 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 084a0f8e-19b7-4b24-a503-c015b26addbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:15:07 np0005539504 nova_compute[187152]: 2025-11-29 07:15:07.173 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 531c3d01-115b-479d-bbdc-11e38bc8b0b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:15:07 np0005539504 nova_compute[187152]: 2025-11-29 07:15:07.173 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:15:07 np0005539504 nova_compute[187152]: 2025-11-29 07:15:07.174 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:15:07 np0005539504 nova_compute[187152]: 2025-11-29 07:15:07.246 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:15:07 np0005539504 nova_compute[187152]: 2025-11-29 07:15:07.318 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:15:07 np0005539504 nova_compute[187152]: 2025-11-29 07:15:07.401 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:15:07 np0005539504 nova_compute[187152]: 2025-11-29 07:15:07.402 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:08 np0005539504 kernel: tap60943dec-d4 (unregistering): left promiscuous mode
Nov 29 02:15:08 np0005539504 NetworkManager[55210]: <info>  [1764400508.0066] device (tap60943dec-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:15:08 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:08Z|00363|binding|INFO|Releasing lport 60943dec-d420-449f-abc3-233df163ebed from this chassis (sb_readonly=0)
Nov 29 02:15:08 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:08Z|00364|binding|INFO|Setting lport 60943dec-d420-449f-abc3-233df163ebed down in Southbound
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.056 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:08 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:08Z|00365|binding|INFO|Removing iface tap60943dec-d4 ovn-installed in OVS
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.078 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.098 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:06:9e 10.100.0.9'], port_security=['fa:16:3e:04:06:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '490b426d-026a-4a21-8c41-f013fe0c1458', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=60943dec-d420-449f-abc3-233df163ebed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.100 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 60943dec-d420-449f-abc3-233df163ebed in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 unbound from our chassis#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.101 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df7cfc35-3f76-45b2-b70c-e4525d38f410#033[00m
Nov 29 02:15:08 np0005539504 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000060.scope: Deactivated successfully.
Nov 29 02:15:08 np0005539504 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000060.scope: Consumed 19.274s CPU time.
Nov 29 02:15:08 np0005539504 systemd-machined[153423]: Machine qemu-47-instance-00000060 terminated.
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.122 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1547ff57-93aa-4b65-a7ad-16766a10a2e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.152 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c7104e18-f5b9-4e13-b0ba-8a7db7657ca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.155 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[3d8d7bae-d860-4273-8367-fcd762584398]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.185 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba23645-03f1-45db-8aaa-a353737f1e2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.210 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ba470f50-d485-4739-879b-6a5aa04e3e0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583826, 'reachable_time': 28400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231361, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.235 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d319d45e-1542-4b9d-ae2c-1866242b1367]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdf7cfc35-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583837, 'tstamp': 583837}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231362, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdf7cfc35-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583840, 'tstamp': 583840}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231362, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.238 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.239 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.244 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.245 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf7cfc35-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.245 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.246 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf7cfc35-30, col_values=(('external_ids', {'iface-id': 'cab31803-36dd-4107-bb9e-3d36862142c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:08.246 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.588 187156 DEBUG nova.compute.manager [req-28972e4c-3163-4fcd-9ebc-e8a9e5e9d9e6 req-3322261a-7cdd-4c06-b158-1b83f686fb4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-unplugged-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.589 187156 DEBUG oslo_concurrency.lockutils [req-28972e4c-3163-4fcd-9ebc-e8a9e5e9d9e6 req-3322261a-7cdd-4c06-b158-1b83f686fb4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.589 187156 DEBUG oslo_concurrency.lockutils [req-28972e4c-3163-4fcd-9ebc-e8a9e5e9d9e6 req-3322261a-7cdd-4c06-b158-1b83f686fb4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.589 187156 DEBUG oslo_concurrency.lockutils [req-28972e4c-3163-4fcd-9ebc-e8a9e5e9d9e6 req-3322261a-7cdd-4c06-b158-1b83f686fb4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.589 187156 DEBUG nova.compute.manager [req-28972e4c-3163-4fcd-9ebc-e8a9e5e9d9e6 req-3322261a-7cdd-4c06-b158-1b83f686fb4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] No waiting events found dispatching network-vif-unplugged-60943dec-d420-449f-abc3-233df163ebed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.589 187156 WARNING nova.compute.manager [req-28972e4c-3163-4fcd-9ebc-e8a9e5e9d9e6 req-3322261a-7cdd-4c06-b158-1b83f686fb4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received unexpected event network-vif-unplugged-60943dec-d420-449f-abc3-233df163ebed for instance with vm_state active and task_state powering-off.#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.652 187156 INFO nova.virt.libvirt.driver [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.659 187156 INFO nova.virt.libvirt.driver [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance destroyed successfully.#033[00m
Nov 29 02:15:08 np0005539504 nova_compute[187152]: 2025-11-29 07:15:08.659 187156 DEBUG nova.objects.instance [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'numa_topology' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:09 np0005539504 nova_compute[187152]: 2025-11-29 07:15:09.122 187156 DEBUG nova.compute.manager [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:09Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:f8:4b 10.100.0.3
Nov 29 02:15:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:09Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:f8:4b 10.100.0.3
Nov 29 02:15:09 np0005539504 nova_compute[187152]: 2025-11-29 07:15:09.401 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:09 np0005539504 nova_compute[187152]: 2025-11-29 07:15:09.401 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:15:09 np0005539504 nova_compute[187152]: 2025-11-29 07:15:09.402 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:15:09 np0005539504 nova_compute[187152]: 2025-11-29 07:15:09.432 187156 DEBUG oslo_concurrency.lockutils [None req-dc7ec573-b986-41f7-a8f3-819868611ca2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:10 np0005539504 nova_compute[187152]: 2025-11-29 07:15:10.288 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:15:10 np0005539504 nova_compute[187152]: 2025-11-29 07:15:10.289 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:15:10 np0005539504 nova_compute[187152]: 2025-11-29 07:15:10.289 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:15:10 np0005539504 nova_compute[187152]: 2025-11-29 07:15:10.289 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:10 np0005539504 nova_compute[187152]: 2025-11-29 07:15:10.329 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:10 np0005539504 podman[231381]: 2025-11-29 07:15:10.721420121 +0000 UTC m=+0.066529461 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:15:11 np0005539504 ovn_controller[95182]: 2025-11-29T07:15:11Z|00366|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:15:11 np0005539504 nova_compute[187152]: 2025-11-29 07:15:11.315 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:11 np0005539504 nova_compute[187152]: 2025-11-29 07:15:11.548 187156 DEBUG nova.compute.manager [req-ccc65ed4-f743-4bc1-8a93-a4074409d591 req-ed7bc678-4929-438a-b7ba-97091fdac243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:11 np0005539504 nova_compute[187152]: 2025-11-29 07:15:11.548 187156 DEBUG oslo_concurrency.lockutils [req-ccc65ed4-f743-4bc1-8a93-a4074409d591 req-ed7bc678-4929-438a-b7ba-97091fdac243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:11 np0005539504 nova_compute[187152]: 2025-11-29 07:15:11.548 187156 DEBUG oslo_concurrency.lockutils [req-ccc65ed4-f743-4bc1-8a93-a4074409d591 req-ed7bc678-4929-438a-b7ba-97091fdac243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:11 np0005539504 nova_compute[187152]: 2025-11-29 07:15:11.549 187156 DEBUG oslo_concurrency.lockutils [req-ccc65ed4-f743-4bc1-8a93-a4074409d591 req-ed7bc678-4929-438a-b7ba-97091fdac243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:11 np0005539504 nova_compute[187152]: 2025-11-29 07:15:11.549 187156 DEBUG nova.compute.manager [req-ccc65ed4-f743-4bc1-8a93-a4074409d591 req-ed7bc678-4929-438a-b7ba-97091fdac243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] No waiting events found dispatching network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:15:11 np0005539504 nova_compute[187152]: 2025-11-29 07:15:11.549 187156 WARNING nova.compute.manager [req-ccc65ed4-f743-4bc1-8a93-a4074409d591 req-ed7bc678-4929-438a-b7ba-97091fdac243 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received unexpected event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed for instance with vm_state stopped and task_state None.#033[00m
Nov 29 02:15:11 np0005539504 nova_compute[187152]: 2025-11-29 07:15:11.848 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:13 np0005539504 nova_compute[187152]: 2025-11-29 07:15:13.481 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:13 np0005539504 nova_compute[187152]: 2025-11-29 07:15:13.509 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:15:13 np0005539504 nova_compute[187152]: 2025-11-29 07:15:13.510 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:15:15 np0005539504 nova_compute[187152]: 2025-11-29 07:15:15.331 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:16 np0005539504 podman[231401]: 2025-11-29 07:15:16.718864722 +0000 UTC m=+0.063017069 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:15:16 np0005539504 nova_compute[187152]: 2025-11-29 07:15:16.761 187156 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:15:16 np0005539504 nova_compute[187152]: 2025-11-29 07:15:16.762 187156 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:15:16 np0005539504 nova_compute[187152]: 2025-11-29 07:15:16.762 187156 DEBUG nova.network.neutron [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:15:16 np0005539504 nova_compute[187152]: 2025-11-29 07:15:16.785 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400501.7841272, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:15:16 np0005539504 nova_compute[187152]: 2025-11-29 07:15:16.786 187156 INFO nova.compute.manager [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:15:16 np0005539504 nova_compute[187152]: 2025-11-29 07:15:16.850 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:16 np0005539504 nova_compute[187152]: 2025-11-29 07:15:16.973 187156 DEBUG nova.compute.manager [None req-ddf555ea-55a6-45ac-b9c0-aee0b2d553f7 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:20 np0005539504 nova_compute[187152]: 2025-11-29 07:15:20.334 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:21 np0005539504 nova_compute[187152]: 2025-11-29 07:15:21.488 187156 DEBUG nova.network.neutron [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:21 np0005539504 nova_compute[187152]: 2025-11-29 07:15:21.769 187156 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:15:21 np0005539504 nova_compute[187152]: 2025-11-29 07:15:21.853 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:22.932 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:22.932 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:22.933 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:23 np0005539504 nova_compute[187152]: 2025-11-29 07:15:23.335 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400508.3332791, 084a0f8e-19b7-4b24-a503-c015b26addbc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:15:23 np0005539504 nova_compute[187152]: 2025-11-29 07:15:23.336 187156 INFO nova.compute.manager [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:15:23 np0005539504 nova_compute[187152]: 2025-11-29 07:15:23.603 187156 DEBUG nova.compute.manager [None req-c627d716-a2a2-4577-8e8a-a8053a40b046 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:23 np0005539504 nova_compute[187152]: 2025-11-29 07:15:23.608 187156 DEBUG nova.compute.manager [None req-c627d716-a2a2-4577-8e8a-a8053a40b046 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: resize_migrating, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:15:23 np0005539504 nova_compute[187152]: 2025-11-29 07:15:23.656 187156 INFO nova.compute.manager [None req-c627d716-a2a2-4577-8e8a-a8053a40b046 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] During sync_power_state the instance has a pending task (resize_migrating). Skip.#033[00m
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.011 187156 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.012 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Creating file /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/704947a76b5a4c3fa23dc95865680b19.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.013 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/704947a76b5a4c3fa23dc95865680b19.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.592 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/704947a76b5a4c3fa23dc95865680b19.tmp" returned: 1 in 0.579s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.593 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/704947a76b5a4c3fa23dc95865680b19.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.594 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Creating directory /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.594 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.842 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc" returned: 0 in 0.247s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.847 187156 INFO nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance already shutdown.
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.854 187156 INFO nova.virt.libvirt.driver [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance destroyed successfully.
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.855 187156 DEBUG nova.virt.libvirt.vif [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:12:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1072835336-network", "vif_mac": "fa:16:3e:04:06:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.856 187156 DEBUG nova.network.os_vif_util [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1072835336-network", "vif_mac": "fa:16:3e:04:06:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.856 187156 DEBUG nova.network.os_vif_util [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.857 187156 DEBUG os_vif [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.859 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.860 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60943dec-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.861 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.865 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.867 187156 INFO os_vif [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4')
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.872 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.945 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:15:24 np0005539504 nova_compute[187152]: 2025-11-29 07:15:24.947 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.003 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.005 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Copying file /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_resize/disk to 192.168.122.100:/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.006 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_resize/disk 192.168.122.100:/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.337 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.606 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "scp -r /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_resize/disk 192.168.122.100:/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk" returned: 0 in 0.600s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.608 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Copying file /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.609 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_resize/disk.config 192.168.122.100:/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.870 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "scp -C -r /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_resize/disk.config 192.168.122.100:/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config" returned: 0 in 0.262s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.872 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Copying file /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 02:15:25 np0005539504 nova_compute[187152]: 2025-11-29 07:15:25.872 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_resize/disk.info 192.168.122.100:/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:15:26 np0005539504 nova_compute[187152]: 2025-11-29 07:15:26.336 187156 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "scp -C -r /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_resize/disk.info 192.168.122.100:/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.info" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:15:26 np0005539504 nova_compute[187152]: 2025-11-29 07:15:26.969 187156 DEBUG neutronclient.v2_0.client [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 60943dec-d420-449f-abc3-233df163ebed for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 29 02:15:28 np0005539504 nova_compute[187152]: 2025-11-29 07:15:28.101 187156 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:15:28 np0005539504 nova_compute[187152]: 2025-11-29 07:15:28.102 187156 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:15:28 np0005539504 nova_compute[187152]: 2025-11-29 07:15:28.103 187156 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:15:28 np0005539504 nova_compute[187152]: 2025-11-29 07:15:28.169 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:29 np0005539504 nova_compute[187152]: 2025-11-29 07:15:29.663 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:29 np0005539504 podman[231437]: 2025-11-29 07:15:29.75255356 +0000 UTC m=+0.067604831 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:15:29 np0005539504 podman[231439]: 2025-11-29 07:15:29.75295657 +0000 UTC m=+0.060909523 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:15:29 np0005539504 podman[231438]: 2025-11-29 07:15:29.760832018 +0000 UTC m=+0.075848628 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Nov 29 02:15:29 np0005539504 nova_compute[187152]: 2025-11-29 07:15:29.913 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:30 np0005539504 nova_compute[187152]: 2025-11-29 07:15:30.339 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:30 np0005539504 nova_compute[187152]: 2025-11-29 07:15:30.418 187156 DEBUG nova.compute.manager [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-changed-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:15:30 np0005539504 nova_compute[187152]: 2025-11-29 07:15:30.419 187156 DEBUG nova.compute.manager [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Refreshing instance network info cache due to event network-changed-60943dec-d420-449f-abc3-233df163ebed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 02:15:30 np0005539504 nova_compute[187152]: 2025-11-29 07:15:30.419 187156 DEBUG oslo_concurrency.lockutils [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:15:30 np0005539504 nova_compute[187152]: 2025-11-29 07:15:30.420 187156 DEBUG oslo_concurrency.lockutils [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:15:30 np0005539504 nova_compute[187152]: 2025-11-29 07:15:30.420 187156 DEBUG nova.network.neutron [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Refreshing network info cache for port 60943dec-d420-449f-abc3-233df163ebed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 02:15:31 np0005539504 nova_compute[187152]: 2025-11-29 07:15:31.108 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:32 np0005539504 nova_compute[187152]: 2025-11-29 07:15:32.112 187156 DEBUG nova.network.neutron [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updated VIF entry in instance network info cache for port 60943dec-d420-449f-abc3-233df163ebed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:15:32 np0005539504 nova_compute[187152]: 2025-11-29 07:15:32.113 187156 DEBUG nova.network.neutron [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:15:32 np0005539504 nova_compute[187152]: 2025-11-29 07:15:32.136 187156 DEBUG oslo_concurrency.lockutils [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:15:32 np0005539504 podman[231498]: 2025-11-29 07:15:32.724687791 +0000 UTC m=+0.063765659 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:15:32 np0005539504 podman[231499]: 2025-11-29 07:15:32.762986935 +0000 UTC m=+0.099832224 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:15:34 np0005539504 nova_compute[187152]: 2025-11-29 07:15:34.710 187156 DEBUG oslo_concurrency.lockutils [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:34 np0005539504 nova_compute[187152]: 2025-11-29 07:15:34.711 187156 DEBUG oslo_concurrency.lockutils [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:34 np0005539504 nova_compute[187152]: 2025-11-29 07:15:34.711 187156 DEBUG nova.compute.manager [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Going to confirm migration 16 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 29 02:15:34 np0005539504 nova_compute[187152]: 2025-11-29 07:15:34.758 187156 DEBUG nova.objects.instance [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'info_cache' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:34 np0005539504 nova_compute[187152]: 2025-11-29 07:15:34.916 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:35 np0005539504 nova_compute[187152]: 2025-11-29 07:15:35.342 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:35 np0005539504 nova_compute[187152]: 2025-11-29 07:15:35.512 187156 DEBUG neutronclient.v2_0.client [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 60943dec-d420-449f-abc3-233df163ebed for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:15:35 np0005539504 nova_compute[187152]: 2025-11-29 07:15:35.513 187156 DEBUG oslo_concurrency.lockutils [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:15:35 np0005539504 nova_compute[187152]: 2025-11-29 07:15:35.513 187156 DEBUG oslo_concurrency.lockutils [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:15:35 np0005539504 nova_compute[187152]: 2025-11-29 07:15:35.514 187156 DEBUG nova.network.neutron [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:15:36 np0005539504 nova_compute[187152]: 2025-11-29 07:15:36.143 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.504 187156 DEBUG nova.network.neutron [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.532 187156 DEBUG oslo_concurrency.lockutils [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.533 187156 DEBUG nova.objects.instance [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'migration_context' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.574 187156 DEBUG nova.virt.libvirt.vif [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.576 187156 DEBUG nova.network.os_vif_util [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.577 187156 DEBUG nova.network.os_vif_util [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.577 187156 DEBUG os_vif [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.580 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.580 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60943dec-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.581 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.584 187156 INFO os_vif [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4')#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.585 187156 DEBUG oslo_concurrency.lockutils [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.585 187156 DEBUG oslo_concurrency.lockutils [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:37 np0005539504 nova_compute[187152]: 2025-11-29 07:15:37.685 187156 DEBUG nova.compute.provider_tree [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:15:38 np0005539504 nova_compute[187152]: 2025-11-29 07:15:38.024 187156 DEBUG nova.scheduler.client.report [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:15:39 np0005539504 nova_compute[187152]: 2025-11-29 07:15:39.150 187156 DEBUG oslo_concurrency.lockutils [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 1.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:39 np0005539504 nova_compute[187152]: 2025-11-29 07:15:39.151 187156 DEBUG nova.compute.manager [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Resized/migrated instance is powered off. Setting vm_state to 'stopped'. _confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4805#033[00m
Nov 29 02:15:39 np0005539504 nova_compute[187152]: 2025-11-29 07:15:39.601 187156 INFO nova.scheduler.client.report [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Deleted allocation for migration 851ef7a8-9a43-4d1f-809d-562a326079bb#033[00m
Nov 29 02:15:39 np0005539504 nova_compute[187152]: 2025-11-29 07:15:39.920 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:40 np0005539504 nova_compute[187152]: 2025-11-29 07:15:40.344 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:41 np0005539504 podman[231547]: 2025-11-29 07:15:41.720985311 +0000 UTC m=+0.066306266 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 29 02:15:44 np0005539504 nova_compute[187152]: 2025-11-29 07:15:44.923 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:45 np0005539504 nova_compute[187152]: 2025-11-29 07:15:45.346 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:46 np0005539504 nova_compute[187152]: 2025-11-29 07:15:46.532 187156 DEBUG oslo_concurrency.lockutils [None req-cb801371-2312-490f-90bb-6a890f76a859 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 11.822s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:47 np0005539504 podman[231567]: 2025-11-29 07:15:47.714906058 +0000 UTC m=+0.064666142 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:15:49 np0005539504 nova_compute[187152]: 2025-11-29 07:15:49.926 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:50 np0005539504 nova_compute[187152]: 2025-11-29 07:15:50.350 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:50 np0005539504 nova_compute[187152]: 2025-11-29 07:15:50.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:54 np0005539504 nova_compute[187152]: 2025-11-29 07:15:54.928 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:55 np0005539504 nova_compute[187152]: 2025-11-29 07:15:55.370 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:55.935 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:15:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:55.937 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:15:55 np0005539504 nova_compute[187152]: 2025-11-29 07:15:55.937 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:58 np0005539504 nova_compute[187152]: 2025-11-29 07:15:58.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:15:58.939 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:59 np0005539504 nova_compute[187152]: 2025-11-29 07:15:59.930 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:59 np0005539504 nova_compute[187152]: 2025-11-29 07:15:59.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:00 np0005539504 nova_compute[187152]: 2025-11-29 07:16:00.372 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:00 np0005539504 podman[231592]: 2025-11-29 07:16:00.732925321 +0000 UTC m=+0.062984118 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 29 02:16:00 np0005539504 podman[231590]: 2025-11-29 07:16:00.736589229 +0000 UTC m=+0.075318825 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:16:00 np0005539504 podman[231591]: 2025-11-29 07:16:00.742064494 +0000 UTC m=+0.072055569 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public)
Nov 29 02:16:00 np0005539504 nova_compute[187152]: 2025-11-29 07:16:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:00 np0005539504 nova_compute[187152]: 2025-11-29 07:16:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:01 np0005539504 nova_compute[187152]: 2025-11-29 07:16:01.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:01 np0005539504 nova_compute[187152]: 2025-11-29 07:16:01.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:16:03 np0005539504 podman[231647]: 2025-11-29 07:16:03.710921418 +0000 UTC m=+0.052248834 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:16:03 np0005539504 podman[231648]: 2025-11-29 07:16:03.799340418 +0000 UTC m=+0.130534846 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:16:04 np0005539504 nova_compute[187152]: 2025-11-29 07:16:04.934 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539504 nova_compute[187152]: 2025-11-29 07:16:04.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:05 np0005539504 nova_compute[187152]: 2025-11-29 07:16:05.374 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:06 np0005539504 nova_compute[187152]: 2025-11-29 07:16:06.633 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:06 np0005539504 nova_compute[187152]: 2025-11-29 07:16:06.636 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:06 np0005539504 nova_compute[187152]: 2025-11-29 07:16:06.684 187156 DEBUG nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:16:06 np0005539504 nova_compute[187152]: 2025-11-29 07:16:06.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.039 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.040 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.040 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.040 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.184 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.253 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.254 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.316 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.317 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.323 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.324 187156 INFO nova.compute.claims [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.328 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.546 187156 DEBUG nova.compute.provider_tree [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.565 187156 DEBUG nova.scheduler.client.report [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.617 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.618 187156 DEBUG nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.628 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.631 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5551MB free_disk=73.16368865966797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.631 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.632 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.711 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 531c3d01-115b-479d-bbdc-11e38bc8b0b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.712 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 99126e58-be6b-4a8d-bd7e-82d08cc3b61b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.712 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.712 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.720 187156 DEBUG nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.720 187156 DEBUG nova.network.neutron [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.740 187156 INFO nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.760 187156 DEBUG nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.802 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:16:07 np0005539504 nova_compute[187152]: 2025-11-29 07:16:07.820 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:16:08 np0005539504 nova_compute[187152]: 2025-11-29 07:16:08.011 187156 DEBUG nova.policy [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:16:09 np0005539504 nova_compute[187152]: 2025-11-29 07:16:09.936 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.207 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.208 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.388 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.517 187156 DEBUG nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.520 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.520 187156 INFO nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Creating image(s)#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.521 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.521 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.522 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.537 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.594 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.595 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.596 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.608 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.679 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.680 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.723 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.724 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.724 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.802 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.803 187156 DEBUG nova.virt.disk.api [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Checking if we can resize image /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.804 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.870 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.871 187156 DEBUG nova.virt.disk.api [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Cannot resize image /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.872 187156 DEBUG nova.objects.instance [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'migration_context' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.884 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.885 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Ensure instance console log exists: /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.885 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.885 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:10 np0005539504 nova_compute[187152]: 2025-11-29 07:16:10.886 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:11 np0005539504 nova_compute[187152]: 2025-11-29 07:16:11.027 187156 DEBUG nova.network.neutron [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Successfully created port: 160dd2b8-54e7-490c-8d0e-b15f57edcc04 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:16:12 np0005539504 nova_compute[187152]: 2025-11-29 07:16:12.209 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:12 np0005539504 nova_compute[187152]: 2025-11-29 07:16:12.245 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:12 np0005539504 nova_compute[187152]: 2025-11-29 07:16:12.245 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:16:12 np0005539504 podman[231720]: 2025-11-29 07:16:12.731679095 +0000 UTC m=+0.068489724 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:16:12 np0005539504 nova_compute[187152]: 2025-11-29 07:16:12.864 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:12 np0005539504 nova_compute[187152]: 2025-11-29 07:16:12.864 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:12 np0005539504 nova_compute[187152]: 2025-11-29 07:16:12.865 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:16:14 np0005539504 nova_compute[187152]: 2025-11-29 07:16:14.939 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:15 np0005539504 nova_compute[187152]: 2025-11-29 07:16:15.415 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:15 np0005539504 nova_compute[187152]: 2025-11-29 07:16:15.672 187156 DEBUG nova.network.neutron [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Successfully updated port: 160dd2b8-54e7-490c-8d0e-b15f57edcc04 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:16:15 np0005539504 nova_compute[187152]: 2025-11-29 07:16:15.690 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:15 np0005539504 nova_compute[187152]: 2025-11-29 07:16:15.691 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:15 np0005539504 nova_compute[187152]: 2025-11-29 07:16:15.691 187156 DEBUG nova.network.neutron [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:16:15 np0005539504 nova_compute[187152]: 2025-11-29 07:16:15.766 187156 DEBUG nova.compute.manager [req-cf52c970-51ff-42c7-9607-2a27596d1199 req-12eb9c02-8d07-4e20-a084-6bd690aec454 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-changed-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:15 np0005539504 nova_compute[187152]: 2025-11-29 07:16:15.767 187156 DEBUG nova.compute.manager [req-cf52c970-51ff-42c7-9607-2a27596d1199 req-12eb9c02-8d07-4e20-a084-6bd690aec454 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Refreshing instance network info cache due to event network-changed-160dd2b8-54e7-490c-8d0e-b15f57edcc04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:16:15 np0005539504 nova_compute[187152]: 2025-11-29 07:16:15.767 187156 DEBUG oslo_concurrency.lockutils [req-cf52c970-51ff-42c7-9607-2a27596d1199 req-12eb9c02-8d07-4e20-a084-6bd690aec454 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:16 np0005539504 nova_compute[187152]: 2025-11-29 07:16:16.910 187156 DEBUG nova.network.neutron [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:16:17 np0005539504 nova_compute[187152]: 2025-11-29 07:16:17.325 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updating instance_info_cache with network_info: [{"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:17 np0005539504 nova_compute[187152]: 2025-11-29 07:16:17.349 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:17 np0005539504 nova_compute[187152]: 2025-11-29 07:16:17.350 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:16:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:16:17Z|00367|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:16:17 np0005539504 nova_compute[187152]: 2025-11-29 07:16:17.985 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:18 np0005539504 ovn_controller[95182]: 2025-11-29T07:16:18Z|00368|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.128 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.282 187156 DEBUG nova.network.neutron [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.302 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.302 187156 DEBUG nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance network_info: |[{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.303 187156 DEBUG oslo_concurrency.lockutils [req-cf52c970-51ff-42c7-9607-2a27596d1199 req-12eb9c02-8d07-4e20-a084-6bd690aec454 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.303 187156 DEBUG nova.network.neutron [req-cf52c970-51ff-42c7-9607-2a27596d1199 req-12eb9c02-8d07-4e20-a084-6bd690aec454 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Refreshing network info cache for port 160dd2b8-54e7-490c-8d0e-b15f57edcc04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.305 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Start _get_guest_xml network_info=[{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.310 187156 WARNING nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.316 187156 DEBUG nova.virt.libvirt.host [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.316 187156 DEBUG nova.virt.libvirt.host [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.324 187156 DEBUG nova.virt.libvirt.host [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.325 187156 DEBUG nova.virt.libvirt.host [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.327 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.327 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.328 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.328 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.328 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.329 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.329 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.329 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.329 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.330 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.330 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.330 187156 DEBUG nova.virt.hardware [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.335 187156 DEBUG nova.virt.libvirt.vif [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1305320144',display_name='tempest-ServerActionsTestOtherB-server-1305320144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1305320144',id=103,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-18bazq8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:16:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=99126e58-be6b-4a8d-bd7e-82d08cc3b61b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.335 187156 DEBUG nova.network.os_vif_util [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.336 187156 DEBUG nova.network.os_vif_util [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.337 187156 DEBUG nova.objects.instance [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'pci_devices' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.352 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <uuid>99126e58-be6b-4a8d-bd7e-82d08cc3b61b</uuid>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <name>instance-00000067</name>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerActionsTestOtherB-server-1305320144</nova:name>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:16:18</nova:creationTime>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:        <nova:user uuid="ee2d4931cb504b13b92a2f52c95c05ce">tempest-ServerActionsTestOtherB-1538648925-project-member</nova:user>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:        <nova:project uuid="32e51e3a9a8f4a1ca6e022735ebf5f7b">tempest-ServerActionsTestOtherB-1538648925</nova:project>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:        <nova:port uuid="160dd2b8-54e7-490c-8d0e-b15f57edcc04">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <entry name="serial">99126e58-be6b-4a8d-bd7e-82d08cc3b61b</entry>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <entry name="uuid">99126e58-be6b-4a8d-bd7e-82d08cc3b61b</entry>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.config"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:55:12:21"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <target dev="tap160dd2b8-54"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/console.log" append="off"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:16:18 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:16:18 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:16:18 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:16:18 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.353 187156 DEBUG nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Preparing to wait for external event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.353 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.354 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.354 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.355 187156 DEBUG nova.virt.libvirt.vif [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1305320144',display_name='tempest-ServerActionsTestOtherB-server-1305320144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1305320144',id=103,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-18bazq8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:16:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=99126e58-be6b-4a8d-bd7e-82d08cc3b61b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.355 187156 DEBUG nova.network.os_vif_util [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.357 187156 DEBUG nova.network.os_vif_util [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.357 187156 DEBUG os_vif [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.358 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.358 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.359 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.362 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.362 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap160dd2b8-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.363 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap160dd2b8-54, col_values=(('external_ids', {'iface-id': '160dd2b8-54e7-490c-8d0e-b15f57edcc04', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:12:21', 'vm-uuid': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.364 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:18 np0005539504 NetworkManager[55210]: <info>  [1764400578.3662] manager: (tap160dd2b8-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.368 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.371 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.372 187156 INFO os_vif [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54')#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.420 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.420 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.420 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No VIF found with MAC fa:16:3e:55:12:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:16:18 np0005539504 nova_compute[187152]: 2025-11-29 07:16:18.421 187156 INFO nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Using config drive#033[00m
Nov 29 02:16:18 np0005539504 podman[231742]: 2025-11-29 07:16:18.721034721 +0000 UTC m=+0.061469277 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:16:19 np0005539504 nova_compute[187152]: 2025-11-29 07:16:19.833 187156 INFO nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Creating config drive at /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.config#033[00m
Nov 29 02:16:19 np0005539504 nova_compute[187152]: 2025-11-29 07:16:19.839 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp_qdc790 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:19 np0005539504 nova_compute[187152]: 2025-11-29 07:16:19.965 187156 DEBUG oslo_concurrency.processutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp_qdc790" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:20 np0005539504 kernel: tap160dd2b8-54: entered promiscuous mode
Nov 29 02:16:20 np0005539504 NetworkManager[55210]: <info>  [1764400580.0322] manager: (tap160dd2b8-54): new Tun device (/org/freedesktop/NetworkManager/Devices/173)
Nov 29 02:16:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:16:20Z|00369|binding|INFO|Claiming lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 for this chassis.
Nov 29 02:16:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:16:20Z|00370|binding|INFO|160dd2b8-54e7-490c-8d0e-b15f57edcc04: Claiming fa:16:3e:55:12:21 10.100.0.12
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.032 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:16:20Z|00371|binding|INFO|Setting lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 ovn-installed in OVS
Nov 29 02:16:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:16:20Z|00372|binding|INFO|Setting lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 up in Southbound
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.048 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.049 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:12:21 10.100.0.12'], port_security=['fa:16:3e:55:12:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '490b426d-026a-4a21-8c41-f013fe0c1458', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=160dd2b8-54e7-490c-8d0e-b15f57edcc04) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.050 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.072 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 160dd2b8-54e7-490c-8d0e-b15f57edcc04 in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 bound to our chassis#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.074 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df7cfc35-3f76-45b2-b70c-e4525d38f410#033[00m
Nov 29 02:16:20 np0005539504 systemd-udevd[231778]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.095 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b89eb7dd-cd38-44fd-97aa-837b3fdfa75c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:20 np0005539504 NetworkManager[55210]: <info>  [1764400580.1161] device (tap160dd2b8-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:16:20 np0005539504 NetworkManager[55210]: <info>  [1764400580.1169] device (tap160dd2b8-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.123 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.133 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea4e9e2-198d-4a05-b2b6-78eb92afb4ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.136 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ed414944-c99c-447c-a6b4-edff045c0e7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:20 np0005539504 systemd-machined[153423]: New machine qemu-51-instance-00000067.
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.156 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e163c6fd-2e52-406b-baf8-fb9cb86ec1a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:20 np0005539504 systemd[1]: Started Virtual Machine qemu-51-instance-00000067.
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.173 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a72a3b43-7365-4a6a-9a04-5cf25a35b815]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583826, 'reachable_time': 28400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231788, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.191 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eb425046-522a-4da2-9fe7-ea379b1c18a4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdf7cfc35-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583837, 'tstamp': 583837}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231789, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdf7cfc35-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583840, 'tstamp': 583840}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231789, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.193 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.194 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.195 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.196 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf7cfc35-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.196 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.196 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf7cfc35-30, col_values=(('external_ids', {'iface-id': 'cab31803-36dd-4107-bb9e-3d36862142c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:20.196 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.421 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.427 187156 DEBUG nova.compute.manager [req-21b04b5f-8b76-4222-8911-40bb42ac82d5 req-d4160e6a-31b5-4338-bda7-c6ad8ff2b89b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.427 187156 DEBUG oslo_concurrency.lockutils [req-21b04b5f-8b76-4222-8911-40bb42ac82d5 req-d4160e6a-31b5-4338-bda7-c6ad8ff2b89b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.427 187156 DEBUG oslo_concurrency.lockutils [req-21b04b5f-8b76-4222-8911-40bb42ac82d5 req-d4160e6a-31b5-4338-bda7-c6ad8ff2b89b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.428 187156 DEBUG oslo_concurrency.lockutils [req-21b04b5f-8b76-4222-8911-40bb42ac82d5 req-d4160e6a-31b5-4338-bda7-c6ad8ff2b89b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.428 187156 DEBUG nova.compute.manager [req-21b04b5f-8b76-4222-8911-40bb42ac82d5 req-d4160e6a-31b5-4338-bda7-c6ad8ff2b89b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Processing event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.521 187156 DEBUG nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.522 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400580.5213704, 99126e58-be6b-4a8d-bd7e-82d08cc3b61b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.523 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] VM Started (Lifecycle Event)#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.527 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.533 187156 INFO nova.virt.libvirt.driver [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance spawned successfully.#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.534 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.543 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.548 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.575 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.576 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.577 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.577 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.577 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.578 187156 DEBUG nova.virt.libvirt.driver [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.582 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.583 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400580.5215144, 99126e58-be6b-4a8d-bd7e-82d08cc3b61b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.583 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.620 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.624 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400580.5251586, 99126e58-be6b-4a8d-bd7e-82d08cc3b61b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.625 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.654 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.659 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.687 187156 INFO nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Took 10.17 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.687 187156 DEBUG nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.702 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.811 187156 INFO nova.compute.manager [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Took 13.64 seconds to build instance.#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.830 187156 DEBUG oslo_concurrency.lockutils [None req-b042048b-d251-4e91-bd94-46c5e7f245a8 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.853 187156 DEBUG nova.network.neutron [req-cf52c970-51ff-42c7-9607-2a27596d1199 req-12eb9c02-8d07-4e20-a084-6bd690aec454 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updated VIF entry in instance network info cache for port 160dd2b8-54e7-490c-8d0e-b15f57edcc04. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.854 187156 DEBUG nova.network.neutron [req-cf52c970-51ff-42c7-9607-2a27596d1199 req-12eb9c02-8d07-4e20-a084-6bd690aec454 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:20 np0005539504 nova_compute[187152]: 2025-11-29 07:16:20.871 187156 DEBUG oslo_concurrency.lockutils [req-cf52c970-51ff-42c7-9607-2a27596d1199 req-12eb9c02-8d07-4e20-a084-6bd690aec454 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:22 np0005539504 nova_compute[187152]: 2025-11-29 07:16:22.804 187156 DEBUG nova.compute.manager [req-5b8762ce-2423-4632-a85a-c7bfce415014 req-1382874e-c953-41d3-81de-54d07b5eeb2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:22 np0005539504 nova_compute[187152]: 2025-11-29 07:16:22.804 187156 DEBUG oslo_concurrency.lockutils [req-5b8762ce-2423-4632-a85a-c7bfce415014 req-1382874e-c953-41d3-81de-54d07b5eeb2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:22 np0005539504 nova_compute[187152]: 2025-11-29 07:16:22.805 187156 DEBUG oslo_concurrency.lockutils [req-5b8762ce-2423-4632-a85a-c7bfce415014 req-1382874e-c953-41d3-81de-54d07b5eeb2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:22 np0005539504 nova_compute[187152]: 2025-11-29 07:16:22.805 187156 DEBUG oslo_concurrency.lockutils [req-5b8762ce-2423-4632-a85a-c7bfce415014 req-1382874e-c953-41d3-81de-54d07b5eeb2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:22 np0005539504 nova_compute[187152]: 2025-11-29 07:16:22.806 187156 DEBUG nova.compute.manager [req-5b8762ce-2423-4632-a85a-c7bfce415014 req-1382874e-c953-41d3-81de-54d07b5eeb2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] No waiting events found dispatching network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:16:22 np0005539504 nova_compute[187152]: 2025-11-29 07:16:22.806 187156 WARNING nova.compute.manager [req-5b8762ce-2423-4632-a85a-c7bfce415014 req-1382874e-c953-41d3-81de-54d07b5eeb2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received unexpected event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:16:22 np0005539504 nova_compute[187152]: 2025-11-29 07:16:22.931 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:22 np0005539504 NetworkManager[55210]: <info>  [1764400582.9327] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Nov 29 02:16:22 np0005539504 NetworkManager[55210]: <info>  [1764400582.9345] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Nov 29 02:16:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:22.950 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:22.951 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:22.952 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:23 np0005539504 nova_compute[187152]: 2025-11-29 07:16:23.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:23 np0005539504 ovn_controller[95182]: 2025-11-29T07:16:23Z|00373|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:16:23 np0005539504 nova_compute[187152]: 2025-11-29 07:16:23.099 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:23 np0005539504 nova_compute[187152]: 2025-11-29 07:16:23.365 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:23 np0005539504 nova_compute[187152]: 2025-11-29 07:16:23.419 187156 DEBUG nova.compute.manager [req-64e80104-26db-45c4-9a13-d53e76c4fa39 req-56c69592-8362-4806-90fa-22c0c6656afb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-changed-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:23 np0005539504 nova_compute[187152]: 2025-11-29 07:16:23.420 187156 DEBUG nova.compute.manager [req-64e80104-26db-45c4-9a13-d53e76c4fa39 req-56c69592-8362-4806-90fa-22c0c6656afb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Refreshing instance network info cache due to event network-changed-160dd2b8-54e7-490c-8d0e-b15f57edcc04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:16:23 np0005539504 nova_compute[187152]: 2025-11-29 07:16:23.420 187156 DEBUG oslo_concurrency.lockutils [req-64e80104-26db-45c4-9a13-d53e76c4fa39 req-56c69592-8362-4806-90fa-22c0c6656afb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:23 np0005539504 nova_compute[187152]: 2025-11-29 07:16:23.420 187156 DEBUG oslo_concurrency.lockutils [req-64e80104-26db-45c4-9a13-d53e76c4fa39 req-56c69592-8362-4806-90fa-22c0c6656afb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:23 np0005539504 nova_compute[187152]: 2025-11-29 07:16:23.421 187156 DEBUG nova.network.neutron [req-64e80104-26db-45c4-9a13-d53e76c4fa39 req-56c69592-8362-4806-90fa-22c0c6656afb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Refreshing network info cache for port 160dd2b8-54e7-490c-8d0e-b15f57edcc04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:16:25 np0005539504 nova_compute[187152]: 2025-11-29 07:16:25.419 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:28 np0005539504 nova_compute[187152]: 2025-11-29 07:16:28.062 187156 DEBUG nova.network.neutron [req-64e80104-26db-45c4-9a13-d53e76c4fa39 req-56c69592-8362-4806-90fa-22c0c6656afb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updated VIF entry in instance network info cache for port 160dd2b8-54e7-490c-8d0e-b15f57edcc04. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:16:28 np0005539504 nova_compute[187152]: 2025-11-29 07:16:28.063 187156 DEBUG nova.network.neutron [req-64e80104-26db-45c4-9a13-d53e76c4fa39 req-56c69592-8362-4806-90fa-22c0c6656afb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:28 np0005539504 nova_compute[187152]: 2025-11-29 07:16:28.086 187156 DEBUG oslo_concurrency.lockutils [req-64e80104-26db-45c4-9a13-d53e76c4fa39 req-56c69592-8362-4806-90fa-22c0c6656afb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:28 np0005539504 nova_compute[187152]: 2025-11-29 07:16:28.368 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:30 np0005539504 nova_compute[187152]: 2025-11-29 07:16:30.421 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:31 np0005539504 podman[231806]: 2025-11-29 07:16:31.725448724 +0000 UTC m=+0.068728180 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:16:31 np0005539504 podman[231808]: 2025-11-29 07:16:31.754049002 +0000 UTC m=+0.088089553 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:16:31 np0005539504 podman[231807]: 2025-11-29 07:16:31.761094188 +0000 UTC m=+0.097865141 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public)
Nov 29 02:16:33 np0005539504 nova_compute[187152]: 2025-11-29 07:16:33.372 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:34 np0005539504 nova_compute[187152]: 2025-11-29 07:16:34.333 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:34 np0005539504 podman[231874]: 2025-11-29 07:16:34.716950418 +0000 UTC m=+0.057990526 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:16:34 np0005539504 podman[231875]: 2025-11-29 07:16:34.75594013 +0000 UTC m=+0.093986518 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:16:35 np0005539504 nova_compute[187152]: 2025-11-29 07:16:35.438 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:16:36Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:55:12:21 10.100.0.12
Nov 29 02:16:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:16:36Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:55:12:21 10.100.0.12
Nov 29 02:16:38 np0005539504 nova_compute[187152]: 2025-11-29 07:16:38.302 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:38.302 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:16:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:38.304 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:16:38 np0005539504 nova_compute[187152]: 2025-11-29 07:16:38.373 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:40 np0005539504 nova_compute[187152]: 2025-11-29 07:16:40.441 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:16:43.306 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:43 np0005539504 nova_compute[187152]: 2025-11-29 07:16:43.380 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:43 np0005539504 podman[231927]: 2025-11-29 07:16:43.732844707 +0000 UTC m=+0.076046764 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:16:44 np0005539504 nova_compute[187152]: 2025-11-29 07:16:44.759 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:45 np0005539504 nova_compute[187152]: 2025-11-29 07:16:45.443 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.971 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000067', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'hostId': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.974 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000063', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'hostId': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.975 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.975 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.975 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1305320144>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1942369880>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1305320144>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1942369880>]
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.981 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 99126e58-be6b-4a8d-bd7e-82d08cc3b61b / tap160dd2b8-54 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.982 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.986 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 531c3d01-115b-479d-bbdc-11e38bc8b0b1 / tapceb086b6-5a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.986 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb3bd1a5-aabc-4ebb-aafa-c59ded630777', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:47.975985', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e4ed6e2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': 'e172969cf8032567d4adec9478efebe055d23139ae223f0ef5af5275ad0f56f2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:47.975985', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e4f69f4-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': '8fe21f81ef1c63ed8fabf8be8d80ce84edcf7ebfa552a79626022a9b41669503'}]}, 'timestamp': '2025-11-29 07:16:47.986938', '_unique_id': '941e4b241b914c54a5e1f94e82a212b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.990 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:47.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.003 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.004 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.026 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.028 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df31406c-1d6e-46b4-817e-d3746ef30a6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-vda', 'timestamp': '2025-11-29T07:16:47.993281', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e521cee-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.927983872, 'message_signature': '72c4ffb39247692e1368dc3867b538fc3f9e57071716aa92d805a280f0122b52'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 
'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-sda', 'timestamp': '2025-11-29T07:16:47.993281', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e522f18-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.927983872, 'message_signature': 'f029a57f56814e9435c1642a16979dcdd037c156492f709d74807c4e6820bec3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:16:47.993281', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e55a396-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.939744372, 'message_signature': '037f78d2ba2638dc4414e5a4de4d4ed61b5f93b505a5144f8219387a917e5772'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:16:47.993281', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e55c27c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.939744372, 'message_signature': 'a5080043939d47b4c84db7926f14677104f380e3b165b5e37a87ce26fef68032'}]}, 'timestamp': '2025-11-29 07:16:48.028558', '_unique_id': 'edb3c7c69cfc47d29660131a94951b71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.062 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.063 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.090 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.requests volume: 331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.091 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c701bc42-9875-4ac8-a0c7-62cf509b72b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-vda', 'timestamp': '2025-11-29T07:16:48.032106', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e5b1448-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': '6912889a9a7c989e4a81152a6be8552bdd2dfa9c5d7af658ad299104f9d9d316'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 
'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-sda', 'timestamp': '2025-11-29T07:16:48.032106', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e5b2190-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': '2e6d13bc7c8e480d12b3795a9daefa06559c83ad102d35303168d30a10daf880'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 331, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:16:48.032106', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e5f4a36-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': '4b5819b80a66b17f91aec68bf81aaa2935e9d06101aedd11f28b66c8509539b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:16:48.032106', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e5f5bfc-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': '2ed7256278740ea922d12d62497309102a929b477305958e369b5568c8eb1e23'}]}, 'timestamp': '2025-11-29 07:16:48.091422', '_unique_id': '574a63f1d6d74c6ba715367468c9e7b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.093 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '541ece68-8bcd-4777-9680-ee10f57fd626', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:48.093804', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e5fc754-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': '3463c9c41e1711029236410bd5b1f7ba5d24b8218648379c83a5a42d82521fc6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:48.093804', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e5fd28a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': '116845531d6335d482b92e570f790e6eca5303a9ae3e7c5db1bc87ee4837c242'}]}, 'timestamp': '2025-11-29 07:16:48.094401', '_unique_id': '1520e5a9f2a9405a9fd9043489e08b8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.096 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.incoming.bytes volume: 1642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.096 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.bytes volume: 1598 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d1b0bb4-c2a1-40e6-a31a-3cf5ef640a25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1642, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:48.096002', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e601dda-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': 'e6d7d53551386f82b398570ffc91d2e24d16b4c0db723222f5da1961589d11ea'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1598, 'user_id': 
'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:48.096002', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e6028fc-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': 'ca9c0a2aa78e52cbeeb389bc4325b4f951f1a44a8ff1f3d491446f9bd8af09d9'}]}, 'timestamp': '2025-11-29 07:16:48.096593', '_unique_id': 'd38f6ca4b12f420cbb2133968b3e82bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.098 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.098 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.098 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.bytes volume: 30517760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.098 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6efe5d0-3e15-4c73-9f97-7e9b6394b1d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-vda', 'timestamp': '2025-11-29T07:16:48.098116', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e606f56-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': '7686b04b513fe5c1df87a3a169d4a3dd02682123c19e2fbf6cd60f4e94c97f03'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 
'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-sda', 'timestamp': '2025-11-29T07:16:48.098116', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e607ac8-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': '54091d678ad1fa6c7d1037c139314d03954c7586bff95448077b857ec5f0f886'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30517760, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:16:48.098116', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e608478-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': 'db29be415ff8081336d5174f2d84aa9aedd9b03e9254f5753d3299b42a8bd5c9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:16:48.098116', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e608e28-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': 'c1b8158ee062dab05f1e213b5aad53aa8879365d2417dd9d326e6d301a3c55ca'}]}, 'timestamp': '2025-11-29 07:16:48.099177', '_unique_id': '4b596cfce1084f0ba7a0bd942987b2a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.100 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9825314c-0375-4b0f-9d7e-0be2840f59a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:48.100773', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e60d6f8-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': 'c956708deea4b04d988a61f9f4a2886d779156e6f56882e21cf362961479fa32'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:48.100773', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e60e1e8-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': '55c758a80579027df136944b59a4eb690ede3bb8e6c7ed998c35e1d08f9138b3'}]}, 'timestamp': '2025-11-29 07:16:48.101328', '_unique_id': 'ad6c97e237da4eec8a4f4d0c0c1e6c8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.102 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.write.bytes volume: 72916992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.103 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.103 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.103 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee4d7151-0e1c-477d-9e85-6077f970edfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72916992, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-vda', 'timestamp': '2025-11-29T07:16:48.102858', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e6128a6-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': 'cf94559f86f143798f1048d2918818bdba9315a81f52fead2ca4e5ae6ff77124'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 
'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-sda', 'timestamp': '2025-11-29T07:16:48.102858', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6132a6-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': 'd1f5ee5f83d947e383171f9ae730118c07dc02b5aaf93947f097da2a66cdf158'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:16:48.102858', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e613da0-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': '93d596c7d0dffbce260aa8c8168f4aab77e4364753c4e73c0a3bda1d5d8911fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:16:48.102858', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6146f6-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': '8543c63946c13562359afca85a8d71a2232a34877f714bb8e2b8ae4621f3322d'}]}, 'timestamp': '2025-11-29 07:16:48.103905', '_unique_id': '3426ef2e6ba046f292ec191c65fb8431'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.105 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.105 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ce64ef4-72d5-4009-8028-f828568124e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:48.105466', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e618e40-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': '83aeda0e25b0a961c02ac256f564e7bfae4083a999d99f0420781a74e2a8bd78'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:48.105466', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e619854-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': '10ff6003ab51fababcc7bb6df6f2d6921fb6ba43c160496c2230f2e88c83411a'}]}, 'timestamp': '2025-11-29 07:16:48.105996', '_unique_id': 'c988018554a7471a8bdd832ea9c1ae4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.107 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.107 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.108 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.108 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3131cc33-e4cc-44a1-881f-a5535c935506', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-vda', 'timestamp': '2025-11-29T07:16:48.107539', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e61df3a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.927983872, 'message_signature': 'f11a1188643058952d2a9a228f17a1c3a02d7db19eedfaa8664e18d3fcec39e2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 
'99126e58-be6b-4a8d-bd7e-82d08cc3b61b-sda', 'timestamp': '2025-11-29T07:16:48.107539', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e61e8c2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.927983872, 'message_signature': 'bf192a2deb70a197157241c82b760f486c373067bfaa491e777e82590f0cdc27'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:16:48.107539', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e61f4f2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.939744372, 'message_signature': 'ed0353ddb0b696c37910ed12a8846c661d22471127c0e551b2fd1debb8cce738'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:16:48.107539', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e620000-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.939744372, 'message_signature': '3cfbd2b82bc87074b3cfc79c5c1ff5d22ecbb797d75aaa0a369eec3808098d02'}]}, 'timestamp': '2025-11-29 07:16:48.108644', '_unique_id': 'f3be0f29ba184861be304e6db0fa10b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.110 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.110 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.110 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.requests volume: 1095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc7f16b2-b7c6-468f-b648-d08ec701e3d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-vda', 'timestamp': '2025-11-29T07:16:48.110209', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e624754-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': '3d77db5b73a1ef2c04135b59897693734bc2b62ed2ee57491aa542622ccc3183'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 
'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-sda', 'timestamp': '2025-11-29T07:16:48.110209', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6251d6-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': 'e22bb1d49fe010ea9ec281cf8bd6affe952041db798983b1576da9bf6181fe54'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1095, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:16:48.110209', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e625c44-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': '583bc8853e61e11b0ac142d75986a9cfaef47ed883630135931238a9306c43fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:16:48.110209', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6265cc-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': '9826b39fa050700dd1732c3461f606cb4526d2b68257792c2aced84c49a4ed60'}]}, 'timestamp': '2025-11-29 07:16:48.111248', '_unique_id': 'ca923926e751455ca2007bc3a027d5ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.112 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.112 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1305320144>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1942369880>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1305320144>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1942369880>]
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.113 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.write.latency volume: 31456908174 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.113 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.114 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.latency volume: 8874068470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.114 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab22099c-c92c-4787-8923-b8421591568e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31456908174, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-vda', 'timestamp': '2025-11-29T07:16:48.113294', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e62c3fa-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': '9f5e3227152d148bf3aa41ca01605c217bc935adc45025c4019e2fbf82b9fee2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': 
None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-sda', 'timestamp': '2025-11-29T07:16:48.113294', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e62d084-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': 'add9b0d9df00796a7aa8b5f0e34521ced5300f66f72087adfd53cfc7c28683f1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8874068470, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:16:48.113294', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e62db7e-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': '636b2316ee53fd4859a86938f60ba21541f575ef8872aec43da87956cf00e277'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:16:48.113294', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e62e772-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': 'af2e4daa50884f01785e2b0b33b00246e99c2d942c57f23d0b4aa045c8ec608c'}]}, 'timestamp': '2025-11-29 07:16:48.114617', '_unique_id': '4a0c46e7001447c59c5d0563cf8e9a67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.117 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.117 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1305320144>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1942369880>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1305320144>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1942369880>]
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.117 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.118 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18c2d70d-e87c-43c6-844a-1acdcb7f1d24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:48.117700', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e636ee0-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': '3e597b4300bc111925be6512b458e91566e712136b88f4bdf6023cc9640b808e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:48.117700', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e637da4-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': '3536ac34ee55538c0946d7dc87175a5b33b4922d5368044286c21f901f2efd7d'}]}, 'timestamp': '2025-11-29 07:16:48.118494', '_unique_id': '906563e57b9f4dd784b2908f58f24beb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.120 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.121 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76ed0d0c-b423-490e-8c7c-ca741d2a1d7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:48.120790', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e63e82a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': '33d5ada14264ae9837a7ed65914255e302b3b73ace1e80899264a87c1bb7fce1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:48.120790', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e63f824-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': '7c27d65eb410f8f15ec2ee179ad35f7b420ea1b026e10afa446452e983d7b14e'}]}, 'timestamp': '2025-11-29 07:16:48.121607', '_unique_id': '6c198797bd104e249d6990e125167caf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.124 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.124 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.125 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.125 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2a1695c-f95f-41d3-a712-78c75da6fea1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-vda', 'timestamp': '2025-11-29T07:16:48.124260', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e647100-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.927983872, 'message_signature': 'fa8c339e12a4cb31b26439bce252934db1577069838c042f721336208a3bd0b9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 
'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-sda', 'timestamp': '2025-11-29T07:16:48.124260', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6481ea-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.927983872, 'message_signature': 'cc0b6d1c00dbfd2e17ce3f58ea20ec78c4f81b53a486552794c34275f922af3c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:16:48.124260', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e649022-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.939744372, 'message_signature': '6d1a2053dd5620f3b94445b656b6412eb753fe631b3cee2eb4383b8a7c17de68'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:16:48.124260', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e64a1ca-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.939744372, 'message_signature': 'ef1383965097d20e0a3835bf51ba6ef277b084fc3c18352b9d0a84bd1a4779af'}]}, 'timestamp': '2025-11-29 07:16:48.125988', '_unique_id': '31194fa796914d8088a16a3224adbdbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.128 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.129 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f93ef4e0-02a2-4c84-a605-899ac86cbb03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:48.128862', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e6523a2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': 'a1884658083c48cb7cfba71eab05610c202a9b605fe282fccf133b0771839e52'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:48.128862', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e653a36-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': 'e3b26d809bd06adae54613d59cc0f4ea4354c37ee987e12e469408e27e69ff3f'}]}, 'timestamp': '2025-11-29 07:16:48.129882', '_unique_id': 'b5905e86aaee427db687621cfdd59812'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.132 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.132 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.read.latency volume: 203022990 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.132 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.device.read.latency volume: 32360278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.132 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.latency volume: 214833728 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.133 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.latency volume: 23583932 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58a0631b-14eb-4f3a-a78d-3f0967ed8678', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 203022990, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-vda', 'timestamp': '2025-11-29T07:16:48.132117', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e65a066-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': '50607ce5aee6e754de36fdc350140a4f518415ef6ce4852cf614e65fd06c15b9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 32360278, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': 
None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b-sda', 'timestamp': '2025-11-29T07:16:48.132117', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e65b10a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.96682881, 'message_signature': 'f1e24a3f0d0a3090476a8fe02df49d147ece46870a9bec6064647ac6f0414512'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214833728, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:16:48.132117', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e65bfba-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': '6b5c5c41efaa342dea67541ce05a26954e4585614f7ec2a11443b2390e51b76b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23583932, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:16:48.132117', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e65ce74-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.998346964, 'message_signature': '9855c3d240742f44da17118d12c3f8aedbaed1908c222628a1bff5983402b93a'}]}, 'timestamp': '2025-11-29 07:16:48.133659', '_unique_id': '2dc04164469d442ab3776d7a789259a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.135 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.135 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1305320144>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1942369880>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-1305320144>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1942369880>]
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.136 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.136 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d71c069-1a82-431c-96e6-3ce1b8a0e185', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:48.136164', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e663d0a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': '1277010cf9cd03c2a0b35e18580bf7229778081fb5b3b3e49fbf878011b8e4b2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:48.136164', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e664a84-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': 'de2d1938407caf3be7642b5920a5bde8e4b4833fb67cc6515fa7715bcb81ab01'}]}, 'timestamp': '2025-11-29 07:16:48.136822', '_unique_id': '0a5237c5189d43d9b42a7bd80b86f905'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.156 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/memory.usage volume: 42.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.172 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/memory.usage volume: 42.76171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f7477a7-5fc8-4af8-8af1-9b0820973af2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.671875, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'timestamp': '2025-11-29T07:16:48.138689', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5e696cb4-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6071.091391237, 'message_signature': '96b0ce91c1d4ecb236dd1112d6be1cff9397d6e020dede4843ad89012247b2ab'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.76171875, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'timestamp': '2025-11-29T07:16:48.138689', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5e6bcb76-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6071.107172505, 'message_signature': '7b0ed7360eddbeae3a9f404db43c273af66e5239d82c164295bcd602bbf43ed8'}]}, 'timestamp': '2025-11-29 07:16:48.172863', '_unique_id': '1aa30d4eac59469a97e6a600cef11d6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.174 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.174 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/cpu volume: 12190000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.174 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/cpu volume: 12600000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '711b8342-8b2f-4017-a343-9403be6cec1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12190000000, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'timestamp': '2025-11-29T07:16:48.174631', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'instance-00000067', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5e6c1a2c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6071.091391237, 'message_signature': '32dd69344af0f8a7e657e89911098863a0e9844d64d9e9b07d08f7a568a2ca8f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12600000000, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'timestamp': '2025-11-29T07:16:48.174631', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5e6c235a-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6071.107172505, 'message_signature': '8acbf8506d0b743d829e83de62d153b00347d2a7168926151a298b6947515c10'}]}, 'timestamp': '2025-11-29 07:16:48.175094', '_unique_id': 'b354de6bf3c94db0bff60dd01b46782e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.176 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.176 12 DEBUG ceilometer.compute.pollsters [-] 99126e58-be6b-4a8d-bd7e-82d08cc3b61b/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72800c37-ebfb-4999-bd4d-f290cd128b16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000067-99126e58-be6b-4a8d-bd7e-82d08cc3b61b-tap160dd2b8-54', 'timestamp': '2025-11-29T07:16:48.176760', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1305320144', 'name': 'tap160dd2b8-54', 'instance_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:55:12:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap160dd2b8-54'}, 'message_id': '5e6c6fa4-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.910654403, 'message_signature': 'f2e143b17b24082d9a98a67989fa13402f46243d2e4a43a9ec3c5824b79c3a16'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:16:48.176760', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': '5e6c7c38-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6070.918355187, 'message_signature': '470ca290548640946392422588faae4f5e5aebe60acbc0f73449a4c927084941'}]}, 'timestamp': '2025-11-29 07:16:48.177393', '_unique_id': '176f36c77382466d99ab3b4a77ab31e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:16:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539504 nova_compute[187152]: 2025-11-29 07:16:48.382 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:49 np0005539504 podman[231948]: 2025-11-29 07:16:49.724597458 +0000 UTC m=+0.068998058 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:16:50 np0005539504 nova_compute[187152]: 2025-11-29 07:16:50.475 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:51 np0005539504 nova_compute[187152]: 2025-11-29 07:16:51.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:53 np0005539504 nova_compute[187152]: 2025-11-29 07:16:53.386 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:55 np0005539504 nova_compute[187152]: 2025-11-29 07:16:55.478 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:58 np0005539504 nova_compute[187152]: 2025-11-29 07:16:58.390 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:58 np0005539504 nova_compute[187152]: 2025-11-29 07:16:58.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:00 np0005539504 nova_compute[187152]: 2025-11-29 07:17:00.482 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:00 np0005539504 nova_compute[187152]: 2025-11-29 07:17:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:01 np0005539504 nova_compute[187152]: 2025-11-29 07:17:01.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:01 np0005539504 nova_compute[187152]: 2025-11-29 07:17:01.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:02 np0005539504 podman[231971]: 2025-11-29 07:17:02.745309622 +0000 UTC m=+0.058374767 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 02:17:02 np0005539504 podman[231970]: 2025-11-29 07:17:02.745503027 +0000 UTC m=+0.070300902 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:17:02 np0005539504 podman[231969]: 2025-11-29 07:17:02.765257229 +0000 UTC m=+0.090069784 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:17:03 np0005539504 nova_compute[187152]: 2025-11-29 07:17:03.394 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539504 nova_compute[187152]: 2025-11-29 07:17:03.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:03 np0005539504 nova_compute[187152]: 2025-11-29 07:17:03.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:17:04 np0005539504 nova_compute[187152]: 2025-11-29 07:17:04.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:05 np0005539504 nova_compute[187152]: 2025-11-29 07:17:05.484 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:05 np0005539504 podman[232034]: 2025-11-29 07:17:05.720822693 +0000 UTC m=+0.062190477 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:17:05 np0005539504 podman[232035]: 2025-11-29 07:17:05.765456355 +0000 UTC m=+0.098313723 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:17:08 np0005539504 nova_compute[187152]: 2025-11-29 07:17:08.396 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:08 np0005539504 nova_compute[187152]: 2025-11-29 07:17:08.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:08 np0005539504 nova_compute[187152]: 2025-11-29 07:17:08.991 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:08 np0005539504 nova_compute[187152]: 2025-11-29 07:17:08.992 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:08 np0005539504 nova_compute[187152]: 2025-11-29 07:17:08.992 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:08 np0005539504 nova_compute[187152]: 2025-11-29 07:17:08.992 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.109 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.167 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.168 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.222 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.227 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.279 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.280 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.335 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.529 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.531 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5420MB free_disk=73.13503646850586GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.531 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:09 np0005539504 nova_compute[187152]: 2025-11-29 07:17:09.531 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.392 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 531c3d01-115b-479d-bbdc-11e38bc8b0b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.392 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 99126e58-be6b-4a8d-bd7e-82d08cc3b61b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.393 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.393 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.485 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.576 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.683 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:17:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:17:10Z|00374|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.825 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.841 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:17:10 np0005539504 nova_compute[187152]: 2025-11-29 07:17:10.841 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:12 np0005539504 nova_compute[187152]: 2025-11-29 07:17:12.843 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:12 np0005539504 nova_compute[187152]: 2025-11-29 07:17:12.843 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:17:12 np0005539504 nova_compute[187152]: 2025-11-29 07:17:12.844 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:17:13 np0005539504 nova_compute[187152]: 2025-11-29 07:17:13.400 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:14 np0005539504 podman[232099]: 2025-11-29 07:17:14.737589715 +0000 UTC m=+0.076834705 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:17:15 np0005539504 nova_compute[187152]: 2025-11-29 07:17:15.488 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:15 np0005539504 nova_compute[187152]: 2025-11-29 07:17:15.848 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:17:15 np0005539504 nova_compute[187152]: 2025-11-29 07:17:15.848 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:17:15 np0005539504 nova_compute[187152]: 2025-11-29 07:17:15.848 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:17:15 np0005539504 nova_compute[187152]: 2025-11-29 07:17:15.849 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 531c3d01-115b-479d-bbdc-11e38bc8b0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:18 np0005539504 nova_compute[187152]: 2025-11-29 07:17:18.404 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:18 np0005539504 nova_compute[187152]: 2025-11-29 07:17:18.472 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updating instance_info_cache with network_info: [{"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:18 np0005539504 nova_compute[187152]: 2025-11-29 07:17:18.491 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:17:18 np0005539504 nova_compute[187152]: 2025-11-29 07:17:18.491 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:17:18 np0005539504 nova_compute[187152]: 2025-11-29 07:17:18.492 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:19 np0005539504 nova_compute[187152]: 2025-11-29 07:17:19.081 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:19 np0005539504 nova_compute[187152]: 2025-11-29 07:17:19.082 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:17:20 np0005539504 nova_compute[187152]: 2025-11-29 07:17:20.489 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:20 np0005539504 podman[232132]: 2025-11-29 07:17:20.70538972 +0000 UTC m=+0.054174874 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:17:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:22.951 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:22.952 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:22.954 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:23 np0005539504 nova_compute[187152]: 2025-11-29 07:17:23.408 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:23 np0005539504 nova_compute[187152]: 2025-11-29 07:17:23.962 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:23.963 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:23.964 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:17:25 np0005539504 nova_compute[187152]: 2025-11-29 07:17:25.491 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:26 np0005539504 nova_compute[187152]: 2025-11-29 07:17:26.646 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:27 np0005539504 nova_compute[187152]: 2025-11-29 07:17:27.261 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:27 np0005539504 nova_compute[187152]: 2025-11-29 07:17:27.262 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:17:27 np0005539504 nova_compute[187152]: 2025-11-29 07:17:27.282 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:17:28 np0005539504 nova_compute[187152]: 2025-11-29 07:17:28.412 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:28.967 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:30 np0005539504 nova_compute[187152]: 2025-11-29 07:17:30.545 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:33 np0005539504 nova_compute[187152]: 2025-11-29 07:17:33.415 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:33 np0005539504 podman[232153]: 2025-11-29 07:17:33.750643163 +0000 UTC m=+0.074675027 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:17:33 np0005539504 podman[232154]: 2025-11-29 07:17:33.750960072 +0000 UTC m=+0.075550070 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc.)
Nov 29 02:17:33 np0005539504 podman[232155]: 2025-11-29 07:17:33.762676712 +0000 UTC m=+0.078649572 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 02:17:35 np0005539504 nova_compute[187152]: 2025-11-29 07:17:35.547 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:36 np0005539504 podman[232220]: 2025-11-29 07:17:36.722517459 +0000 UTC m=+0.061974471 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:17:36 np0005539504 podman[232221]: 2025-11-29 07:17:36.761741677 +0000 UTC m=+0.092284534 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:17:38 np0005539504 nova_compute[187152]: 2025-11-29 07:17:38.418 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:40 np0005539504 nova_compute[187152]: 2025-11-29 07:17:40.549 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:41 np0005539504 nova_compute[187152]: 2025-11-29 07:17:41.260 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:43 np0005539504 nova_compute[187152]: 2025-11-29 07:17:43.422 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:45 np0005539504 nova_compute[187152]: 2025-11-29 07:17:45.551 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:45 np0005539504 podman[232271]: 2025-11-29 07:17:45.712008079 +0000 UTC m=+0.057814071 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:17:48 np0005539504 nova_compute[187152]: 2025-11-29 07:17:48.427 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:48 np0005539504 nova_compute[187152]: 2025-11-29 07:17:48.565 187156 DEBUG oslo_concurrency.lockutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:48 np0005539504 nova_compute[187152]: 2025-11-29 07:17:48.566 187156 DEBUG oslo_concurrency.lockutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:48 np0005539504 nova_compute[187152]: 2025-11-29 07:17:48.566 187156 INFO nova.compute.manager [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Shelving#033[00m
Nov 29 02:17:48 np0005539504 nova_compute[187152]: 2025-11-29 07:17:48.917 187156 DEBUG nova.virt.libvirt.driver [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:17:50 np0005539504 nova_compute[187152]: 2025-11-29 07:17:50.552 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539504 kernel: tap160dd2b8-54 (unregistering): left promiscuous mode
Nov 29 02:17:51 np0005539504 NetworkManager[55210]: <info>  [1764400671.4810] device (tap160dd2b8-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.487 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:17:51Z|00375|binding|INFO|Releasing lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 from this chassis (sb_readonly=0)
Nov 29 02:17:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:17:51Z|00376|binding|INFO|Setting lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 down in Southbound
Nov 29 02:17:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:17:51Z|00377|binding|INFO|Removing iface tap160dd2b8-54 ovn-installed in OVS
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.489 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.504 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539504 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000067.scope: Deactivated successfully.
Nov 29 02:17:51 np0005539504 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000067.scope: Consumed 17.123s CPU time.
Nov 29 02:17:51 np0005539504 systemd-machined[153423]: Machine qemu-51-instance-00000067 terminated.
Nov 29 02:17:51 np0005539504 podman[232293]: 2025-11-29 07:17:51.588795426 +0000 UTC m=+0.077980276 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.662 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:12:21 10.100.0.12'], port_security=['fa:16:3e:55:12:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '490b426d-026a-4a21-8c41-f013fe0c1458', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=160dd2b8-54e7-490c-8d0e-b15f57edcc04) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.665 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 160dd2b8-54e7-490c-8d0e-b15f57edcc04 in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 unbound from our chassis#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.668 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df7cfc35-3f76-45b2-b70c-e4525d38f410#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.693 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[85ad8e22-60b9-4be3-97bd-d5f86d9b1677]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539504 kernel: tap160dd2b8-54: entered promiscuous mode
Nov 29 02:17:51 np0005539504 kernel: tap160dd2b8-54 (unregistering): left promiscuous mode
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.730 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.739 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5f8027-abad-4344-a783-9a8f511f3974]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.744 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[edea5a99-b64a-4a9a-995a-08a48266b69e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.774 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[229238c1-816f-4578-816b-743e3bb8dfb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.791 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5bac83f4-16fc-4e21-9072-19c85a19b04d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583826, 'reachable_time': 26786, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232339, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.806 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f89fb3-a038-4485-b6fc-2f0656ac55ec]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdf7cfc35-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583837, 'tstamp': 583837}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232340, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdf7cfc35-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 583840, 'tstamp': 583840}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232340, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.808 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.810 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.814 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.815 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf7cfc35-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.815 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.816 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf7cfc35-30, col_values=(('external_ids', {'iface-id': 'cab31803-36dd-4107-bb9e-3d36862142c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:17:51.816 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.940 187156 INFO nova.virt.libvirt.driver [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.946 187156 INFO nova.virt.libvirt.driver [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance destroyed successfully.#033[00m
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.947 187156 DEBUG nova.objects.instance [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'numa_topology' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:51 np0005539504 nova_compute[187152]: 2025-11-29 07:17:51.957 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:52 np0005539504 nova_compute[187152]: 2025-11-29 07:17:52.365 187156 INFO nova.virt.libvirt.driver [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Beginning cold snapshot process#033[00m
Nov 29 02:17:53 np0005539504 nova_compute[187152]: 2025-11-29 07:17:53.429 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:54 np0005539504 nova_compute[187152]: 2025-11-29 07:17:54.951 187156 DEBUG nova.privsep.utils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:17:54 np0005539504 nova_compute[187152]: 2025-11-29 07:17:54.951 187156 DEBUG oslo_concurrency.processutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk /var/lib/nova/instances/snapshots/tmp2x12jmu9/1bcc3aab9c764013a612d7751df9a83e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:55 np0005539504 nova_compute[187152]: 2025-11-29 07:17:55.554 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:56 np0005539504 nova_compute[187152]: 2025-11-29 07:17:56.815 187156 DEBUG oslo_concurrency.processutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk /var/lib/nova/instances/snapshots/tmp2x12jmu9/1bcc3aab9c764013a612d7751df9a83e" returned: 0 in 1.864s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:56 np0005539504 nova_compute[187152]: 2025-11-29 07:17:56.817 187156 INFO nova.virt.libvirt.driver [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.384 187156 DEBUG nova.compute.manager [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-unplugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.386 187156 DEBUG oslo_concurrency.lockutils [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.387 187156 DEBUG oslo_concurrency.lockutils [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.388 187156 DEBUG oslo_concurrency.lockutils [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.388 187156 DEBUG nova.compute.manager [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] No waiting events found dispatching network-vif-unplugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.388 187156 WARNING nova.compute.manager [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received unexpected event network-vif-unplugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.388 187156 DEBUG nova.compute.manager [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.389 187156 DEBUG oslo_concurrency.lockutils [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.389 187156 DEBUG oslo_concurrency.lockutils [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.389 187156 DEBUG oslo_concurrency.lockutils [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.389 187156 DEBUG nova.compute.manager [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] No waiting events found dispatching network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.389 187156 WARNING nova.compute.manager [req-08ff218b-2301-47a3-8ee3-d8159819dff5 req-c52917fe-a371-462a-979c-84f927fbf536 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received unexpected event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Nov 29 02:17:58 np0005539504 nova_compute[187152]: 2025-11-29 07:17:58.433 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:00 np0005539504 nova_compute[187152]: 2025-11-29 07:18:00.555 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:00 np0005539504 nova_compute[187152]: 2025-11-29 07:18:00.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:01 np0005539504 nova_compute[187152]: 2025-11-29 07:18:01.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:02 np0005539504 nova_compute[187152]: 2025-11-29 07:18:02.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:03 np0005539504 nova_compute[187152]: 2025-11-29 07:18:03.436 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:03.522 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:18:03 np0005539504 nova_compute[187152]: 2025-11-29 07:18:03.523 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:03.523 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:18:03 np0005539504 nova_compute[187152]: 2025-11-29 07:18:03.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:03 np0005539504 nova_compute[187152]: 2025-11-29 07:18:03.975 187156 INFO nova.virt.libvirt.driver [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Snapshot image upload complete#033[00m
Nov 29 02:18:03 np0005539504 nova_compute[187152]: 2025-11-29 07:18:03.976 187156 DEBUG nova.compute.manager [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:18:04 np0005539504 podman[232352]: 2025-11-29 07:18:04.708201842 +0000 UTC m=+0.054516904 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:18:04 np0005539504 podman[232354]: 2025-11-29 07:18:04.708201392 +0000 UTC m=+0.047567430 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:18:04 np0005539504 podman[232353]: 2025-11-29 07:18:04.7391038 +0000 UTC m=+0.082312060 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:18:04 np0005539504 nova_compute[187152]: 2025-11-29 07:18:04.824 187156 INFO nova.compute.manager [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Shelve offloading#033[00m
Nov 29 02:18:04 np0005539504 nova_compute[187152]: 2025-11-29 07:18:04.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:04 np0005539504 nova_compute[187152]: 2025-11-29 07:18:04.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:18:05 np0005539504 nova_compute[187152]: 2025-11-29 07:18:05.091 187156 INFO nova.virt.libvirt.driver [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance destroyed successfully.#033[00m
Nov 29 02:18:05 np0005539504 nova_compute[187152]: 2025-11-29 07:18:05.091 187156 DEBUG nova.compute.manager [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:18:05 np0005539504 nova_compute[187152]: 2025-11-29 07:18:05.094 187156 DEBUG oslo_concurrency.lockutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:18:05 np0005539504 nova_compute[187152]: 2025-11-29 07:18:05.095 187156 DEBUG oslo_concurrency.lockutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:18:05 np0005539504 nova_compute[187152]: 2025-11-29 07:18:05.095 187156 DEBUG nova.network.neutron [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:18:05 np0005539504 nova_compute[187152]: 2025-11-29 07:18:05.556 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:05 np0005539504 nova_compute[187152]: 2025-11-29 07:18:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:06 np0005539504 nova_compute[187152]: 2025-11-29 07:18:06.767 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400671.7654846, 99126e58-be6b-4a8d-bd7e-82d08cc3b61b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:18:06 np0005539504 nova_compute[187152]: 2025-11-29 07:18:06.768 187156 INFO nova.compute.manager [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:18:06 np0005539504 nova_compute[187152]: 2025-11-29 07:18:06.882 187156 DEBUG nova.compute.manager [None req-a85f8d7f-e57d-4f22-a698-233a0a9a8c4e - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:18:06 np0005539504 nova_compute[187152]: 2025-11-29 07:18:06.885 187156 DEBUG nova.compute.manager [None req-a85f8d7f-e57d-4f22-a698-233a0a9a8c4e - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:18:07 np0005539504 nova_compute[187152]: 2025-11-29 07:18:07.303 187156 INFO nova.compute.manager [None req-a85f8d7f-e57d-4f22-a698-233a0a9a8c4e - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Nov 29 02:18:07 np0005539504 podman[232414]: 2025-11-29 07:18:07.745132399 +0000 UTC m=+0.079504946 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:18:07 np0005539504 podman[232415]: 2025-11-29 07:18:07.784361297 +0000 UTC m=+0.115858378 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:18:08 np0005539504 nova_compute[187152]: 2025-11-29 07:18:08.438 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:08 np0005539504 nova_compute[187152]: 2025-11-29 07:18:08.515 187156 DEBUG nova.network.neutron [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:18:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:09.527 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:18:09 np0005539504 nova_compute[187152]: 2025-11-29 07:18:09.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:10 np0005539504 nova_compute[187152]: 2025-11-29 07:18:10.253 187156 DEBUG oslo_concurrency.lockutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:18:10 np0005539504 nova_compute[187152]: 2025-11-29 07:18:10.559 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:10 np0005539504 nova_compute[187152]: 2025-11-29 07:18:10.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:10 np0005539504 nova_compute[187152]: 2025-11-29 07:18:10.936 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:18:12 np0005539504 nova_compute[187152]: 2025-11-29 07:18:12.017 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:18:12 np0005539504 nova_compute[187152]: 2025-11-29 07:18:12.018 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:18:12 np0005539504 nova_compute[187152]: 2025-11-29 07:18:12.018 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:18:13 np0005539504 nova_compute[187152]: 2025-11-29 07:18:13.443 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:15 np0005539504 nova_compute[187152]: 2025-11-29 07:18:15.560 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:16 np0005539504 nova_compute[187152]: 2025-11-29 07:18:16.205 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:16 np0005539504 nova_compute[187152]: 2025-11-29 07:18:16.205 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:16 np0005539504 podman[232464]: 2025-11-29 07:18:16.731244107 +0000 UTC m=+0.073142956 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:18:17 np0005539504 nova_compute[187152]: 2025-11-29 07:18:17.180 187156 DEBUG nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:18:18 np0005539504 nova_compute[187152]: 2025-11-29 07:18:18.447 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:19 np0005539504 nova_compute[187152]: 2025-11-29 07:18:19.211 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:19 np0005539504 nova_compute[187152]: 2025-11-29 07:18:19.212 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:19 np0005539504 nova_compute[187152]: 2025-11-29 07:18:19.219 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:18:19 np0005539504 nova_compute[187152]: 2025-11-29 07:18:19.219 187156 INFO nova.compute.claims [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:18:19 np0005539504 nova_compute[187152]: 2025-11-29 07:18:19.487 187156 DEBUG nova.compute.provider_tree [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:18:19 np0005539504 nova_compute[187152]: 2025-11-29 07:18:19.556 187156 DEBUG nova.scheduler.client.report [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:18:19 np0005539504 nova_compute[187152]: 2025-11-29 07:18:19.841 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:19 np0005539504 nova_compute[187152]: 2025-11-29 07:18:19.842 187156 DEBUG nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.072 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.213 187156 DEBUG nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.213 187156 DEBUG nova.network.neutron [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.237 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.237 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.238 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.239 187156 INFO nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.269 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.269 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.270 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.270 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.271 187156 DEBUG nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.562 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.693 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.772 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.773 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.838 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.844 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.912 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.913 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.955 187156 INFO nova.virt.libvirt.driver [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance destroyed successfully.#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.956 187156 DEBUG nova.objects.instance [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'resources' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.974 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.990 187156 DEBUG nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.991 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.992 187156 INFO nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Creating image(s)#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.992 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.993 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:20 np0005539504 nova_compute[187152]: 2025-11-29 07:18:20.993 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.008 187156 DEBUG nova.policy [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.011 187156 DEBUG nova.virt.libvirt.vif [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1305320144',display_name='tempest-ServerActionsTestOtherB-server-1305320144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1305320144',id=103,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:16:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-18bazq8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member',shelved_at='2025-11-29T07:18:03.976116',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='49c6be0d-bc35-4118-ad98-05409ab0466a'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:17:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=99126e58-be6b-4a8d-bd7e-82d08cc3b61b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.012 187156 DEBUG nova.network.os_vif_util [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.013 187156 DEBUG nova.network.os_vif_util [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.013 187156 DEBUG os_vif [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.015 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.016 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap160dd2b8-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.017 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.045 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.048 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.052 187156 INFO os_vif [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54')#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.053 187156 INFO nova.virt.libvirt.driver [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Deleting instance files /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b_del#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.059 187156 INFO nova.virt.libvirt.driver [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Deletion of /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b_del complete#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.082 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.082 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.083 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.093 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.189 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.190 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.283 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.285 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5545MB free_disk=73.13509368896484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.286 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.286 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.287 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk 1073741824" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.288 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.288 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.342 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.343 187156 DEBUG nova.virt.disk.api [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Checking if we can resize image /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.344 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.390 187156 DEBUG nova.compute.manager [req-40c59210-0828-488d-b701-eae6035580e6 req-2ff412e9-cb48-4a4f-acf7-aec473e3e2a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-changed-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.391 187156 DEBUG nova.compute.manager [req-40c59210-0828-488d-b701-eae6035580e6 req-2ff412e9-cb48-4a4f-acf7-aec473e3e2a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Refreshing instance network info cache due to event network-changed-160dd2b8-54e7-490c-8d0e-b15f57edcc04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.391 187156 DEBUG oslo_concurrency.lockutils [req-40c59210-0828-488d-b701-eae6035580e6 req-2ff412e9-cb48-4a4f-acf7-aec473e3e2a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.392 187156 DEBUG oslo_concurrency.lockutils [req-40c59210-0828-488d-b701-eae6035580e6 req-2ff412e9-cb48-4a4f-acf7-aec473e3e2a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.392 187156 DEBUG nova.network.neutron [req-40c59210-0828-488d-b701-eae6035580e6 req-2ff412e9-cb48-4a4f-acf7-aec473e3e2a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Refreshing network info cache for port 160dd2b8-54e7-490c-8d0e-b15f57edcc04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.400 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.401 187156 DEBUG nova.virt.disk.api [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Cannot resize image /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.401 187156 DEBUG nova.objects.instance [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'migration_context' on Instance uuid d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.433 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.434 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Ensure instance console log exists: /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.434 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.434 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.435 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:21 np0005539504 podman[232512]: 2025-11-29 07:18:21.714164623 +0000 UTC m=+0.061334664 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.742 187156 INFO nova.scheduler.client.report [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Deleted allocations for instance 99126e58-be6b-4a8d-bd7e-82d08cc3b61b#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.808 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 531c3d01-115b-479d-bbdc-11e38bc8b0b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.809 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.809 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.809 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.936 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.953 187156 DEBUG oslo_concurrency.lockutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:21 np0005539504 nova_compute[187152]: 2025-11-29 07:18:21.957 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:18:22 np0005539504 nova_compute[187152]: 2025-11-29 07:18:22.106 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:18:22 np0005539504 nova_compute[187152]: 2025-11-29 07:18:22.106 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.820s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:22 np0005539504 nova_compute[187152]: 2025-11-29 07:18:22.108 187156 DEBUG oslo_concurrency.lockutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:22 np0005539504 nova_compute[187152]: 2025-11-29 07:18:22.195 187156 DEBUG nova.compute.provider_tree [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:18:22 np0005539504 nova_compute[187152]: 2025-11-29 07:18:22.257 187156 DEBUG nova.scheduler.client.report [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:18:22 np0005539504 nova_compute[187152]: 2025-11-29 07:18:22.289 187156 DEBUG oslo_concurrency.lockutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:22 np0005539504 nova_compute[187152]: 2025-11-29 07:18:22.822 187156 DEBUG oslo_concurrency.lockutils [None req-082cfb58-8fad-4552-9741-e24c41ccd77c ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 34.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:22.952 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:22.953 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:22.953 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:25 np0005539504 nova_compute[187152]: 2025-11-29 07:18:25.611 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:26 np0005539504 nova_compute[187152]: 2025-11-29 07:18:26.045 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:30 np0005539504 nova_compute[187152]: 2025-11-29 07:18:30.031 187156 DEBUG nova.network.neutron [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Successfully created port: c67ec97e-5456-48e8-9e98-b9075cc0b2aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:18:30 np0005539504 nova_compute[187152]: 2025-11-29 07:18:30.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:31 np0005539504 nova_compute[187152]: 2025-11-29 07:18:31.048 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:31 np0005539504 nova_compute[187152]: 2025-11-29 07:18:31.066 187156 DEBUG nova.network.neutron [req-40c59210-0828-488d-b701-eae6035580e6 req-2ff412e9-cb48-4a4f-acf7-aec473e3e2a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updated VIF entry in instance network info cache for port 160dd2b8-54e7-490c-8d0e-b15f57edcc04. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:18:31 np0005539504 nova_compute[187152]: 2025-11-29 07:18:31.066 187156 DEBUG nova.network.neutron [req-40c59210-0828-488d-b701-eae6035580e6 req-2ff412e9-cb48-4a4f-acf7-aec473e3e2a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": null, "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap160dd2b8-54", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:18:35 np0005539504 nova_compute[187152]: 2025-11-29 07:18:35.681 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:35 np0005539504 podman[232533]: 2025-11-29 07:18:35.787058668 +0000 UTC m=+0.081832418 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public)
Nov 29 02:18:35 np0005539504 podman[232532]: 2025-11-29 07:18:35.798553172 +0000 UTC m=+0.092517050 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:18:35 np0005539504 podman[232534]: 2025-11-29 07:18:35.809348998 +0000 UTC m=+0.088374191 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 29 02:18:36 np0005539504 nova_compute[187152]: 2025-11-29 07:18:36.055 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:38 np0005539504 podman[232593]: 2025-11-29 07:18:38.77697595 +0000 UTC m=+0.105059432 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:18:38 np0005539504 podman[232594]: 2025-11-29 07:18:38.789192574 +0000 UTC m=+0.112715045 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 29 02:18:39 np0005539504 nova_compute[187152]: 2025-11-29 07:18:39.825 187156 DEBUG oslo_concurrency.lockutils [req-40c59210-0828-488d-b701-eae6035580e6 req-2ff412e9-cb48-4a4f-acf7-aec473e3e2a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:18:40 np0005539504 nova_compute[187152]: 2025-11-29 07:18:40.683 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:41 np0005539504 nova_compute[187152]: 2025-11-29 07:18:41.057 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:45 np0005539504 nova_compute[187152]: 2025-11-29 07:18:45.684 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:46 np0005539504 nova_compute[187152]: 2025-11-29 07:18:46.059 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:46 np0005539504 nova_compute[187152]: 2025-11-29 07:18:46.203 187156 DEBUG nova.network.neutron [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Successfully updated port: c67ec97e-5456-48e8-9e98-b9075cc0b2aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:18:46 np0005539504 nova_compute[187152]: 2025-11-29 07:18:46.317 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:18:46 np0005539504 nova_compute[187152]: 2025-11-29 07:18:46.317 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:18:46 np0005539504 nova_compute[187152]: 2025-11-29 07:18:46.317 187156 DEBUG nova.network.neutron [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:18:46 np0005539504 nova_compute[187152]: 2025-11-29 07:18:46.341 187156 DEBUG nova.compute.manager [req-3aa9cf85-28a8-47b1-ba6f-a31907bbd494 req-9a3ef09a-568a-492f-8a58-159f38ac6365 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-changed-c67ec97e-5456-48e8-9e98-b9075cc0b2aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:18:46 np0005539504 nova_compute[187152]: 2025-11-29 07:18:46.342 187156 DEBUG nova.compute.manager [req-3aa9cf85-28a8-47b1-ba6f-a31907bbd494 req-9a3ef09a-568a-492f-8a58-159f38ac6365 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing instance network info cache due to event network-changed-c67ec97e-5456-48e8-9e98-b9075cc0b2aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:18:46 np0005539504 nova_compute[187152]: 2025-11-29 07:18:46.342 187156 DEBUG oslo_concurrency.lockutils [req-3aa9cf85-28a8-47b1-ba6f-a31907bbd494 req-9a3ef09a-568a-492f-8a58-159f38ac6365 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:18:46 np0005539504 nova_compute[187152]: 2025-11-29 07:18:46.987 187156 DEBUG nova.network.neutron [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:18:47 np0005539504 podman[232639]: 2025-11-29 07:18:47.721788297 +0000 UTC m=+0.067771035 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:18:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:47.974 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000063', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'hostId': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:18:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:47.976 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.004 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.requests volume: 1095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.006 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '432641b2-06f4-4fef-9204-8097b57ad1eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1095, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:18:47.977304', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5d8dcd8-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': 'f0a4706d16d9a646b17692ca22b3f632605238d81ff8ec05be9e321d88df1801'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 
'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:18:47.977304', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5d8fe5c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': '9d9299598ddc76a25e2c5623ab7d315781983b63e8259841781244ad1b26cbb4'}]}, 'timestamp': '2025-11-29 07:18:48.006855', '_unique_id': '55ffd964bc5a47bebc1096a94053a72d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.011 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.013 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.requests volume: 331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.014 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11ce18eb-1a30-4e54-b8a7-eab2bcd75d2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 331, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:18:48.013663', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5da1c7e-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': 'b4d955c861962b1c8798ffea869ca2ddfaad5f174146e4fb499c47c4228466bc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 
'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:18:48.013663', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5da2ee4-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': 'bbc020d83dc69f887152f06194f1b474778c209c3542be04a80a456d328eb73c'}]}, 'timestamp': '2025-11-29 07:18:48.014618', '_unique_id': '02a76ec9cbdd43d085f0f584f486c2f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.015 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.020 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a84dd969-b86e-4e4b-98a0-cd4607baef94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.017197', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5db2146-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': '26e3a1103369a63f34f38c312808ece65476a3078f60b3e764e5b0d4b3fa5532'}]}, 'timestamp': '2025-11-29 07:18:48.020827', '_unique_id': '446b5e0882d24defb25e10f813a379e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.021 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.023 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.023 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.latency volume: 8874068470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.023 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5132609e-8cfb-46e3-968c-7f033b50f9f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8874068470, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:18:48.023426', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5db9928-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': 'f11e0458ae14346be73e2fcac5104ca13ec227e731b7076459623d432f1f8e28'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:18:48.023426', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5dba788-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': '1255b8ee219dc583f4f38a034a7ad34c8ce7bf2df566f905ce4fec1724b3850c'}]}, 'timestamp': '2025-11-29 07:18:48.024199', '_unique_id': '46e35bd9316042a79079d0ed3d9e7b5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.025 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.026 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.026 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2f78965-82c9-4953-8e87-519c6037467a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.026704', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5dc1880-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': '13602a2ab28109ccc45cc6b008efec9855ea3fc0a4a18a5d190171e46ab5fc69'}]}, 'timestamp': '2025-11-29 07:18:48.027109', '_unique_id': 'f2841d9403f64240a634736e4f54ff60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.027 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.029 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.029 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fe66fa0-048e-4c03-a9e2-54f1007c5060', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.029592', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5dc8b12-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': 'd030ecdc059e957198a2b27627d91d3bf306dded1762dc90ecc18c224a403f4b'}]}, 'timestamp': '2025-11-29 07:18:48.030043', '_unique_id': '7c56ea70a1164ab9b2e2820d87c8b572'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.032 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.bytes volume: 1682 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '058dd594-2335-417a-b750-36bd5138a60d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1682, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.032649', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5dd025e-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': '82071b31c98299218321778a345e842463d0acc014938c4996df0c2565a3b1dc'}]}, 'timestamp': '2025-11-29 07:18:48.033143', '_unique_id': 'ca4944a2f6fb454092f5b91f75675c7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.034 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.035 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.050 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/memory.usage volume: 42.76171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bf932a1-c817-410c-9332-7077dce2614d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.76171875, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'timestamp': '2025-11-29T07:18:48.036065', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a5dfd416-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.984981823, 'message_signature': '75850f16abe83a464113000c82f96f6a84b1f93a060f10a347800f305f37f797'}]}, 'timestamp': '2025-11-29 07:18:48.051683', '_unique_id': '800995148d684f5db73c9260080fd725'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.054 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.055 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '390ad370-c4a5-4abb-af76-c9a291ca639c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:18:48.054734', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e05e18-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': 'e0e8df7aaa36d8aeceeb1db05d92585f4f0b6b199e1e5b445efd849e5d1da7cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 
'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:18:48.054734', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e069ee-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': '683c175b6835de017486e48afd949b5e24ca419f2a2a734abea2953f7ae30c6d'}]}, 'timestamp': '2025-11-29 07:18:48.055350', '_unique_id': '1715d3f2959d4c39bbf401360fea95c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.057 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.057 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/cpu volume: 13230000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '575a850a-da35-44c3-a9d4-c34736c52a7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13230000000, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'timestamp': '2025-11-29T07:18:48.057207', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a5e0c088-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.984981823, 'message_signature': '9557202f25014c3a8905a1674141d71457b6937aefb4278da7c717c5a0dd7a65'}]}, 'timestamp': '2025-11-29 07:18:48.057587', '_unique_id': '3d6ebb8828c0425ea8e6253062101edb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.059 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.059 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.bytes volume: 30517760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.059 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fce90f88-dfb2-4b8e-90b9-c85143a70c04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30517760, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:18:48.059203', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e10c3c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': '69dbd93c48f230abd2a39d0abde4a20759c5500afdbc0abab56aab9b9eb68c66'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:18:48.059203', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e11880-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': '1cd86ad54381fc75f731e60beef4069fe976b472909fb88bb61d56fdd02b9762'}]}, 'timestamp': '2025-11-29 07:18:48.059850', '_unique_id': '5c95fd41615b486091c7ab60e14bec11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.061 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.packets volume: 18 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cbbf306-518c-43ca-9bc0-4b12e4f70dc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 18, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.061932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5e176cc-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': '9cb8945d99c960cda90aa6466287f036a2fb083eaa4ed1c267b4827db285b6ea'}]}, 'timestamp': '2025-11-29 07:18:48.062251', '_unique_id': 'e34c406ad20f4d63b204f31999d08389'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.063 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.064 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.064 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.bytes.delta volume: 84 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '148a9bbb-421d-4fca-84d5-79776046124d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 84, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.064281', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5e1d3ce-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': '5fee7d2b2fc0be5925b429683b877494674f41ea0cfd03756291fd154cb6dbb7'}]}, 'timestamp': '2025-11-29 07:18:48.064689', '_unique_id': 'a18660e97d3942c5a439b9c3e261e152'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.067 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ff85ad5-83e2-4b1a-b5b3-9e470879bb13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.066971', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5e23ce2-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': '6b6b096dd46da91dea690c65083b9fa84eaf33a6e73286fc1c5d7189e1393267'}]}, 'timestamp': '2025-11-29 07:18:48.067365', '_unique_id': 'e9dfab5e14a74794ad2ebd6090fe39d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.070 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.070 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.latency volume: 214833728 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.070 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.read.latency volume: 23583932 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fda4fa75-4579-4328-94f3-a0a9d491906a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214833728, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:18:48.070251', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e2bf82-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': 'f950dd0d93029ec1bb53b982b414d4f25e01df8b17a5c3ca11ec1779f03286ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23583932, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 
'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:18:48.070251', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e2ce82-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.912060133, 'message_signature': 'cab79d018db602af17d019ce68c205c3402e4d9cf36f6756636d95b9dca5aef5'}]}, 'timestamp': '2025-11-29 07:18:48.071064', '_unique_id': '28c1eb844de14df7aba18f9a76b7c2d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.074 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.088 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.089 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f14fa119-5e24-43fa-b381-7a990b06a2ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:18:48.074699', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e59f2c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6191.009474622, 'message_signature': 'cd2c6f6b6c40724386b3061b903e1c257d34eeb780670e797516a369b2258147'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 
'531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:18:48.074699', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e5b0fc-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6191.009474622, 'message_signature': '0bf3d057588f9ee2d081a2e88d08ced73038d89888efcc8eaf1512013122d039'}]}, 'timestamp': '2025-11-29 07:18:48.089951', '_unique_id': '2ae6e3445d824b0c8c1799ab6c0b0749'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.092 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cca43aa8-52f0-4d42-afa4-d09604a9c1e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.092488', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5e62140-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': '769c392b41eb124750ae181870bbf103a9a9885c048cfc02d43566e04deeee45'}]}, 'timestamp': '2025-11-29 07:18:48.092887', '_unique_id': '488c9d11597e47ba84d48c0f3cfb1c1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.095 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.095 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6b39cd4-429b-4503-a0ca-744f6a399b04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:18:48.095063', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e6850e-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6191.009474622, 'message_signature': '1c399c613f278d9fe6b0adadb66a26513bb91bc0b1eba5d0f493f3b4b3d57c43'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:18:48.095063', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e690da-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6191.009474622, 'message_signature': 'ff35eda6480787d6f845f0ee66f3bd333361d647cd6b47a784d9a1da21f70949'}]}, 'timestamp': '2025-11-29 07:18:48.095665', '_unique_id': '17ebb1367234438d920b2af05df56984'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.097 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.097 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a3e3e8f-dfdd-4cfb-9e78-71091fe4c055', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-vda', 'timestamp': '2025-11-29T07:18:48.097623', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e6e8be-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6191.009474622, 'message_signature': 'ed9806c4e5687f0de4bb02c7b9f56f2efdcc695c81171c7aea1e9491b149b2ca'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1-sda', 'timestamp': '2025-11-29T07:18:48.097623', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'instance-00000063', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e6f3f4-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6191.009474622, 'message_signature': '50bd254103635b0aab82c53fca521dcd288455ee311a7fdd84b1f8469b50e154'}]}, 'timestamp': '2025-11-29 07:18:48.098223', '_unique_id': '7d4daadce1b14a8192798ab8b6ddd7dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.100 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9b9ab88-27fb-49b2-b9d6-1360d1204c14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.100188', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5e74f5c-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': '2c86f9eb1eb2f37c4f59a6fc55732eb6c25862a739851ea472d6dae886da9de1'}]}, 'timestamp': '2025-11-29 07:18:48.100567', '_unique_id': '3d2cea486e524c6d9c9f0932fea98ddd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.102 12 DEBUG ceilometer.compute.pollsters [-] 531c3d01-115b-479d-bbdc-11e38bc8b0b1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '328fd5d9-b946-4176-9606-e85f555995fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_name': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_name': None, 'resource_id': 'instance-00000063-531c3d01-115b-479d-bbdc-11e38bc8b0b1-tapceb086b6-5a', 'timestamp': '2025-11-29T07:18:48.102573', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1942369880', 'name': 'tapceb086b6-5a', 'instance_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'instance_type': 'm1.nano', 'host': '619c2ffc4fd201633529869bab3ba4b68880fe20d9fd36ef8048a871', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:f8:4b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapceb086b6-5a'}, 'message_id': 'a5e7a998-ccf3-11f0-8a11-fa163ea726b4', 'monotonic_time': 6190.951908549, 'message_signature': '03013c055d9e0ff755101d61e33eab54a754b90d30ccc66bd775d5d5338702a1'}]}, 'timestamp': '2025-11-29 07:18:48.102871', '_unique_id': '83cea201bf704c71ad57a3ba2fdb7b30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.299 187156 DEBUG nova.network.neutron [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updating instance_info_cache with network_info: [{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.554 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.555 187156 DEBUG nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Instance network_info: |[{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.556 187156 DEBUG oslo_concurrency.lockutils [req-3aa9cf85-28a8-47b1-ba6f-a31907bbd494 req-9a3ef09a-568a-492f-8a58-159f38ac6365 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.556 187156 DEBUG nova.network.neutron [req-3aa9cf85-28a8-47b1-ba6f-a31907bbd494 req-9a3ef09a-568a-492f-8a58-159f38ac6365 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing network info cache for port c67ec97e-5456-48e8-9e98-b9075cc0b2aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.562 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Start _get_guest_xml network_info=[{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.568 187156 WARNING nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.575 187156 DEBUG nova.virt.libvirt.host [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.576 187156 DEBUG nova.virt.libvirt.host [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.579 187156 DEBUG nova.virt.libvirt.host [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.580 187156 DEBUG nova.virt.libvirt.host [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.581 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.581 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.581 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.582 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.582 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.582 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.582 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.583 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.583 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.583 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.583 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.584 187156 DEBUG nova.virt.hardware [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.587 187156 DEBUG nova.virt.libvirt.vif [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1725835497',display_name='tempest-tempest.common.compute-instance-1725835497',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1725835497',id=109,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-fdt4piuu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:18:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d67996f4-2f09-4188-bcf5-ae5a02b6d2d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.588 187156 DEBUG nova.network.os_vif_util [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.588 187156 DEBUG nova.network.os_vif_util [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:29:eb,bridge_name='br-int',has_traffic_filtering=True,id=c67ec97e-5456-48e8-9e98-b9075cc0b2aa,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc67ec97e-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:18:48 np0005539504 nova_compute[187152]: 2025-11-29 07:18:48.589 187156 DEBUG nova.objects.instance [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_devices' on Instance uuid d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.230 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <uuid>d67996f4-2f09-4188-bcf5-ae5a02b6d2d3</uuid>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <name>instance-0000006d</name>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <nova:name>tempest-tempest.common.compute-instance-1725835497</nova:name>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:18:48</nova:creationTime>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:        <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:        <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:        <nova:port uuid="c67ec97e-5456-48e8-9e98-b9075cc0b2aa">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <entry name="serial">d67996f4-2f09-4188-bcf5-ae5a02b6d2d3</entry>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <entry name="uuid">d67996f4-2f09-4188-bcf5-ae5a02b6d2d3</entry>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk.config"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:87:29:eb"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <target dev="tapc67ec97e-54"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/console.log" append="off"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:18:49 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:18:49 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:18:49 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:18:49 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.232 187156 DEBUG nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Preparing to wait for external event network-vif-plugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.233 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.233 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.233 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.235 187156 DEBUG nova.virt.libvirt.vif [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1725835497',display_name='tempest-tempest.common.compute-instance-1725835497',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1725835497',id=109,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-fdt4piuu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:18:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d67996f4-2f09-4188-bcf5-ae5a02b6d2d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.235 187156 DEBUG nova.network.os_vif_util [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.236 187156 DEBUG nova.network.os_vif_util [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:29:eb,bridge_name='br-int',has_traffic_filtering=True,id=c67ec97e-5456-48e8-9e98-b9075cc0b2aa,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc67ec97e-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.237 187156 DEBUG os_vif [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:29:eb,bridge_name='br-int',has_traffic_filtering=True,id=c67ec97e-5456-48e8-9e98-b9075cc0b2aa,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc67ec97e-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.239 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.239 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.240 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.243 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.243 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc67ec97e-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.244 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc67ec97e-54, col_values=(('external_ids', {'iface-id': 'c67ec97e-5456-48e8-9e98-b9075cc0b2aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:29:eb', 'vm-uuid': 'd67996f4-2f09-4188-bcf5-ae5a02b6d2d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.245 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:49 np0005539504 NetworkManager[55210]: <info>  [1764400729.2467] manager: (tapc67ec97e-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.248 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.252 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.254 187156 INFO os_vif [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:29:eb,bridge_name='br-int',has_traffic_filtering=True,id=c67ec97e-5456-48e8-9e98-b9075cc0b2aa,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc67ec97e-54')#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.734 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.735 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.735 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:87:29:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:18:49 np0005539504 nova_compute[187152]: 2025-11-29 07:18:49.736 187156 INFO nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Using config drive#033[00m
Nov 29 02:18:50 np0005539504 nova_compute[187152]: 2025-11-29 07:18:50.256 187156 INFO nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Creating config drive at /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk.config#033[00m
Nov 29 02:18:50 np0005539504 nova_compute[187152]: 2025-11-29 07:18:50.268 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwaf2johy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:50 np0005539504 nova_compute[187152]: 2025-11-29 07:18:50.404 187156 DEBUG oslo_concurrency.processutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwaf2johy" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:50 np0005539504 kernel: tapc67ec97e-54: entered promiscuous mode
Nov 29 02:18:50 np0005539504 NetworkManager[55210]: <info>  [1764400730.4778] manager: (tapc67ec97e-54): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Nov 29 02:18:50 np0005539504 nova_compute[187152]: 2025-11-29 07:18:50.477 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:18:50Z|00378|binding|INFO|Claiming lport c67ec97e-5456-48e8-9e98-b9075cc0b2aa for this chassis.
Nov 29 02:18:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:18:50Z|00379|binding|INFO|c67ec97e-5456-48e8-9e98-b9075cc0b2aa: Claiming fa:16:3e:87:29:eb 10.100.0.5
Nov 29 02:18:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:18:50Z|00380|binding|INFO|Setting lport c67ec97e-5456-48e8-9e98-b9075cc0b2aa ovn-installed in OVS
Nov 29 02:18:50 np0005539504 nova_compute[187152]: 2025-11-29 07:18:50.491 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:50 np0005539504 nova_compute[187152]: 2025-11-29 07:18:50.495 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:50 np0005539504 systemd-udevd[232679]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:18:50 np0005539504 systemd-machined[153423]: New machine qemu-52-instance-0000006d.
Nov 29 02:18:50 np0005539504 NetworkManager[55210]: <info>  [1764400730.5383] device (tapc67ec97e-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:18:50 np0005539504 NetworkManager[55210]: <info>  [1764400730.5395] device (tapc67ec97e-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:18:50 np0005539504 systemd[1]: Started Virtual Machine qemu-52-instance-0000006d.
Nov 29 02:18:50 np0005539504 nova_compute[187152]: 2025-11-29 07:18:50.685 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:18:50Z|00381|binding|INFO|Setting lport c67ec97e-5456-48e8-9e98-b9075cc0b2aa up in Southbound
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.913 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:29:eb 10.100.0.5'], port_security=['fa:16:3e:87:29:eb 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd67996f4-2f09-4188-bcf5-ae5a02b6d2d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a81715ba-eace-471d-9f71-9964fcbf6d85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=c67ec97e-5456-48e8-9e98-b9075cc0b2aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.915 104164 INFO neutron.agent.ovn.metadata.agent [-] Port c67ec97e-5456-48e8-9e98-b9075cc0b2aa in datapath 90812230-35cb-4e21-b16b-75b900100d8b bound to our chassis#033[00m
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.918 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.930 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a420ce-f472-4c4b-b67a-9993c20cc658]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.931 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90812230-31 in ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.934 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90812230-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.934 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[be582586-4b3e-4b9a-ac79-898301dca83b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.935 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6bc0c6-0d73-4c85-85d9-08934092545e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.955 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[888ae0e3-3d78-40f6-ba96-b602c0191b64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:50.980 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[31dc4547-05dd-48f4-9055-82a5cd01e113]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:50 np0005539504 nova_compute[187152]: 2025-11-29 07:18:50.985 187156 DEBUG nova.network.neutron [req-3aa9cf85-28a8-47b1-ba6f-a31907bbd494 req-9a3ef09a-568a-492f-8a58-159f38ac6365 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updated VIF entry in instance network info cache for port c67ec97e-5456-48e8-9e98-b9075cc0b2aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:18:50 np0005539504 nova_compute[187152]: 2025-11-29 07:18:50.986 187156 DEBUG nova.network.neutron [req-3aa9cf85-28a8-47b1-ba6f-a31907bbd494 req-9a3ef09a-568a-492f-8a58-159f38ac6365 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updating instance_info_cache with network_info: [{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.010 187156 DEBUG oslo_concurrency.lockutils [req-3aa9cf85-28a8-47b1-ba6f-a31907bbd494 req-9a3ef09a-568a-492f-8a58-159f38ac6365 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.018 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[baade677-a729-44a9-83f0-c88acd9e1f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 NetworkManager[55210]: <info>  [1764400731.0251] manager: (tap90812230-30): new Veth device (/org/freedesktop/NetworkManager/Devices/178)
Nov 29 02:18:51 np0005539504 systemd-udevd[232682]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.026 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[48768944-0a6b-4523-acbc-e28bdf202ada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.059 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[fb80079d-4cac-4fe2-9b3d-cf1ba5f91108]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.063 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[da5c9b5b-8136-4aa7-a699-5c96f1a5ffdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 NetworkManager[55210]: <info>  [1764400731.0914] device (tap90812230-30): carrier: link connected
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.096 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[22cf06ad-7892-4dc3-b3c3-765fe93e8d52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.116 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[17dca6b2-e6b3-467c-b5cc-0ad40eacfcb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 619396, 'reachable_time': 23493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232713, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.133 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[519eec25-8838-41f1-8453-175388149256]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:5f07'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 619396, 'tstamp': 619396}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232714, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.151 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f53fe41a-d387-4deb-96d6-ddeb373a40cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 619396, 'reachable_time': 23493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232715, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.186 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f9fa8833-a620-4926-84fe-2f599e8e14a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.263 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[09fa6100-c6d5-49d7-8601-c83134056a30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.264 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.265 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.265 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:18:51 np0005539504 NetworkManager[55210]: <info>  [1764400731.2688] manager: (tap90812230-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.268 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:51 np0005539504 kernel: tap90812230-30: entered promiscuous mode
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.270 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.272 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:18:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:18:51Z|00382|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.290 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.291 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.292 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c8cc3a30-d28f-44a9-a627-e12c8a3e9014]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.293 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:18:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:18:51.293 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'env', 'PROCESS_TAG=haproxy-90812230-35cb-4e21-b16b-75b900100d8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90812230-35cb-4e21-b16b-75b900100d8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.386 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400731.3858693, d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.386 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] VM Started (Lifecycle Event)#033[00m
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.460 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.465 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400731.3880496, d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.466 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.489 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.493 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:18:51 np0005539504 nova_compute[187152]: 2025-11-29 07:18:51.525 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:18:51 np0005539504 podman[232754]: 2025-11-29 07:18:51.66452194 +0000 UTC m=+0.022672122 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.587 187156 DEBUG nova.compute.manager [req-d7b1170f-4ad4-4efd-a2f9-53cd2bd7d0e7 req-2fff62f0-dea5-48c7-bb82-0e806afe2dcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-plugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.589 187156 DEBUG oslo_concurrency.lockutils [req-d7b1170f-4ad4-4efd-a2f9-53cd2bd7d0e7 req-2fff62f0-dea5-48c7-bb82-0e806afe2dcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:53 np0005539504 podman[232754]: 2025-11-29 07:18:53.59049777 +0000 UTC m=+1.948647922 container create 8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.589 187156 DEBUG oslo_concurrency.lockutils [req-d7b1170f-4ad4-4efd-a2f9-53cd2bd7d0e7 req-2fff62f0-dea5-48c7-bb82-0e806afe2dcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.590 187156 DEBUG oslo_concurrency.lockutils [req-d7b1170f-4ad4-4efd-a2f9-53cd2bd7d0e7 req-2fff62f0-dea5-48c7-bb82-0e806afe2dcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.590 187156 DEBUG nova.compute.manager [req-d7b1170f-4ad4-4efd-a2f9-53cd2bd7d0e7 req-2fff62f0-dea5-48c7-bb82-0e806afe2dcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Processing event network-vif-plugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.592 187156 DEBUG nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.598 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400733.5978208, d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.599 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.603 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.610 187156 INFO nova.virt.libvirt.driver [-] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Instance spawned successfully.#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.612 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:18:53 np0005539504 podman[232767]: 2025-11-29 07:18:53.625598619 +0000 UTC m=+0.957151756 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.633 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.643 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:18:53 np0005539504 systemd[1]: Started libpod-conmon-8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827.scope.
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.651 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.652 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.653 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.653 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.654 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.654 187156 DEBUG nova.virt.libvirt.driver [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:18:53 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:18:53 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdffdaca01b290de25ba2238b9e7493877e1bcf6302a7467d4680ab84a49678f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:18:53 np0005539504 nova_compute[187152]: 2025-11-29 07:18:53.708 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:18:53 np0005539504 podman[232754]: 2025-11-29 07:18:53.913878649 +0000 UTC m=+2.272028821 container init 8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:18:53 np0005539504 podman[232754]: 2025-11-29 07:18:53.922452996 +0000 UTC m=+2.280603148 container start 8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 02:18:53 np0005539504 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232790]: [NOTICE]   (232795) : New worker (232797) forked
Nov 29 02:18:53 np0005539504 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232790]: [NOTICE]   (232795) : Loading success.
Nov 29 02:18:54 np0005539504 nova_compute[187152]: 2025-11-29 07:18:54.057 187156 INFO nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Took 33.07 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:18:54 np0005539504 nova_compute[187152]: 2025-11-29 07:18:54.058 187156 DEBUG nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:18:54 np0005539504 nova_compute[187152]: 2025-11-29 07:18:54.248 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:54 np0005539504 nova_compute[187152]: 2025-11-29 07:18:54.807 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:55 np0005539504 nova_compute[187152]: 2025-11-29 07:18:55.779 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:56 np0005539504 nova_compute[187152]: 2025-11-29 07:18:56.030 187156 DEBUG nova.compute.manager [req-b5f30d0b-475b-481f-b226-c06b7a1cc74b req-e26cbc97-3202-40f4-a692-f0770f346775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-plugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:18:56 np0005539504 nova_compute[187152]: 2025-11-29 07:18:56.030 187156 DEBUG oslo_concurrency.lockutils [req-b5f30d0b-475b-481f-b226-c06b7a1cc74b req-e26cbc97-3202-40f4-a692-f0770f346775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:56 np0005539504 nova_compute[187152]: 2025-11-29 07:18:56.030 187156 DEBUG oslo_concurrency.lockutils [req-b5f30d0b-475b-481f-b226-c06b7a1cc74b req-e26cbc97-3202-40f4-a692-f0770f346775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:56 np0005539504 nova_compute[187152]: 2025-11-29 07:18:56.031 187156 DEBUG oslo_concurrency.lockutils [req-b5f30d0b-475b-481f-b226-c06b7a1cc74b req-e26cbc97-3202-40f4-a692-f0770f346775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:56 np0005539504 nova_compute[187152]: 2025-11-29 07:18:56.031 187156 DEBUG nova.compute.manager [req-b5f30d0b-475b-481f-b226-c06b7a1cc74b req-e26cbc97-3202-40f4-a692-f0770f346775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] No waiting events found dispatching network-vif-plugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:18:56 np0005539504 nova_compute[187152]: 2025-11-29 07:18:56.031 187156 WARNING nova.compute.manager [req-b5f30d0b-475b-481f-b226-c06b7a1cc74b req-e26cbc97-3202-40f4-a692-f0770f346775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received unexpected event network-vif-plugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa for instance with vm_state active and task_state None.#033[00m
Nov 29 02:18:57 np0005539504 nova_compute[187152]: 2025-11-29 07:18:57.635 187156 INFO nova.compute.manager [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Took 38.50 seconds to build instance.#033[00m
Nov 29 02:18:58 np0005539504 nova_compute[187152]: 2025-11-29 07:18:58.463 187156 DEBUG oslo_concurrency.lockutils [None req-68e3bab1-675c-44ca-9703-67a011c39306 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 42.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:59 np0005539504 nova_compute[187152]: 2025-11-29 07:18:59.253 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:00 np0005539504 nova_compute[187152]: 2025-11-29 07:19:00.782 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:01 np0005539504 nova_compute[187152]: 2025-11-29 07:19:01.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:02 np0005539504 nova_compute[187152]: 2025-11-29 07:19:02.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:03 np0005539504 nova_compute[187152]: 2025-11-29 07:19:03.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:04 np0005539504 nova_compute[187152]: 2025-11-29 07:19:04.258 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:04 np0005539504 nova_compute[187152]: 2025-11-29 07:19:04.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:04 np0005539504 nova_compute[187152]: 2025-11-29 07:19:04.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:19:05 np0005539504 nova_compute[187152]: 2025-11-29 07:19:05.787 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:05 np0005539504 nova_compute[187152]: 2025-11-29 07:19:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:06 np0005539504 podman[232827]: 2025-11-29 07:19:06.722741126 +0000 UTC m=+0.059037304 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:19:06 np0005539504 podman[232826]: 2025-11-29 07:19:06.726834654 +0000 UTC m=+0.064413956 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Nov 29 02:19:06 np0005539504 podman[232825]: 2025-11-29 07:19:06.740178508 +0000 UTC m=+0.072760587 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:19:06 np0005539504 nova_compute[187152]: 2025-11-29 07:19:06.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:19:07Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:87:29:eb 10.100.0.5
Nov 29 02:19:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:19:07Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:87:29:eb 10.100.0.5
Nov 29 02:19:07 np0005539504 nova_compute[187152]: 2025-11-29 07:19:07.384 187156 DEBUG nova.compute.manager [req-e1e282a3-a73f-4c9e-8797-adf589a58a84 req-d522b14f-9aa5-495f-8c1d-edaa73dfd238 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-changed-c67ec97e-5456-48e8-9e98-b9075cc0b2aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:07 np0005539504 nova_compute[187152]: 2025-11-29 07:19:07.385 187156 DEBUG nova.compute.manager [req-e1e282a3-a73f-4c9e-8797-adf589a58a84 req-d522b14f-9aa5-495f-8c1d-edaa73dfd238 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing instance network info cache due to event network-changed-c67ec97e-5456-48e8-9e98-b9075cc0b2aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:19:07 np0005539504 nova_compute[187152]: 2025-11-29 07:19:07.385 187156 DEBUG oslo_concurrency.lockutils [req-e1e282a3-a73f-4c9e-8797-adf589a58a84 req-d522b14f-9aa5-495f-8c1d-edaa73dfd238 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:07 np0005539504 nova_compute[187152]: 2025-11-29 07:19:07.386 187156 DEBUG oslo_concurrency.lockutils [req-e1e282a3-a73f-4c9e-8797-adf589a58a84 req-d522b14f-9aa5-495f-8c1d-edaa73dfd238 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:07 np0005539504 nova_compute[187152]: 2025-11-29 07:19:07.386 187156 DEBUG nova.network.neutron [req-e1e282a3-a73f-4c9e-8797-adf589a58a84 req-d522b14f-9aa5-495f-8c1d-edaa73dfd238 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing network info cache for port c67ec97e-5456-48e8-9e98-b9075cc0b2aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:19:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:07.559 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:19:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:07.560 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:19:07 np0005539504 nova_compute[187152]: 2025-11-29 07:19:07.561 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:09 np0005539504 nova_compute[187152]: 2025-11-29 07:19:09.267 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:09.563 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:09 np0005539504 nova_compute[187152]: 2025-11-29 07:19:09.568 187156 DEBUG nova.network.neutron [req-e1e282a3-a73f-4c9e-8797-adf589a58a84 req-d522b14f-9aa5-495f-8c1d-edaa73dfd238 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updated VIF entry in instance network info cache for port c67ec97e-5456-48e8-9e98-b9075cc0b2aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:19:09 np0005539504 nova_compute[187152]: 2025-11-29 07:19:09.569 187156 DEBUG nova.network.neutron [req-e1e282a3-a73f-4c9e-8797-adf589a58a84 req-d522b14f-9aa5-495f-8c1d-edaa73dfd238 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updating instance_info_cache with network_info: [{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:09 np0005539504 nova_compute[187152]: 2025-11-29 07:19:09.614 187156 DEBUG oslo_concurrency.lockutils [req-e1e282a3-a73f-4c9e-8797-adf589a58a84 req-d522b14f-9aa5-495f-8c1d-edaa73dfd238 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:09 np0005539504 podman[232888]: 2025-11-29 07:19:09.713358396 +0000 UTC m=+0.056131327 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:19:09 np0005539504 podman[232889]: 2025-11-29 07:19:09.743480074 +0000 UTC m=+0.082832333 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 02:19:10 np0005539504 nova_compute[187152]: 2025-11-29 07:19:10.832 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:10 np0005539504 nova_compute[187152]: 2025-11-29 07:19:10.841 187156 DEBUG nova.compute.manager [req-8f01aa7e-91b1-4b68-aa07-4766cda60bff req-c2715da8-0b50-451c-8ff3-944830ab258b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-changed-c67ec97e-5456-48e8-9e98-b9075cc0b2aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:10 np0005539504 nova_compute[187152]: 2025-11-29 07:19:10.841 187156 DEBUG nova.compute.manager [req-8f01aa7e-91b1-4b68-aa07-4766cda60bff req-c2715da8-0b50-451c-8ff3-944830ab258b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing instance network info cache due to event network-changed-c67ec97e-5456-48e8-9e98-b9075cc0b2aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:19:10 np0005539504 nova_compute[187152]: 2025-11-29 07:19:10.842 187156 DEBUG oslo_concurrency.lockutils [req-8f01aa7e-91b1-4b68-aa07-4766cda60bff req-c2715da8-0b50-451c-8ff3-944830ab258b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:10 np0005539504 nova_compute[187152]: 2025-11-29 07:19:10.842 187156 DEBUG oslo_concurrency.lockutils [req-8f01aa7e-91b1-4b68-aa07-4766cda60bff req-c2715da8-0b50-451c-8ff3-944830ab258b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:10 np0005539504 nova_compute[187152]: 2025-11-29 07:19:10.842 187156 DEBUG nova.network.neutron [req-8f01aa7e-91b1-4b68-aa07-4766cda60bff req-c2715da8-0b50-451c-8ff3-944830ab258b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing network info cache for port c67ec97e-5456-48e8-9e98-b9075cc0b2aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:19:10 np0005539504 nova_compute[187152]: 2025-11-29 07:19:10.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:10 np0005539504 nova_compute[187152]: 2025-11-29 07:19:10.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:19:10 np0005539504 nova_compute[187152]: 2025-11-29 07:19:10.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:19:11 np0005539504 nova_compute[187152]: 2025-11-29 07:19:11.162 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:11 np0005539504 nova_compute[187152]: 2025-11-29 07:19:11.162 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:11 np0005539504 nova_compute[187152]: 2025-11-29 07:19:11.163 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:19:11 np0005539504 nova_compute[187152]: 2025-11-29 07:19:11.163 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 531c3d01-115b-479d-bbdc-11e38bc8b0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:12 np0005539504 nova_compute[187152]: 2025-11-29 07:19:12.618 187156 DEBUG nova.network.neutron [req-8f01aa7e-91b1-4b68-aa07-4766cda60bff req-c2715da8-0b50-451c-8ff3-944830ab258b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updated VIF entry in instance network info cache for port c67ec97e-5456-48e8-9e98-b9075cc0b2aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:19:12 np0005539504 nova_compute[187152]: 2025-11-29 07:19:12.619 187156 DEBUG nova.network.neutron [req-8f01aa7e-91b1-4b68-aa07-4766cda60bff req-c2715da8-0b50-451c-8ff3-944830ab258b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updating instance_info_cache with network_info: [{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:12 np0005539504 nova_compute[187152]: 2025-11-29 07:19:12.689 187156 DEBUG oslo_concurrency.lockutils [req-8f01aa7e-91b1-4b68-aa07-4766cda60bff req-c2715da8-0b50-451c-8ff3-944830ab258b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.422 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updating instance_info_cache with network_info: [{"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.449 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-531c3d01-115b-479d-bbdc-11e38bc8b0b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.450 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.450 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.479 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.479 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.480 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.480 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.564 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.621 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.622 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.677 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.685 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.749 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.750 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:13 np0005539504 nova_compute[187152]: 2025-11-29 07:19:13.820 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.054 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.056 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5383MB free_disk=73.13460540771484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.056 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.056 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.199 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 531c3d01-115b-479d-bbdc-11e38bc8b0b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.200 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.200 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.200 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.216 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.230 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.231 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.247 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.274 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.280 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.348 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.372 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.410 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:19:14 np0005539504 nova_compute[187152]: 2025-11-29 07:19:14.410 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:19:15Z|00383|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:19:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:19:15Z|00384|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:19:15 np0005539504 nova_compute[187152]: 2025-11-29 07:19:15.561 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:15 np0005539504 nova_compute[187152]: 2025-11-29 07:19:15.835 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:18 np0005539504 podman[232948]: 2025-11-29 07:19:18.761859789 +0000 UTC m=+0.097065391 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:19:19 np0005539504 nova_compute[187152]: 2025-11-29 07:19:19.283 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:20 np0005539504 nova_compute[187152]: 2025-11-29 07:19:20.838 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:22.957 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:22.958 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:22.958 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:24 np0005539504 nova_compute[187152]: 2025-11-29 07:19:24.289 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:24 np0005539504 podman[232966]: 2025-11-29 07:19:24.719861074 +0000 UTC m=+0.062695370 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:19:25 np0005539504 nova_compute[187152]: 2025-11-29 07:19:25.840 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:29 np0005539504 nova_compute[187152]: 2025-11-29 07:19:29.294 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:30 np0005539504 nova_compute[187152]: 2025-11-29 07:19:30.156 187156 DEBUG nova.compute.manager [req-5c6257b2-1d0b-4543-b933-67d54c8b2fcc req-f82f17d6-4d0b-496e-b1a2-24cffce59e02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-changed-c67ec97e-5456-48e8-9e98-b9075cc0b2aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:30 np0005539504 nova_compute[187152]: 2025-11-29 07:19:30.157 187156 DEBUG nova.compute.manager [req-5c6257b2-1d0b-4543-b933-67d54c8b2fcc req-f82f17d6-4d0b-496e-b1a2-24cffce59e02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing instance network info cache due to event network-changed-c67ec97e-5456-48e8-9e98-b9075cc0b2aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:19:30 np0005539504 nova_compute[187152]: 2025-11-29 07:19:30.158 187156 DEBUG oslo_concurrency.lockutils [req-5c6257b2-1d0b-4543-b933-67d54c8b2fcc req-f82f17d6-4d0b-496e-b1a2-24cffce59e02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:30 np0005539504 nova_compute[187152]: 2025-11-29 07:19:30.158 187156 DEBUG oslo_concurrency.lockutils [req-5c6257b2-1d0b-4543-b933-67d54c8b2fcc req-f82f17d6-4d0b-496e-b1a2-24cffce59e02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:30 np0005539504 nova_compute[187152]: 2025-11-29 07:19:30.158 187156 DEBUG nova.network.neutron [req-5c6257b2-1d0b-4543-b933-67d54c8b2fcc req-f82f17d6-4d0b-496e-b1a2-24cffce59e02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing network info cache for port c67ec97e-5456-48e8-9e98-b9075cc0b2aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:19:30 np0005539504 nova_compute[187152]: 2025-11-29 07:19:30.165 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:30 np0005539504 nova_compute[187152]: 2025-11-29 07:19:30.843 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:31 np0005539504 nova_compute[187152]: 2025-11-29 07:19:31.808 187156 DEBUG oslo_concurrency.lockutils [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:31 np0005539504 nova_compute[187152]: 2025-11-29 07:19:31.808 187156 DEBUG oslo_concurrency.lockutils [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:31 np0005539504 nova_compute[187152]: 2025-11-29 07:19:31.809 187156 DEBUG oslo_concurrency.lockutils [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:31 np0005539504 nova_compute[187152]: 2025-11-29 07:19:31.809 187156 DEBUG oslo_concurrency.lockutils [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:31 np0005539504 nova_compute[187152]: 2025-11-29 07:19:31.810 187156 DEBUG oslo_concurrency.lockutils [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:31 np0005539504 nova_compute[187152]: 2025-11-29 07:19:31.827 187156 INFO nova.compute.manager [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Terminating instance#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.023 187156 DEBUG nova.compute.manager [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:19:32 np0005539504 kernel: tapceb086b6-5a (unregistering): left promiscuous mode
Nov 29 02:19:32 np0005539504 NetworkManager[55210]: <info>  [1764400772.0704] device (tapceb086b6-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:19:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:19:32Z|00385|binding|INFO|Releasing lport ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 from this chassis (sb_readonly=0)
Nov 29 02:19:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:19:32Z|00386|binding|INFO|Setting lport ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 down in Southbound
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:19:32Z|00387|binding|INFO|Removing iface tapceb086b6-5a ovn-installed in OVS
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.079 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:32.091 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:f8:4b 10.100.0.3'], port_security=['fa:16:3e:19:f8:4b 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '531c3d01-115b-479d-bbdc-11e38bc8b0b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8547e4c2-e200-4173-9eba-476619f06150', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.092 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:32.093 104164 INFO neutron.agent.ovn.metadata.agent [-] Port ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 unbound from our chassis#033[00m
Nov 29 02:19:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:32.094 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df7cfc35-3f76-45b2-b70c-e4525d38f410, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:19:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:32.096 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f125e534-904b-4df0-afc4-cd9fa28565ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:32.097 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 namespace which is not needed anymore#033[00m
Nov 29 02:19:32 np0005539504 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000063.scope: Deactivated successfully.
Nov 29 02:19:32 np0005539504 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000063.scope: Consumed 24.965s CPU time.
Nov 29 02:19:32 np0005539504 systemd-machined[153423]: Machine qemu-50-instance-00000063 terminated.
Nov 29 02:19:32 np0005539504 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230013]: [NOTICE]   (230017) : haproxy version is 2.8.14-c23fe91
Nov 29 02:19:32 np0005539504 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230013]: [NOTICE]   (230017) : path to executable is /usr/sbin/haproxy
Nov 29 02:19:32 np0005539504 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230013]: [WARNING]  (230017) : Exiting Master process...
Nov 29 02:19:32 np0005539504 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230013]: [WARNING]  (230017) : Exiting Master process...
Nov 29 02:19:32 np0005539504 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230013]: [ALERT]    (230017) : Current worker (230019) exited with code 143 (Terminated)
Nov 29 02:19:32 np0005539504 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230013]: [WARNING]  (230017) : All workers exited. Exiting... (0)
Nov 29 02:19:32 np0005539504 systemd[1]: libpod-6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41.scope: Deactivated successfully.
Nov 29 02:19:32 np0005539504 podman[233011]: 2025-11-29 07:19:32.27513539 +0000 UTC m=+0.076996409 container died 6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.292 187156 INFO nova.virt.libvirt.driver [-] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Instance destroyed successfully.#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.293 187156 DEBUG nova.objects.instance [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'resources' on Instance uuid 531c3d01-115b-479d-bbdc-11e38bc8b0b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.309 187156 DEBUG nova.virt.libvirt.vif [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:14:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1942369880',display_name='tempest-ServerActionsTestOtherB-server-1942369880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1942369880',id=99,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-1fd30l7s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:55Z,user_data=None,user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=531c3d01-115b-479d-bbdc-11e38bc8b0b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.310 187156 DEBUG nova.network.os_vif_util [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "address": "fa:16:3e:19:f8:4b", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceb086b6-5a", "ovs_interfaceid": "ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.311 187156 DEBUG nova.network.os_vif_util [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:19:f8:4b,bridge_name='br-int',has_traffic_filtering=True,id=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceb086b6-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.313 187156 DEBUG os_vif [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:f8:4b,bridge_name='br-int',has_traffic_filtering=True,id=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceb086b6-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.316 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.316 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapceb086b6-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.318 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.321 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.326 187156 INFO os_vif [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:f8:4b,bridge_name='br-int',has_traffic_filtering=True,id=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceb086b6-5a')#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.327 187156 INFO nova.virt.libvirt.driver [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Deleting instance files /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1_del#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.328 187156 INFO nova.virt.libvirt.driver [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Deletion of /var/lib/nova/instances/531c3d01-115b-479d-bbdc-11e38bc8b0b1_del complete#033[00m
Nov 29 02:19:32 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41-userdata-shm.mount: Deactivated successfully.
Nov 29 02:19:32 np0005539504 systemd[1]: var-lib-containers-storage-overlay-6694a130c68b4b21cb579e9875e88879f81cc789d53f83d70260cbdc6cce4cca-merged.mount: Deactivated successfully.
Nov 29 02:19:32 np0005539504 podman[233011]: 2025-11-29 07:19:32.433304607 +0000 UTC m=+0.235165616 container cleanup 6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:19:32 np0005539504 systemd[1]: libpod-conmon-6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41.scope: Deactivated successfully.
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.633 187156 INFO nova.compute.manager [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Took 0.61 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.634 187156 DEBUG oslo.service.loopingcall [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.634 187156 DEBUG nova.compute.manager [-] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.634 187156 DEBUG nova.network.neutron [-] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.650 187156 DEBUG nova.compute.manager [req-d1c2f991-cf94-4bd7-8733-471f0e7db74d req-58ab1f30-ed12-45ed-849f-943cb68769fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Received event network-vif-unplugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.650 187156 DEBUG oslo_concurrency.lockutils [req-d1c2f991-cf94-4bd7-8733-471f0e7db74d req-58ab1f30-ed12-45ed-849f-943cb68769fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.651 187156 DEBUG oslo_concurrency.lockutils [req-d1c2f991-cf94-4bd7-8733-471f0e7db74d req-58ab1f30-ed12-45ed-849f-943cb68769fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.651 187156 DEBUG oslo_concurrency.lockutils [req-d1c2f991-cf94-4bd7-8733-471f0e7db74d req-58ab1f30-ed12-45ed-849f-943cb68769fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.651 187156 DEBUG nova.compute.manager [req-d1c2f991-cf94-4bd7-8733-471f0e7db74d req-58ab1f30-ed12-45ed-849f-943cb68769fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] No waiting events found dispatching network-vif-unplugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.651 187156 DEBUG nova.compute.manager [req-d1c2f991-cf94-4bd7-8733-471f0e7db74d req-58ab1f30-ed12-45ed-849f-943cb68769fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Received event network-vif-unplugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:19:32 np0005539504 podman[233058]: 2025-11-29 07:19:32.92304105 +0000 UTC m=+0.467102265 container remove 6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:19:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:32.928 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f6493f32-b5ad-4ae9-ab83-3076613a420a]: (4, ('Sat Nov 29 07:19:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 (6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41)\n6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41\nSat Nov 29 07:19:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 (6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41)\n6af1ca13052103f8ae4a7b12ecd7fc16d2e0b9b1bb22be8d40c6c89eade1aa41\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:32.930 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6573ebd4-9440-4af2-86c8-3400513b7112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:32.931 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:32 np0005539504 kernel: tapdf7cfc35-30: left promiscuous mode
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.979 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:32 np0005539504 nova_compute[187152]: 2025-11-29 07:19:32.991 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:32.994 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[701c4b90-7532-4fee-963e-72e20dd97903]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:33.009 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[963dc6ea-b3e2-443a-9922-52570ed15901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:33.010 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[689f9c45-3dd1-4f02-8901-4457d4eae2a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:33.025 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[44d8468f-b86f-4d1b-9838-4557b5613c59]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 583819, 'reachable_time': 21167, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233073, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:33.028 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:19:33 np0005539504 systemd[1]: run-netns-ovnmeta\x2ddf7cfc35\x2d3f76\x2d45b2\x2db70c\x2de4525d38f410.mount: Deactivated successfully.
Nov 29 02:19:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:19:33.028 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb007a2-adf0-4269-a97a-29004de5f197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:33 np0005539504 nova_compute[187152]: 2025-11-29 07:19:33.433 187156 DEBUG nova.network.neutron [req-5c6257b2-1d0b-4543-b933-67d54c8b2fcc req-f82f17d6-4d0b-496e-b1a2-24cffce59e02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updated VIF entry in instance network info cache for port c67ec97e-5456-48e8-9e98-b9075cc0b2aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:19:33 np0005539504 nova_compute[187152]: 2025-11-29 07:19:33.434 187156 DEBUG nova.network.neutron [req-5c6257b2-1d0b-4543-b933-67d54c8b2fcc req-f82f17d6-4d0b-496e-b1a2-24cffce59e02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updating instance_info_cache with network_info: [{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:33 np0005539504 nova_compute[187152]: 2025-11-29 07:19:33.836 187156 DEBUG oslo_concurrency.lockutils [req-5c6257b2-1d0b-4543-b933-67d54c8b2fcc req-f82f17d6-4d0b-496e-b1a2-24cffce59e02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:35 np0005539504 nova_compute[187152]: 2025-11-29 07:19:35.845 187156 DEBUG nova.compute.manager [req-74d23064-d5fb-4e95-a6b4-58dcca16e814 req-06b8448b-133f-4477-a947-9228a1c5454a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Received event network-vif-plugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:35 np0005539504 nova_compute[187152]: 2025-11-29 07:19:35.845 187156 DEBUG oslo_concurrency.lockutils [req-74d23064-d5fb-4e95-a6b4-58dcca16e814 req-06b8448b-133f-4477-a947-9228a1c5454a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:35 np0005539504 nova_compute[187152]: 2025-11-29 07:19:35.846 187156 DEBUG oslo_concurrency.lockutils [req-74d23064-d5fb-4e95-a6b4-58dcca16e814 req-06b8448b-133f-4477-a947-9228a1c5454a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:35 np0005539504 nova_compute[187152]: 2025-11-29 07:19:35.846 187156 DEBUG oslo_concurrency.lockutils [req-74d23064-d5fb-4e95-a6b4-58dcca16e814 req-06b8448b-133f-4477-a947-9228a1c5454a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:35 np0005539504 nova_compute[187152]: 2025-11-29 07:19:35.846 187156 DEBUG nova.compute.manager [req-74d23064-d5fb-4e95-a6b4-58dcca16e814 req-06b8448b-133f-4477-a947-9228a1c5454a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] No waiting events found dispatching network-vif-plugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:35 np0005539504 nova_compute[187152]: 2025-11-29 07:19:35.846 187156 WARNING nova.compute.manager [req-74d23064-d5fb-4e95-a6b4-58dcca16e814 req-06b8448b-133f-4477-a947-9228a1c5454a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Received unexpected event network-vif-plugged-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:19:35 np0005539504 nova_compute[187152]: 2025-11-29 07:19:35.847 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.146 187156 DEBUG nova.network.neutron [-] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.153 187156 DEBUG nova.compute.manager [req-ac8226e9-63f0-45cf-b2f1-bcf729dd1053 req-f608be3e-b103-4c31-b581-016e91012d2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Received event network-vif-deleted-ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.153 187156 INFO nova.compute.manager [req-ac8226e9-63f0-45cf-b2f1-bcf729dd1053 req-f608be3e-b103-4c31-b581-016e91012d2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Neutron deleted interface ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.154 187156 DEBUG nova.network.neutron [req-ac8226e9-63f0-45cf-b2f1-bcf729dd1053 req-f608be3e-b103-4c31-b581-016e91012d2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.394 187156 INFO nova.compute.manager [-] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Took 3.76 seconds to deallocate network for instance.#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.404 187156 DEBUG oslo_concurrency.lockutils [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.405 187156 DEBUG oslo_concurrency.lockutils [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.405 187156 DEBUG nova.objects.instance [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.407 187156 DEBUG nova.compute.manager [req-ac8226e9-63f0-45cf-b2f1-bcf729dd1053 req-f608be3e-b103-4c31-b581-016e91012d2e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Detach interface failed, port_id=ceb086b6-5a5c-468a-a64c-5ae9cd0f32d3, reason: Instance 531c3d01-115b-479d-bbdc-11e38bc8b0b1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.687 187156 DEBUG oslo_concurrency.lockutils [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.689 187156 DEBUG oslo_concurrency.lockutils [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:36 np0005539504 nova_compute[187152]: 2025-11-29 07:19:36.787 187156 DEBUG nova.compute.provider_tree [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:19:37 np0005539504 nova_compute[187152]: 2025-11-29 07:19:37.056 187156 DEBUG nova.scheduler.client.report [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:19:37 np0005539504 nova_compute[187152]: 2025-11-29 07:19:37.148 187156 DEBUG oslo_concurrency.lockutils [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:37 np0005539504 nova_compute[187152]: 2025-11-29 07:19:37.189 187156 INFO nova.scheduler.client.report [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Deleted allocations for instance 531c3d01-115b-479d-bbdc-11e38bc8b0b1#033[00m
Nov 29 02:19:37 np0005539504 nova_compute[187152]: 2025-11-29 07:19:37.319 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:37 np0005539504 nova_compute[187152]: 2025-11-29 07:19:37.637 187156 DEBUG oslo_concurrency.lockutils [None req-8815214b-c581-4364-a746-ff7d7eb976ec ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "531c3d01-115b-479d-bbdc-11e38bc8b0b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:37 np0005539504 podman[233074]: 2025-11-29 07:19:37.732218058 +0000 UTC m=+0.061615912 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:19:37 np0005539504 podman[233076]: 2025-11-29 07:19:37.732235379 +0000 UTC m=+0.057467972 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:19:37 np0005539504 podman[233075]: 2025-11-29 07:19:37.737252411 +0000 UTC m=+0.066399378 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9)
Nov 29 02:19:38 np0005539504 nova_compute[187152]: 2025-11-29 07:19:38.134 187156 DEBUG nova.objects.instance [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_requests' on Instance uuid d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:38 np0005539504 nova_compute[187152]: 2025-11-29 07:19:38.231 187156 DEBUG nova.network.neutron [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:19:38 np0005539504 nova_compute[187152]: 2025-11-29 07:19:38.865 187156 DEBUG nova.policy [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:19:40 np0005539504 podman[233134]: 2025-11-29 07:19:40.768007605 +0000 UTC m=+0.106399217 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:19:40 np0005539504 podman[233135]: 2025-11-29 07:19:40.83733094 +0000 UTC m=+0.166689554 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 29 02:19:40 np0005539504 nova_compute[187152]: 2025-11-29 07:19:40.846 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:42 np0005539504 nova_compute[187152]: 2025-11-29 07:19:42.321 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:45 np0005539504 nova_compute[187152]: 2025-11-29 07:19:45.385 187156 DEBUG nova.network.neutron [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Successfully updated port: b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:19:45 np0005539504 nova_compute[187152]: 2025-11-29 07:19:45.507 187156 DEBUG oslo_concurrency.lockutils [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:45 np0005539504 nova_compute[187152]: 2025-11-29 07:19:45.508 187156 DEBUG oslo_concurrency.lockutils [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:45 np0005539504 nova_compute[187152]: 2025-11-29 07:19:45.508 187156 DEBUG nova.network.neutron [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:19:45 np0005539504 nova_compute[187152]: 2025-11-29 07:19:45.524 187156 DEBUG nova.compute.manager [req-7f0d28a3-4c77-44ca-a461-1de6f1c13dca req-50f2de04-2685-4af5-b083-e79d51affa63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-changed-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:45 np0005539504 nova_compute[187152]: 2025-11-29 07:19:45.525 187156 DEBUG nova.compute.manager [req-7f0d28a3-4c77-44ca-a461-1de6f1c13dca req-50f2de04-2685-4af5-b083-e79d51affa63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing instance network info cache due to event network-changed-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:19:45 np0005539504 nova_compute[187152]: 2025-11-29 07:19:45.525 187156 DEBUG oslo_concurrency.lockutils [req-7f0d28a3-4c77-44ca-a461-1de6f1c13dca req-50f2de04-2685-4af5-b083-e79d51affa63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:45 np0005539504 nova_compute[187152]: 2025-11-29 07:19:45.847 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:45 np0005539504 nova_compute[187152]: 2025-11-29 07:19:45.906 187156 WARNING nova.network.neutron [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it#033[00m
Nov 29 02:19:47 np0005539504 nova_compute[187152]: 2025-11-29 07:19:47.291 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400772.2897437, 531c3d01-115b-479d-bbdc-11e38bc8b0b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:19:47 np0005539504 nova_compute[187152]: 2025-11-29 07:19:47.292 187156 INFO nova.compute.manager [-] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:19:47 np0005539504 nova_compute[187152]: 2025-11-29 07:19:47.324 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:49 np0005539504 podman[233181]: 2025-11-29 07:19:49.757454873 +0000 UTC m=+0.093266209 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:19:50 np0005539504 nova_compute[187152]: 2025-11-29 07:19:50.850 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:52 np0005539504 nova_compute[187152]: 2025-11-29 07:19:52.325 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:52 np0005539504 nova_compute[187152]: 2025-11-29 07:19:52.858 187156 DEBUG nova.compute.manager [None req-b2597181-7815-462f-bb47-a42036e00fb7 - - - - - -] [instance: 531c3d01-115b-479d-bbdc-11e38bc8b0b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:19:54 np0005539504 nova_compute[187152]: 2025-11-29 07:19:54.896 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:55 np0005539504 podman[233201]: 2025-11-29 07:19:55.772213381 +0000 UTC m=+0.101964810 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:19:55 np0005539504 nova_compute[187152]: 2025-11-29 07:19:55.852 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:57 np0005539504 nova_compute[187152]: 2025-11-29 07:19:57.327 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:00 np0005539504 nova_compute[187152]: 2025-11-29 07:20:00.855 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:01 np0005539504 nova_compute[187152]: 2025-11-29 07:20:01.852 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:01 np0005539504 nova_compute[187152]: 2025-11-29 07:20:01.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:02 np0005539504 nova_compute[187152]: 2025-11-29 07:20:02.370 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:02 np0005539504 nova_compute[187152]: 2025-11-29 07:20:02.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.735 187156 DEBUG nova.network.neutron [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updating instance_info_cache with network_info: [{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.856 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.938 187156 DEBUG oslo_concurrency.lockutils [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.940 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.940 187156 DEBUG oslo_concurrency.lockutils [req-7f0d28a3-4c77-44ca-a461-1de6f1c13dca req-50f2de04-2685-4af5-b083-e79d51affa63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.941 187156 DEBUG nova.network.neutron [req-7f0d28a3-4c77-44ca-a461-1de6f1c13dca req-50f2de04-2685-4af5-b083-e79d51affa63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Refreshing network info cache for port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.945 187156 DEBUG nova.virt.libvirt.vif [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1725835497',display_name='tempest-tempest.common.compute-instance-1725835497',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1725835497',id=109,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:18:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-fdt4piuu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:18:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d67996f4-2f09-4188-bcf5-ae5a02b6d2d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.946 187156 DEBUG nova.network.os_vif_util [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.947 187156 DEBUG nova.network.os_vif_util [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.948 187156 DEBUG os_vif [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.948 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.949 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.949 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.953 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.953 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1f37250-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.954 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb1f37250-55, col_values=(('external_ids', {'iface-id': 'b1f37250-55e3-4fc4-a9bb-2dedac4d03f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:f7:1c', 'vm-uuid': 'd67996f4-2f09-4188-bcf5-ae5a02b6d2d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.955 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:05 np0005539504 NetworkManager[55210]: <info>  [1764400805.9572] manager: (tapb1f37250-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.959 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.963 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.965 187156 INFO os_vif [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55')#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.966 187156 DEBUG nova.virt.libvirt.vif [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1725835497',display_name='tempest-tempest.common.compute-instance-1725835497',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1725835497',id=109,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:18:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-fdt4piuu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:18:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d67996f4-2f09-4188-bcf5-ae5a02b6d2d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.967 187156 DEBUG nova.network.os_vif_util [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.967 187156 DEBUG nova.network.os_vif_util [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.975 187156 DEBUG nova.virt.libvirt.guest [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] attach device xml: <interface type="ethernet">
Nov 29 02:20:05 np0005539504 nova_compute[187152]:  <mac address="fa:16:3e:2f:f7:1c"/>
Nov 29 02:20:05 np0005539504 nova_compute[187152]:  <model type="virtio"/>
Nov 29 02:20:05 np0005539504 nova_compute[187152]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:20:05 np0005539504 nova_compute[187152]:  <mtu size="1442"/>
Nov 29 02:20:05 np0005539504 nova_compute[187152]:  <target dev="tapb1f37250-55"/>
Nov 29 02:20:05 np0005539504 nova_compute[187152]: </interface>
Nov 29 02:20:05 np0005539504 nova_compute[187152]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 29 02:20:05 np0005539504 kernel: tapb1f37250-55: entered promiscuous mode
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.991 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:05 np0005539504 NetworkManager[55210]: <info>  [1764400805.9931] manager: (tapb1f37250-55): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Nov 29 02:20:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:05Z|00388|binding|INFO|Claiming lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for this chassis.
Nov 29 02:20:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:05Z|00389|binding|INFO|b1f37250-55e3-4fc4-a9bb-2dedac4d03f5: Claiming fa:16:3e:2f:f7:1c 10.100.0.7
Nov 29 02:20:05 np0005539504 nova_compute[187152]: 2025-11-29 07:20:05.995 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:06 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:06Z|00390|binding|INFO|Setting lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 ovn-installed in OVS
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.006 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.010 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:06 np0005539504 systemd-udevd[233231]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:20:06 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:06Z|00391|binding|INFO|Setting lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 up in Southbound
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.042 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:f7:1c 10.100.0.7'], port_security=['fa:16:3e:2f:f7:1c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-636496357', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd67996f4-2f09-4188-bcf5-ae5a02b6d2d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-636496357', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '7', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.043 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 in datapath 90812230-35cb-4e21-b16b-75b900100d8b bound to our chassis#033[00m
Nov 29 02:20:06 np0005539504 NetworkManager[55210]: <info>  [1764400806.0446] device (tapb1f37250-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.044 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:20:06 np0005539504 NetworkManager[55210]: <info>  [1764400806.0462] device (tapb1f37250-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.061 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[03f78fb3-42b5-489b-bfd5-fa581ef34d12]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.093 187156 DEBUG nova.virt.libvirt.driver [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.093 187156 DEBUG nova.virt.libvirt.driver [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.094 187156 DEBUG nova.virt.libvirt.driver [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:87:29:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.094 187156 DEBUG nova.virt.libvirt.driver [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:2f:f7:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.095 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdeaf02-33b9-4377-88a0-934dd52bddd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.099 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac5008e-ddbb-4eda-8624-1c6fc707d9a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.124 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[576cded0-63ef-4248-8ffb-3cbbab6ff093]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.142 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cb89a36c-50ef-4d5d-b723-9c7b4da03e4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 619396, 'reachable_time': 23493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233239, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.156 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[99ba438a-4a78-4cf2-a973-3aba041e259a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 619410, 'tstamp': 619410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233240, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 619413, 'tstamp': 619413}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233240, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.157 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.159 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.160 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.160 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.160 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.161 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:06.161 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.274 187156 DEBUG nova.virt.libvirt.guest [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  <nova:name>tempest-tempest.common.compute-instance-1725835497</nova:name>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:20:06</nova:creationTime>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    <nova:port uuid="c67ec97e-5456-48e8-9e98-b9075cc0b2aa">
Nov 29 02:20:06 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    <nova:port uuid="b1f37250-55e3-4fc4-a9bb-2dedac4d03f5">
Nov 29 02:20:06 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:20:06 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:20:06 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:20:06 np0005539504 nova_compute[187152]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.315 187156 DEBUG oslo_concurrency.lockutils [None req-36058a71-c1de-4770-a6b0-99e892e1db7d 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 29.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.936 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.962 187156 DEBUG nova.compute.manager [req-958d52d6-e317-4720-b5cb-b47c2837e2a1 req-6a316aa1-7447-4d09-bbc4-3a03e8cff82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.963 187156 DEBUG oslo_concurrency.lockutils [req-958d52d6-e317-4720-b5cb-b47c2837e2a1 req-6a316aa1-7447-4d09-bbc4-3a03e8cff82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.963 187156 DEBUG oslo_concurrency.lockutils [req-958d52d6-e317-4720-b5cb-b47c2837e2a1 req-6a316aa1-7447-4d09-bbc4-3a03e8cff82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.963 187156 DEBUG oslo_concurrency.lockutils [req-958d52d6-e317-4720-b5cb-b47c2837e2a1 req-6a316aa1-7447-4d09-bbc4-3a03e8cff82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.964 187156 DEBUG nova.compute.manager [req-958d52d6-e317-4720-b5cb-b47c2837e2a1 req-6a316aa1-7447-4d09-bbc4-3a03e8cff82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] No waiting events found dispatching network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:06 np0005539504 nova_compute[187152]: 2025-11-29 07:20:06.964 187156 WARNING nova.compute.manager [req-958d52d6-e317-4720-b5cb-b47c2837e2a1 req-6a316aa1-7447-4d09-bbc4-3a03e8cff82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received unexpected event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:20:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:07Z|00392|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.187 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.314 187156 DEBUG oslo_concurrency.lockutils [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.314 187156 DEBUG oslo_concurrency.lockutils [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.329 187156 DEBUG nova.objects.instance [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.356 187156 DEBUG nova.virt.libvirt.vif [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1725835497',display_name='tempest-tempest.common.compute-instance-1725835497',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1725835497',id=109,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:18:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-fdt4piuu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:18:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d67996f4-2f09-4188-bcf5-ae5a02b6d2d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.357 187156 DEBUG nova.network.os_vif_util [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.357 187156 DEBUG nova.network.os_vif_util [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.361 187156 DEBUG nova.virt.libvirt.guest [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.364 187156 DEBUG nova.virt.libvirt.guest [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.366 187156 DEBUG nova.virt.libvirt.driver [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Attempting to detach device tapb1f37250-55 from instance d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.366 187156 DEBUG nova.virt.libvirt.guest [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <mac address="fa:16:3e:2f:f7:1c"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <model type="virtio"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <mtu size="1442"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <target dev="tapb1f37250-55"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: </interface>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.375 187156 DEBUG nova.virt.libvirt.guest [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.379 187156 DEBUG nova.virt.libvirt.guest [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface>not found in domain: <domain type='kvm' id='52'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <name>instance-0000006d</name>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <uuid>d67996f4-2f09-4188-bcf5-ae5a02b6d2d3</uuid>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:name>tempest-tempest.common.compute-instance-1725835497</nova:name>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:20:06</nova:creationTime>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:port uuid="c67ec97e-5456-48e8-9e98-b9075cc0b2aa">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:port uuid="b1f37250-55e3-4fc4-a9bb-2dedac4d03f5">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <memory unit='KiB'>131072</memory>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <resource>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <partition>/machine</partition>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </resource>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <sysinfo type='smbios'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='serial'>d67996f4-2f09-4188-bcf5-ae5a02b6d2d3</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='uuid'>d67996f4-2f09-4188-bcf5-ae5a02b6d2d3</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <boot dev='hd'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <smbios mode='sysinfo'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <vmcoreinfo state='on'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <feature policy='require' name='x2apic'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <feature policy='require' name='vme'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <clock offset='utc'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <timer name='hpet' present='no'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <on_reboot>restart</on_reboot>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <on_crash>destroy</on_crash>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <disk type='file' device='disk'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk' index='2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <backingStore type='file' index='3'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:        <format type='raw'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:        <backingStore/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      </backingStore>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target dev='vda' bus='virtio'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='virtio-disk0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <disk type='file' device='cdrom'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk.config' index='1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <backingStore/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target dev='sda' bus='sata'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <readonly/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='sata0-0-0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pcie.0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='1' port='0x10'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='2' port='0x11'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='3' port='0x12'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.3'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='4' port='0x13'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.4'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='5' port='0x14'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.5'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='6' port='0x15'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.6'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='7' port='0x16'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.7'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='8' port='0x17'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.8'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='9' port='0x18'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.9'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='10' port='0x19'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.10'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='11' port='0x1a'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.11'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='12' port='0x1b'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.12'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='13' port='0x1c'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.13'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='14' port='0x1d'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.14'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='15' port='0x1e'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.15'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='16' port='0x1f'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.16'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='17' port='0x20'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.17'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='18' port='0x21'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.18'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='19' port='0x22'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.19'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='20' port='0x23'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.20'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='21' port='0x24'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.21'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='22' port='0x25'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.22'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='23' port='0x26'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.23'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='24' port='0x27'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.24'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='25' port='0x28'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.25'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-pci-bridge'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.26'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='usb'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='sata' index='0'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='ide'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <interface type='ethernet'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <mac address='fa:16:3e:87:29:eb'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target dev='tapc67ec97e-54'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model type='virtio'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <mtu size='1442'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='net0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <interface type='ethernet'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <mac address='fa:16:3e:2f:f7:1c'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target dev='tapb1f37250-55'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model type='virtio'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <mtu size='1442'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='net1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <serial type='pty'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/console.log' append='off'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target type='isa-serial' port='0'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:        <model name='isa-serial'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      </target>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/console.log' append='off'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target type='serial' port='0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </console>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <input type='tablet' bus='usb'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='input0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <input type='mouse' bus='ps2'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='input1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <input type='keyboard' bus='ps2'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='input2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <listen type='address' address='::0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <audio id='1' type='none'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='video0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <watchdog model='itco' action='reset'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='watchdog0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </watchdog>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <memballoon model='virtio'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <stats period='10'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='balloon0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <rng model='virtio'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='rng0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <label>system_u:system_r:svirt_t:s0:c625,c1014</label>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c625,c1014</imagelabel>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <label>+107:+107</label>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.380 187156 INFO nova.virt.libvirt.driver [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully detached device tapb1f37250-55 from instance d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 from the persistent domain config.
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.381 187156 DEBUG nova.virt.libvirt.driver [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] (1/8): Attempting to detach device tapb1f37250-55 with device alias net1 from instance d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.381 187156 DEBUG nova.virt.libvirt.guest [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <mac address="fa:16:3e:2f:f7:1c"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <model type="virtio"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <mtu size="1442"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <target dev="tapb1f37250-55"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: </interface>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 29 02:20:07 np0005539504 kernel: tapb1f37250-55 (unregistering): left promiscuous mode
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.477 187156 DEBUG nova.network.neutron [req-7f0d28a3-4c77-44ca-a461-1de6f1c13dca req-50f2de04-2685-4af5-b083-e79d51affa63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updated VIF entry in instance network info cache for port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.478 187156 DEBUG nova.network.neutron [req-7f0d28a3-4c77-44ca-a461-1de6f1c13dca req-50f2de04-2685-4af5-b083-e79d51affa63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updating instance_info_cache with network_info: [{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:20:07 np0005539504 NetworkManager[55210]: <info>  [1764400807.4788] device (tapb1f37250-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.486 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:07Z|00393|binding|INFO|Releasing lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 from this chassis (sb_readonly=0)
Nov 29 02:20:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:07Z|00394|binding|INFO|Setting lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 down in Southbound
Nov 29 02:20:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:07Z|00395|binding|INFO|Removing iface tapb1f37250-55 ovn-installed in OVS
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.488 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.490 187156 DEBUG nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Received event <DeviceRemovedEvent: 1764400807.4898002, d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.492 187156 DEBUG nova.virt.libvirt.driver [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Start waiting for the detach event from libvirt for device tapb1f37250-55 with device alias net1 for instance d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.493 187156 DEBUG nova.virt.libvirt.guest [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.495 187156 DEBUG oslo_concurrency.lockutils [req-7f0d28a3-4c77-44ca-a461-1de6f1c13dca req-50f2de04-2685-4af5-b083-e79d51affa63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.495 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:f7:1c 10.100.0.7'], port_security=['fa:16:3e:2f:f7:1c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-636496357', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd67996f4-2f09-4188-bcf5-ae5a02b6d2d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-636496357', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '9', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.497 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.498 187156 DEBUG nova.virt.libvirt.guest [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface>not found in domain: <domain type='kvm' id='52'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <name>instance-0000006d</name>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <uuid>d67996f4-2f09-4188-bcf5-ae5a02b6d2d3</uuid>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:name>tempest-tempest.common.compute-instance-1725835497</nova:name>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:20:06</nova:creationTime>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:port uuid="c67ec97e-5456-48e8-9e98-b9075cc0b2aa">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:port uuid="b1f37250-55e3-4fc4-a9bb-2dedac4d03f5">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <memory unit='KiB'>131072</memory>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <resource>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <partition>/machine</partition>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </resource>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <sysinfo type='smbios'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='serial'>d67996f4-2f09-4188-bcf5-ae5a02b6d2d3</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='uuid'>d67996f4-2f09-4188-bcf5-ae5a02b6d2d3</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <boot dev='hd'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <smbios mode='sysinfo'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <vmcoreinfo state='on'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <feature policy='require' name='x2apic'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <feature policy='require' name='vme'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <clock offset='utc'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <timer name='hpet' present='no'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <on_reboot>restart</on_reboot>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <on_crash>destroy</on_crash>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <disk type='file' device='disk'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk' index='2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <backingStore type='file' index='3'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:        <format type='raw'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:        <backingStore/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      </backingStore>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target dev='vda' bus='virtio'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='virtio-disk0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <disk type='file' device='cdrom'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/disk.config' index='1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <backingStore/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target dev='sda' bus='sata'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <readonly/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='sata0-0-0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pcie.0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='1' port='0x10'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='2' port='0x11'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='3' port='0x12'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.3'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='4' port='0x13'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.4'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='5' port='0x14'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.5'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='6' port='0x15'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.6'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='7' port='0x16'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.7'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='8' port='0x17'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.8'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='9' port='0x18'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.9'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='10' port='0x19'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.10'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='11' port='0x1a'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.11'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='12' port='0x1b'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.12'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='13' port='0x1c'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.13'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='14' port='0x1d'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.14'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='15' port='0x1e'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.15'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='16' port='0x1f'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.16'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='17' port='0x20'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.17'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='18' port='0x21'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.18'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='19' port='0x22'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.19'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='20' port='0x23'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.20'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='21' port='0x24'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.21'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='22' port='0x25'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.22'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='23' port='0x26'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.23'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='24' port='0x27'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.24'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target chassis='25' port='0x28'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.25'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model name='pcie-pci-bridge'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='pci.26'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='usb'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <controller type='sata' index='0'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='ide'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <interface type='ethernet'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <mac address='fa:16:3e:87:29:eb'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target dev='tapc67ec97e-54'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model type='virtio'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <mtu size='1442'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='net0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <serial type='pty'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/console.log' append='off'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target type='isa-serial' port='0'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:        <model name='isa-serial'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      </target>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3/console.log' append='off'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <target type='serial' port='0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </console>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <input type='tablet' bus='usb'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='input0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <input type='mouse' bus='ps2'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='input1'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <input type='keyboard' bus='ps2'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='input2'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <listen type='address' address='::0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <audio id='1' type='none'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='video0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <watchdog model='itco' action='reset'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='watchdog0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </watchdog>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <memballoon model='virtio'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <stats period='10'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='balloon0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <rng model='virtio'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <alias name='rng0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <label>system_u:system_r:svirt_t:s0:c625,c1014</label>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c625,c1014</imagelabel>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <label>+107:+107</label>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.500 187156 INFO nova.virt.libvirt.driver [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully detached device tapb1f37250-55 from instance d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 from the live domain config.#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.499 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.501 187156 DEBUG nova.virt.libvirt.vif [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1725835497',display_name='tempest-tempest.common.compute-instance-1725835497',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1725835497',id=109,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:18:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-fdt4piuu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:18:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d67996f4-2f09-4188-bcf5-ae5a02b6d2d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.501 187156 DEBUG nova.network.os_vif_util [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.502 187156 DEBUG nova.network.os_vif_util [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.502 187156 DEBUG os_vif [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.504 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.504 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1f37250-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.505 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.506 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.508 187156 INFO os_vif [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55')#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.509 187156 DEBUG nova.virt.libvirt.guest [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:name>tempest-tempest.common.compute-instance-1725835497</nova:name>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:20:07</nova:creationTime>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    <nova:port uuid="c67ec97e-5456-48e8-9e98-b9075cc0b2aa">
Nov 29 02:20:07 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:20:07 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:20:07 np0005539504 nova_compute[187152]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.525 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a629b1bd-1255-432f-96fe-f030a5590af2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.560 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[30d2fee1-9abe-4e8e-8bee-ee34e21d5737]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.564 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[739b6ca1-a873-43e7-beb4-f20aa90c40bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.586 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ff3a6192-855b-450f-a89c-2e674d0295cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.606 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4152e6ad-c025-4fe4-91a4-0cfbb32b8734]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 619396, 'reachable_time': 23493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233250, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.628 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cd3b8678-9a02-4c67-9e0c-3b00dfadf88e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 619410, 'tstamp': 619410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233251, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 619413, 'tstamp': 619413}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233251, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.629 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:07 np0005539504 nova_compute[187152]: 2025-11-29 07:20:07.631 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.632 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.632 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.633 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:07.633 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:08 np0005539504 podman[233256]: 2025-11-29 07:20:08.746199619 +0000 UTC m=+0.066145292 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:20:08 np0005539504 podman[233255]: 2025-11-29 07:20:08.754977081 +0000 UTC m=+0.073126196 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Nov 29 02:20:08 np0005539504 podman[233254]: 2025-11-29 07:20:08.772512806 +0000 UTC m=+0.091885674 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:20:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:08.796 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:08.797 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:20:08 np0005539504 nova_compute[187152]: 2025-11-29 07:20:08.797 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:08 np0005539504 nova_compute[187152]: 2025-11-29 07:20:08.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.054 187156 DEBUG nova.compute.manager [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.054 187156 DEBUG oslo_concurrency.lockutils [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.055 187156 DEBUG oslo_concurrency.lockutils [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.055 187156 DEBUG oslo_concurrency.lockutils [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.055 187156 DEBUG nova.compute.manager [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] No waiting events found dispatching network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.055 187156 WARNING nova.compute.manager [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received unexpected event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.056 187156 DEBUG nova.compute.manager [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-unplugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.056 187156 DEBUG oslo_concurrency.lockutils [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.056 187156 DEBUG oslo_concurrency.lockutils [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.056 187156 DEBUG oslo_concurrency.lockutils [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.057 187156 DEBUG nova.compute.manager [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] No waiting events found dispatching network-vif-unplugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.057 187156 WARNING nova.compute.manager [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received unexpected event network-vif-unplugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.057 187156 DEBUG nova.compute.manager [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.057 187156 DEBUG oslo_concurrency.lockutils [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.058 187156 DEBUG oslo_concurrency.lockutils [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.058 187156 DEBUG oslo_concurrency.lockutils [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.058 187156 DEBUG nova.compute.manager [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] No waiting events found dispatching network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.058 187156 WARNING nova.compute.manager [req-2305ec49-a51c-44f3-91db-fdf2718b22c1 req-1e3eddf0-f09a-4770-935b-8a7e9916edcc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received unexpected event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.167 187156 DEBUG oslo_concurrency.lockutils [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.168 187156 DEBUG oslo_concurrency.lockutils [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:09 np0005539504 nova_compute[187152]: 2025-11-29 07:20:09.168 187156 DEBUG nova.network.neutron [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.235 187156 DEBUG oslo_concurrency.lockutils [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.235 187156 DEBUG oslo_concurrency.lockutils [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.236 187156 DEBUG oslo_concurrency.lockutils [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.236 187156 DEBUG oslo_concurrency.lockutils [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.237 187156 DEBUG oslo_concurrency.lockutils [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.253 187156 INFO nova.compute.manager [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Terminating instance#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.269 187156 DEBUG nova.compute.manager [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:20:10 np0005539504 kernel: tapc67ec97e-54 (unregistering): left promiscuous mode
Nov 29 02:20:10 np0005539504 NetworkManager[55210]: <info>  [1764400810.2963] device (tapc67ec97e-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:20:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:10Z|00396|binding|INFO|Releasing lport c67ec97e-5456-48e8-9e98-b9075cc0b2aa from this chassis (sb_readonly=0)
Nov 29 02:20:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:10Z|00397|binding|INFO|Setting lport c67ec97e-5456-48e8-9e98-b9075cc0b2aa down in Southbound
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.323 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:10Z|00398|binding|INFO|Removing iface tapc67ec97e-54 ovn-installed in OVS
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.328 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:10.340 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:29:eb 10.100.0.5'], port_security=['fa:16:3e:87:29:eb 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd67996f4-2f09-4188-bcf5-ae5a02b6d2d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a81715ba-eace-471d-9f71-9964fcbf6d85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=c67ec97e-5456-48e8-9e98-b9075cc0b2aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.342 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:10.342 104164 INFO neutron.agent.ovn.metadata.agent [-] Port c67ec97e-5456-48e8-9e98-b9075cc0b2aa in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis#033[00m
Nov 29 02:20:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:10.344 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90812230-35cb-4e21-b16b-75b900100d8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:20:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:10.345 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b3421c4a-da15-464f-8ba4-e970da31a94b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:10.345 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b namespace which is not needed anymore#033[00m
Nov 29 02:20:10 np0005539504 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Nov 29 02:20:10 np0005539504 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006d.scope: Consumed 16.056s CPU time.
Nov 29 02:20:10 np0005539504 systemd-machined[153423]: Machine qemu-52-instance-0000006d terminated.
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.488 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.492 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.527 187156 INFO nova.virt.libvirt.driver [-] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Instance destroyed successfully.#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.528 187156 DEBUG nova.objects.instance [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'resources' on Instance uuid d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.859 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539504 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232790]: [NOTICE]   (232795) : haproxy version is 2.8.14-c23fe91
Nov 29 02:20:10 np0005539504 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232790]: [NOTICE]   (232795) : path to executable is /usr/sbin/haproxy
Nov 29 02:20:10 np0005539504 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232790]: [WARNING]  (232795) : Exiting Master process...
Nov 29 02:20:10 np0005539504 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232790]: [WARNING]  (232795) : Exiting Master process...
Nov 29 02:20:10 np0005539504 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232790]: [ALERT]    (232795) : Current worker (232797) exited with code 143 (Terminated)
Nov 29 02:20:10 np0005539504 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232790]: [WARNING]  (232795) : All workers exited. Exiting... (0)
Nov 29 02:20:10 np0005539504 systemd[1]: libpod-8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827.scope: Deactivated successfully.
Nov 29 02:20:10 np0005539504 podman[233354]: 2025-11-29 07:20:10.909583483 +0000 UTC m=+0.473037473 container died 8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.991 187156 DEBUG nova.virt.libvirt.vif [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1725835497',display_name='tempest-tempest.common.compute-instance-1725835497',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1725835497',id=109,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:18:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-fdt4piuu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:18:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d67996f4-2f09-4188-bcf5-ae5a02b6d2d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.991 187156 DEBUG nova.network.os_vif_util [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.992 187156 DEBUG nova.network.os_vif_util [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:29:eb,bridge_name='br-int',has_traffic_filtering=True,id=c67ec97e-5456-48e8-9e98-b9075cc0b2aa,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc67ec97e-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.992 187156 DEBUG os_vif [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:29:eb,bridge_name='br-int',has_traffic_filtering=True,id=c67ec97e-5456-48e8-9e98-b9075cc0b2aa,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc67ec97e-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.993 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.993 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc67ec97e-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.995 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.996 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.999 187156 INFO os_vif [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:29:eb,bridge_name='br-int',has_traffic_filtering=True,id=c67ec97e-5456-48e8-9e98-b9075cc0b2aa,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc67ec97e-54')#033[00m
Nov 29 02:20:10 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.999 187156 DEBUG nova.virt.libvirt.vif [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1725835497',display_name='tempest-tempest.common.compute-instance-1725835497',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1725835497',id=109,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:18:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-fdt4piuu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:18:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d67996f4-2f09-4188-bcf5-ae5a02b6d2d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:10.999 187156 DEBUG nova.network.os_vif_util [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.000 187156 DEBUG nova.network.os_vif_util [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.000 187156 DEBUG os_vif [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.001 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.001 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1f37250-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.001 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.002 187156 INFO os_vif [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55')#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.003 187156 INFO nova.virt.libvirt.driver [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Deleting instance files /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3_del#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.003 187156 INFO nova.virt.libvirt.driver [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Deletion of /var/lib/nova/instances/d67996f4-2f09-4188-bcf5-ae5a02b6d2d3_del complete#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.506 187156 DEBUG nova.compute.manager [req-0e4316cb-628b-45ad-bc69-4303d49bf35b req-b5a2a143-b3ad-482e-8ad4-1f5fa1bca640 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-unplugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.507 187156 DEBUG oslo_concurrency.lockutils [req-0e4316cb-628b-45ad-bc69-4303d49bf35b req-b5a2a143-b3ad-482e-8ad4-1f5fa1bca640 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.507 187156 DEBUG oslo_concurrency.lockutils [req-0e4316cb-628b-45ad-bc69-4303d49bf35b req-b5a2a143-b3ad-482e-8ad4-1f5fa1bca640 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.507 187156 DEBUG oslo_concurrency.lockutils [req-0e4316cb-628b-45ad-bc69-4303d49bf35b req-b5a2a143-b3ad-482e-8ad4-1f5fa1bca640 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.508 187156 DEBUG nova.compute.manager [req-0e4316cb-628b-45ad-bc69-4303d49bf35b req-b5a2a143-b3ad-482e-8ad4-1f5fa1bca640 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] No waiting events found dispatching network-vif-unplugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.508 187156 DEBUG nova.compute.manager [req-0e4316cb-628b-45ad-bc69-4303d49bf35b req-b5a2a143-b3ad-482e-8ad4-1f5fa1bca640 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-unplugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.528 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:11 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827-userdata-shm.mount: Deactivated successfully.
Nov 29 02:20:11 np0005539504 systemd[1]: var-lib-containers-storage-overlay-bdffdaca01b290de25ba2238b9e7493877e1bcf6302a7467d4680ab84a49678f-merged.mount: Deactivated successfully.
Nov 29 02:20:11 np0005539504 podman[233354]: 2025-11-29 07:20:11.713481461 +0000 UTC m=+1.276935481 container cleanup 8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.746 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.746 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.746 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.746 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:20:11 np0005539504 podman[233386]: 2025-11-29 07:20:11.769435422 +0000 UTC m=+0.829083656 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.789 187156 INFO nova.compute.manager [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Took 1.52 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.791 187156 DEBUG oslo.service.loopingcall [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.825 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.826 187156 DEBUG nova.compute.manager [-] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.826 187156 DEBUG nova.network.neutron [-] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:20:11 np0005539504 podman[233392]: 2025-11-29 07:20:11.831998879 +0000 UTC m=+0.890688647 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:20:11 np0005539504 systemd[1]: libpod-conmon-8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827.scope: Deactivated successfully.
Nov 29 02:20:11 np0005539504 podman[233426]: 2025-11-29 07:20:11.843701178 +0000 UTC m=+0.087389164 container remove 8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.848 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d973bb26-cc3e-405e-beb2-e48c954a73d0]: (4, ('Sat Nov 29 07:20:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b (8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827)\n8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827\nSat Nov 29 07:20:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b (8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827)\n8f5e547d947ffa86f46ee4d4e908818b376e077456dee79e5d816219dc6bf827\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.850 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2d053de3-1ff8-470e-8707-c31d3a327646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.851 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.853 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:11 np0005539504 kernel: tap90812230-30: left promiscuous mode
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.866 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.868 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Error from libvirt while getting description of instance-0000006d: [Error Code 42] Domain not found: no domain with matching uuid 'd67996f4-2f09-4188-bcf5-ae5a02b6d2d3' (instance-0000006d): libvirt.libvirtError: Domain not found: no domain with matching uuid 'd67996f4-2f09-4188-bcf5-ae5a02b6d2d3' (instance-0000006d)#033[00m
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.869 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[21100993-f97e-4769-b70c-b5f08da3ce3d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.886 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[69114070-11f4-473a-8b4b-18ab926d13b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.887 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3bdeef1a-4856-4601-9ab4-298d20b8875c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.901 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1571e22d-7165-4f56-b284-da4e4b22d1d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 619389, 'reachable_time': 23061, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233468, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.903 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:20:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:11.904 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[dc67592a-51a6-477f-8a7c-bcadb495c649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:11 np0005539504 systemd[1]: run-netns-ovnmeta\x2d90812230\x2d35cb\x2d4e21\x2db16b\x2d75b900100d8b.mount: Deactivated successfully.
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.983 187156 INFO nova.network.neutron [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 29 02:20:11 np0005539504 nova_compute[187152]: 2025-11-29 07:20:11.984 187156 DEBUG nova.network.neutron [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updating instance_info_cache with network_info: [{"id": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "address": "fa:16:3e:87:29:eb", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc67ec97e-54", "ovs_interfaceid": "c67ec97e-5456-48e8-9e98-b9075cc0b2aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.007 187156 DEBUG oslo_concurrency.lockutils [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.039 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.042 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5676MB free_disk=73.19247055053711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.043 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.043 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.045 187156 DEBUG oslo_concurrency.lockutils [None req-210379fd-05a5-40fc-841e-ecf229a20b76 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.141 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.142 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.143 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.206 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.229 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.260 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.260 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.668 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.669 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.849 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9907#033[00m
Nov 29 02:20:12 np0005539504 nova_compute[187152]: 2025-11-29 07:20:12.850 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.006 187156 DEBUG nova.compute.manager [req-3a9ed461-2108-48dd-9087-86fbb3b0b527 req-c97fb4e5-c1ba-46c4-877b-3f5ac4f85240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-plugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.007 187156 DEBUG oslo_concurrency.lockutils [req-3a9ed461-2108-48dd-9087-86fbb3b0b527 req-c97fb4e5-c1ba-46c4-877b-3f5ac4f85240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.008 187156 DEBUG oslo_concurrency.lockutils [req-3a9ed461-2108-48dd-9087-86fbb3b0b527 req-c97fb4e5-c1ba-46c4-877b-3f5ac4f85240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.008 187156 DEBUG oslo_concurrency.lockutils [req-3a9ed461-2108-48dd-9087-86fbb3b0b527 req-c97fb4e5-c1ba-46c4-877b-3f5ac4f85240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.009 187156 DEBUG nova.compute.manager [req-3a9ed461-2108-48dd-9087-86fbb3b0b527 req-c97fb4e5-c1ba-46c4-877b-3f5ac4f85240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] No waiting events found dispatching network-vif-plugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.009 187156 WARNING nova.compute.manager [req-3a9ed461-2108-48dd-9087-86fbb3b0b527 req-c97fb4e5-c1ba-46c4-877b-3f5ac4f85240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received unexpected event network-vif-plugged-c67ec97e-5456-48e8-9e98-b9075cc0b2aa for instance with vm_state active and task_state deleting.
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.388 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquiring lock "02495078-f8df-4710-88ff-c7f6ffd03a8b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.390 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "02495078-f8df-4710-88ff-c7f6ffd03a8b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.425 187156 DEBUG nova.network.neutron [-] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.428 187156 DEBUG nova.compute.manager [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.476 187156 INFO nova.compute.manager [-] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Took 2.65 seconds to deallocate network for instance.
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.556 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.557 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.563 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.564 187156 INFO nova.compute.claims [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Claim successful on node compute-1.ctlplane.example.com
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.588 187156 DEBUG oslo_concurrency.lockutils [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.604 187156 DEBUG nova.compute.manager [req-670ac4ce-5bfd-4faa-a874-67465183eea4 req-cbd76889-cacb-4e94-8223-358969422e75 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Received event network-vif-deleted-c67ec97e-5456-48e8-9e98-b9075cc0b2aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.755 187156 DEBUG nova.compute.provider_tree [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.771 187156 DEBUG nova.scheduler.client.report [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.800 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.801 187156 DEBUG nova.compute.manager [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.805 187156 DEBUG oslo_concurrency.lockutils [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.885 187156 DEBUG nova.compute.manager [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.908 187156 INFO nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.927 187156 DEBUG nova.compute.manager [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.935 187156 DEBUG nova.compute.provider_tree [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.951 187156 DEBUG nova.scheduler.client.report [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:20:14 np0005539504 nova_compute[187152]: 2025-11-29 07:20:14.991 187156 DEBUG oslo_concurrency.lockutils [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.016 187156 INFO nova.scheduler.client.report [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Deleted allocations for instance d67996f4-2f09-4188-bcf5-ae5a02b6d2d3
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.092 187156 DEBUG nova.compute.manager [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.094 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.094 187156 INFO nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Creating image(s)
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.095 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquiring lock "/var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.095 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "/var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.096 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "/var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.110 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.151 187156 DEBUG oslo_concurrency.lockutils [None req-884f2725-3495-40fe-9f95-a3fbefd2dbfe 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d67996f4-2f09-4188-bcf5-ae5a02b6d2d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.915s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.170 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.170 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.171 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.182 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.248 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.250 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.289 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.290 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.291 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.351 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.352 187156 DEBUG nova.virt.disk.api [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Checking if we can resize image /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.353 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.411 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.412 187156 DEBUG nova.virt.disk.api [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Cannot resize image /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.412 187156 DEBUG nova.objects.instance [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lazy-loading 'migration_context' on Instance uuid 02495078-f8df-4710-88ff-c7f6ffd03a8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.441 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.443 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Ensure instance console log exists: /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.444 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.444 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.445 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.447 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.452 187156 WARNING nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.457 187156 DEBUG nova.virt.libvirt.host [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.457 187156 DEBUG nova.virt.libvirt.host [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.460 187156 DEBUG nova.virt.libvirt.host [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.461 187156 DEBUG nova.virt.libvirt.host [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.462 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.462 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.463 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.463 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.463 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.464 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.464 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.464 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.464 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.464 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.465 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.465 187156 DEBUG nova.virt.hardware [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.469 187156 DEBUG nova.objects.instance [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 02495078-f8df-4710-88ff-c7f6ffd03a8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.485 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <uuid>02495078-f8df-4710-88ff-c7f6ffd03a8b</uuid>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <name>instance-0000006f</name>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServersAaction247Test-server-1512677640</nova:name>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:20:15</nova:creationTime>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:        <nova:user uuid="aee87ea3204d40ca8ec3c43cc58518f9">tempest-ServersAaction247Test-305933376-project-member</nova:user>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:        <nova:project uuid="6a27a736bec54d6dab1ee70bb0c320a1">tempest-ServersAaction247Test-305933376</nova:project>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <entry name="serial">02495078-f8df-4710-88ff-c7f6ffd03a8b</entry>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <entry name="uuid">02495078-f8df-4710-88ff-c7f6ffd03a8b</entry>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk.config"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/console.log" append="off"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:20:15 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:20:15 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:20:15 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:20:15 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.548 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.548 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.550 187156 INFO nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Using config drive#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.719 187156 INFO nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Creating config drive at /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk.config#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.726 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbv8sjubb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.852 187156 DEBUG oslo_concurrency.processutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbv8sjubb" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.861 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:15 np0005539504 systemd-machined[153423]: New machine qemu-53-instance-0000006f.
Nov 29 02:20:15 np0005539504 systemd[1]: Started Virtual Machine qemu-53-instance-0000006f.
Nov 29 02:20:15 np0005539504 nova_compute[187152]: 2025-11-29 07:20:15.995 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.673 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400816.6733003, 02495078-f8df-4710-88ff-c7f6ffd03a8b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.674 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.678 187156 DEBUG nova.compute.manager [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.678 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.685 187156 INFO nova.virt.libvirt.driver [-] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Instance spawned successfully.#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.685 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.709 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.715 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.718 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.719 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.719 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.719 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.720 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.720 187156 DEBUG nova.virt.libvirt.driver [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.753 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.753 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400816.674147, 02495078-f8df-4710-88ff-c7f6ffd03a8b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.753 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] VM Started (Lifecycle Event)#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.782 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.785 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.898 187156 INFO nova.compute.manager [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Took 1.81 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.899 187156 DEBUG nova.compute.manager [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:16 np0005539504 nova_compute[187152]: 2025-11-29 07:20:16.901 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:20:17 np0005539504 nova_compute[187152]: 2025-11-29 07:20:17.030 187156 INFO nova.compute.manager [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Took 2.51 seconds to build instance.#033[00m
Nov 29 02:20:17 np0005539504 nova_compute[187152]: 2025-11-29 07:20:17.054 187156 DEBUG oslo_concurrency.lockutils [None req-812f6887-6f16-4034-a199-b99b12518073 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "02495078-f8df-4710-88ff-c7f6ffd03a8b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.141 187156 DEBUG nova.compute.manager [None req-d34a655a-bc2a-439e-ac02-3428c5c843e6 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.207 187156 INFO nova.compute.manager [None req-d34a655a-bc2a-439e-ac02-3428c5c843e6 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] instance snapshotting#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.208 187156 DEBUG nova.objects.instance [None req-d34a655a-bc2a-439e-ac02-3428c5c843e6 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lazy-loading 'flavor' on Instance uuid 02495078-f8df-4710-88ff-c7f6ffd03a8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.299 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquiring lock "02495078-f8df-4710-88ff-c7f6ffd03a8b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.300 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "02495078-f8df-4710-88ff-c7f6ffd03a8b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.300 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquiring lock "02495078-f8df-4710-88ff-c7f6ffd03a8b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.300 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "02495078-f8df-4710-88ff-c7f6ffd03a8b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.301 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "02495078-f8df-4710-88ff-c7f6ffd03a8b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.314 187156 INFO nova.compute.manager [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Terminating instance#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.326 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquiring lock "refresh_cache-02495078-f8df-4710-88ff-c7f6ffd03a8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.327 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquired lock "refresh_cache-02495078-f8df-4710-88ff-c7f6ffd03a8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.327 187156 DEBUG nova.network.neutron [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.484 187156 INFO nova.virt.libvirt.driver [None req-d34a655a-bc2a-439e-ac02-3428c5c843e6 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Beginning live snapshot process#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.488 187156 DEBUG nova.network.neutron [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.558 187156 DEBUG nova.compute.manager [None req-d34a655a-bc2a-439e-ac02-3428c5c843e6 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.791 187156 DEBUG nova.network.neutron [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.806 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Releasing lock "refresh_cache-02495078-f8df-4710-88ff-c7f6ffd03a8b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:19 np0005539504 nova_compute[187152]: 2025-11-29 07:20:19.807 187156 DEBUG nova.compute.manager [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:20:19 np0005539504 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Nov 29 02:20:19 np0005539504 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006f.scope: Consumed 3.904s CPU time.
Nov 29 02:20:19 np0005539504 systemd-machined[153423]: Machine qemu-53-instance-0000006f terminated.
Nov 29 02:20:19 np0005539504 podman[233511]: 2025-11-29 07:20:19.929471897 +0000 UTC m=+0.073666901 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:20:20 np0005539504 nova_compute[187152]: 2025-11-29 07:20:20.048 187156 INFO nova.virt.libvirt.driver [-] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Instance destroyed successfully.#033[00m
Nov 29 02:20:20 np0005539504 nova_compute[187152]: 2025-11-29 07:20:20.049 187156 DEBUG nova.objects.instance [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lazy-loading 'resources' on Instance uuid 02495078-f8df-4710-88ff-c7f6ffd03a8b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:20 np0005539504 nova_compute[187152]: 2025-11-29 07:20:20.864 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:20 np0005539504 nova_compute[187152]: 2025-11-29 07:20:20.997 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:21 np0005539504 nova_compute[187152]: 2025-11-29 07:20:21.154 187156 INFO nova.virt.libvirt.driver [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Deleting instance files /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b_del#033[00m
Nov 29 02:20:21 np0005539504 nova_compute[187152]: 2025-11-29 07:20:21.156 187156 INFO nova.virt.libvirt.driver [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Deletion of /var/lib/nova/instances/02495078-f8df-4710-88ff-c7f6ffd03a8b_del complete#033[00m
Nov 29 02:20:21 np0005539504 nova_compute[187152]: 2025-11-29 07:20:21.524 187156 DEBUG nova.compute.manager [None req-d34a655a-bc2a-439e-ac02-3428c5c843e6 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Nov 29 02:20:21 np0005539504 nova_compute[187152]: 2025-11-29 07:20:21.562 187156 INFO nova.compute.manager [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Took 1.76 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:20:21 np0005539504 nova_compute[187152]: 2025-11-29 07:20:21.564 187156 DEBUG oslo.service.loopingcall [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:20:21 np0005539504 nova_compute[187152]: 2025-11-29 07:20:21.564 187156 DEBUG nova.compute.manager [-] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:20:21 np0005539504 nova_compute[187152]: 2025-11-29 07:20:21.564 187156 DEBUG nova.network.neutron [-] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.046 187156 DEBUG nova.network.neutron [-] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.061 187156 DEBUG nova.network.neutron [-] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.074 187156 INFO nova.compute.manager [-] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Took 0.51 seconds to deallocate network for instance.#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.145 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.146 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.233 187156 DEBUG nova.compute.provider_tree [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.247 187156 DEBUG nova.scheduler.client.report [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.412 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.435 187156 INFO nova.scheduler.client.report [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Deleted allocations for instance 02495078-f8df-4710-88ff-c7f6ffd03a8b#033[00m
Nov 29 02:20:22 np0005539504 nova_compute[187152]: 2025-11-29 07:20:22.560 187156 DEBUG oslo_concurrency.lockutils [None req-f2c9e95b-3924-48fd-b0fa-9422ad40b9b0 aee87ea3204d40ca8ec3c43cc58518f9 6a27a736bec54d6dab1ee70bb0c320a1 - - default default] Lock "02495078-f8df-4710-88ff-c7f6ffd03a8b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:22.959 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:22.960 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:22.960 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:25 np0005539504 nova_compute[187152]: 2025-11-29 07:20:25.526 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400810.5258458, d67996f4-2f09-4188-bcf5-ae5a02b6d2d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:25 np0005539504 nova_compute[187152]: 2025-11-29 07:20:25.527 187156 INFO nova.compute.manager [-] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:20:25 np0005539504 nova_compute[187152]: 2025-11-29 07:20:25.545 187156 DEBUG nova.compute.manager [None req-357ddb45-1cd0-46af-a9d6-af57a761aee9 - - - - - -] [instance: d67996f4-2f09-4188-bcf5-ae5a02b6d2d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:25 np0005539504 nova_compute[187152]: 2025-11-29 07:20:25.866 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:26 np0005539504 nova_compute[187152]: 2025-11-29 07:20:26.000 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:26 np0005539504 podman[233542]: 2025-11-29 07:20:26.730719224 +0000 UTC m=+0.072307515 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 29 02:20:30 np0005539504 nova_compute[187152]: 2025-11-29 07:20:30.868 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:31 np0005539504 nova_compute[187152]: 2025-11-29 07:20:31.002 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:31 np0005539504 nova_compute[187152]: 2025-11-29 07:20:31.409 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:31 np0005539504 nova_compute[187152]: 2025-11-29 07:20:31.612 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:35 np0005539504 nova_compute[187152]: 2025-11-29 07:20:35.048 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400820.0465255, 02495078-f8df-4710-88ff-c7f6ffd03a8b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:35 np0005539504 nova_compute[187152]: 2025-11-29 07:20:35.049 187156 INFO nova.compute.manager [-] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:20:35 np0005539504 nova_compute[187152]: 2025-11-29 07:20:35.080 187156 DEBUG nova.compute.manager [None req-fc02e04b-e7bf-41a0-8879-9b8756bea48f - - - - - -] [instance: 02495078-f8df-4710-88ff-c7f6ffd03a8b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:35 np0005539504 nova_compute[187152]: 2025-11-29 07:20:35.869 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:36 np0005539504 nova_compute[187152]: 2025-11-29 07:20:36.004 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:39 np0005539504 podman[233563]: 2025-11-29 07:20:39.722726049 +0000 UTC m=+0.061999702 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:20:39 np0005539504 podman[233565]: 2025-11-29 07:20:39.743420096 +0000 UTC m=+0.065753071 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:20:39 np0005539504 podman[233564]: 2025-11-29 07:20:39.764479714 +0000 UTC m=+0.098433326 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:20:40 np0005539504 nova_compute[187152]: 2025-11-29 07:20:40.871 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:41 np0005539504 nova_compute[187152]: 2025-11-29 07:20:41.007 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:42 np0005539504 podman[233626]: 2025-11-29 07:20:42.703582841 +0000 UTC m=+0.051008130 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:20:42 np0005539504 podman[233627]: 2025-11-29 07:20:42.775296369 +0000 UTC m=+0.113073234 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:20:45 np0005539504 nova_compute[187152]: 2025-11-29 07:20:45.872 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:46 np0005539504 nova_compute[187152]: 2025-11-29 07:20:46.008 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.970 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.972 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.972 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.972 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.972 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.972 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.973 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.973 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.973 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.973 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.973 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.973 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.974 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.974 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.974 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.974 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.974 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.974 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.974 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:20:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.308 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "01e083b9-6c94-4591-acac-3307cbb301c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.309 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.351 187156 DEBUG nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.487 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.488 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.497 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.498 187156 INFO nova.compute.claims [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.621 187156 DEBUG nova.compute.provider_tree [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.637 187156 DEBUG nova.scheduler.client.report [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.656 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.657 187156 DEBUG nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.719 187156 DEBUG nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.719 187156 DEBUG nova.network.neutron [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.738 187156 INFO nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:20:48 np0005539504 nova_compute[187152]: 2025-11-29 07:20:48.807 187156 DEBUG nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.083 187156 DEBUG nova.policy [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fc775c91abe1474384c43851c4d778ac', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'be19c52d747e41d1b61aa112facb4cfd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.579 187156 DEBUG nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.580 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.580 187156 INFO nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Creating image(s)#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.581 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "/var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.581 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "/var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.582 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "/var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.593 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.683 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.684 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.685 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.700 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.773 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.774 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.822 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.823 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.823 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.883 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.884 187156 DEBUG nova.virt.disk.api [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Checking if we can resize image /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.885 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.943 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.945 187156 DEBUG nova.virt.disk.api [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Cannot resize image /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.945 187156 DEBUG nova.objects.instance [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lazy-loading 'migration_context' on Instance uuid 01e083b9-6c94-4591-acac-3307cbb301c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.966 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.967 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Ensure instance console log exists: /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.967 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.968 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:49 np0005539504 nova_compute[187152]: 2025-11-29 07:20:49.968 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:50 np0005539504 nova_compute[187152]: 2025-11-29 07:20:50.000 187156 DEBUG nova.network.neutron [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Successfully created port: 6a63e05b-f75c-43e1-a545-b83b9813f5f8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:20:50 np0005539504 podman[233689]: 2025-11-29 07:20:50.722623722 +0000 UTC m=+0.065143155 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:20:50 np0005539504 nova_compute[187152]: 2025-11-29 07:20:50.875 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:51 np0005539504 nova_compute[187152]: 2025-11-29 07:20:51.010 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:51 np0005539504 nova_compute[187152]: 2025-11-29 07:20:51.275 187156 DEBUG nova.network.neutron [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Successfully updated port: 6a63e05b-f75c-43e1-a545-b83b9813f5f8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:20:51 np0005539504 nova_compute[187152]: 2025-11-29 07:20:51.294 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "refresh_cache-01e083b9-6c94-4591-acac-3307cbb301c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:51 np0005539504 nova_compute[187152]: 2025-11-29 07:20:51.294 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquired lock "refresh_cache-01e083b9-6c94-4591-acac-3307cbb301c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:51 np0005539504 nova_compute[187152]: 2025-11-29 07:20:51.295 187156 DEBUG nova.network.neutron [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:20:51 np0005539504 nova_compute[187152]: 2025-11-29 07:20:51.375 187156 DEBUG nova.compute.manager [req-e633d46d-7935-43da-8a24-03c6bc5274c2 req-157f74f7-8237-4d8a-a556-40c5e9b159d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Received event network-changed-6a63e05b-f75c-43e1-a545-b83b9813f5f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:51 np0005539504 nova_compute[187152]: 2025-11-29 07:20:51.376 187156 DEBUG nova.compute.manager [req-e633d46d-7935-43da-8a24-03c6bc5274c2 req-157f74f7-8237-4d8a-a556-40c5e9b159d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Refreshing instance network info cache due to event network-changed-6a63e05b-f75c-43e1-a545-b83b9813f5f8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:20:51 np0005539504 nova_compute[187152]: 2025-11-29 07:20:51.376 187156 DEBUG oslo_concurrency.lockutils [req-e633d46d-7935-43da-8a24-03c6bc5274c2 req-157f74f7-8237-4d8a-a556-40c5e9b159d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-01e083b9-6c94-4591-acac-3307cbb301c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:51 np0005539504 nova_compute[187152]: 2025-11-29 07:20:51.453 187156 DEBUG nova.network.neutron [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.565 187156 DEBUG nova.network.neutron [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Updating instance_info_cache with network_info: [{"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.590 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Releasing lock "refresh_cache-01e083b9-6c94-4591-acac-3307cbb301c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.590 187156 DEBUG nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Instance network_info: |[{"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.591 187156 DEBUG oslo_concurrency.lockutils [req-e633d46d-7935-43da-8a24-03c6bc5274c2 req-157f74f7-8237-4d8a-a556-40c5e9b159d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-01e083b9-6c94-4591-acac-3307cbb301c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.591 187156 DEBUG nova.network.neutron [req-e633d46d-7935-43da-8a24-03c6bc5274c2 req-157f74f7-8237-4d8a-a556-40c5e9b159d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Refreshing network info cache for port 6a63e05b-f75c-43e1-a545-b83b9813f5f8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.594 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Start _get_guest_xml network_info=[{"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.598 187156 WARNING nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.605 187156 DEBUG nova.virt.libvirt.host [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.606 187156 DEBUG nova.virt.libvirt.host [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.612 187156 DEBUG nova.virt.libvirt.host [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.613 187156 DEBUG nova.virt.libvirt.host [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.614 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.615 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.615 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.615 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.616 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.616 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.616 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.616 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.617 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.617 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.617 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.617 187156 DEBUG nova.virt.hardware [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.621 187156 DEBUG nova.virt.libvirt.vif [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:20:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1892802691',display_name='tempest-ServerAddressesTestJSON-server-1892802691',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1892802691',id=112,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='be19c52d747e41d1b61aa112facb4cfd',ramdisk_id='',reservation_id='r-63gz2fnc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-807109868',owner_user_name='tempest-ServerAddre
ssesTestJSON-807109868-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:20:49Z,user_data=None,user_id='fc775c91abe1474384c43851c4d778ac',uuid=01e083b9-6c94-4591-acac-3307cbb301c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.622 187156 DEBUG nova.network.os_vif_util [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Converting VIF {"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.623 187156 DEBUG nova.network.os_vif_util [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d7:ea,bridge_name='br-int',has_traffic_filtering=True,id=6a63e05b-f75c-43e1-a545-b83b9813f5f8,network=Network(3771f30a-11cb-4a94-8e85-de07aa43fda4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a63e05b-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.624 187156 DEBUG nova.objects.instance [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lazy-loading 'pci_devices' on Instance uuid 01e083b9-6c94-4591-acac-3307cbb301c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.641 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <uuid>01e083b9-6c94-4591-acac-3307cbb301c3</uuid>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <name>instance-00000070</name>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerAddressesTestJSON-server-1892802691</nova:name>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:20:52</nova:creationTime>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:        <nova:user uuid="fc775c91abe1474384c43851c4d778ac">tempest-ServerAddressesTestJSON-807109868-project-member</nova:user>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:        <nova:project uuid="be19c52d747e41d1b61aa112facb4cfd">tempest-ServerAddressesTestJSON-807109868</nova:project>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:        <nova:port uuid="6a63e05b-f75c-43e1-a545-b83b9813f5f8">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <entry name="serial">01e083b9-6c94-4591-acac-3307cbb301c3</entry>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <entry name="uuid">01e083b9-6c94-4591-acac-3307cbb301c3</entry>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk.config"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:f8:d7:ea"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <target dev="tap6a63e05b-f7"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/console.log" append="off"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:20:52 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:20:52 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:20:52 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:20:52 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.642 187156 DEBUG nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Preparing to wait for external event network-vif-plugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.642 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.643 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.643 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.644 187156 DEBUG nova.virt.libvirt.vif [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:20:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1892802691',display_name='tempest-ServerAddressesTestJSON-server-1892802691',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1892802691',id=112,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='be19c52d747e41d1b61aa112facb4cfd',ramdisk_id='',reservation_id='r-63gz2fnc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-807109868',owner_user_name='tempest-S
erverAddressesTestJSON-807109868-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:20:49Z,user_data=None,user_id='fc775c91abe1474384c43851c4d778ac',uuid=01e083b9-6c94-4591-acac-3307cbb301c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.644 187156 DEBUG nova.network.os_vif_util [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Converting VIF {"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.645 187156 DEBUG nova.network.os_vif_util [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d7:ea,bridge_name='br-int',has_traffic_filtering=True,id=6a63e05b-f75c-43e1-a545-b83b9813f5f8,network=Network(3771f30a-11cb-4a94-8e85-de07aa43fda4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a63e05b-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.646 187156 DEBUG os_vif [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d7:ea,bridge_name='br-int',has_traffic_filtering=True,id=6a63e05b-f75c-43e1-a545-b83b9813f5f8,network=Network(3771f30a-11cb-4a94-8e85-de07aa43fda4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a63e05b-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.646 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.647 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.647 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.650 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.651 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a63e05b-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.651 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6a63e05b-f7, col_values=(('external_ids', {'iface-id': '6a63e05b-f75c-43e1-a545-b83b9813f5f8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:d7:ea', 'vm-uuid': '01e083b9-6c94-4591-acac-3307cbb301c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.653 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:52 np0005539504 NetworkManager[55210]: <info>  [1764400852.6546] manager: (tap6a63e05b-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.655 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.663 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.664 187156 INFO os_vif [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d7:ea,bridge_name='br-int',has_traffic_filtering=True,id=6a63e05b-f75c-43e1-a545-b83b9813f5f8,network=Network(3771f30a-11cb-4a94-8e85-de07aa43fda4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a63e05b-f7')
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.725 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.726 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.726 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] No VIF found with MAC fa:16:3e:f8:d7:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 02:20:52 np0005539504 nova_compute[187152]: 2025-11-29 07:20:52.726 187156 INFO nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Using config drive
Nov 29 02:20:53 np0005539504 nova_compute[187152]: 2025-11-29 07:20:53.579 187156 INFO nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Creating config drive at /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk.config
Nov 29 02:20:53 np0005539504 nova_compute[187152]: 2025-11-29 07:20:53.584 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjetbsmi8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:20:53 np0005539504 nova_compute[187152]: 2025-11-29 07:20:53.713 187156 DEBUG oslo_concurrency.processutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjetbsmi8" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:20:53 np0005539504 kernel: tap6a63e05b-f7: entered promiscuous mode
Nov 29 02:20:53 np0005539504 NetworkManager[55210]: <info>  [1764400853.7786] manager: (tap6a63e05b-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/183)
Nov 29 02:20:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:53Z|00399|binding|INFO|Claiming lport 6a63e05b-f75c-43e1-a545-b83b9813f5f8 for this chassis.
Nov 29 02:20:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:53Z|00400|binding|INFO|6a63e05b-f75c-43e1-a545-b83b9813f5f8: Claiming fa:16:3e:f8:d7:ea 10.100.0.3
Nov 29 02:20:53 np0005539504 nova_compute[187152]: 2025-11-29 07:20:53.778 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:53 np0005539504 nova_compute[187152]: 2025-11-29 07:20:53.781 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:53 np0005539504 nova_compute[187152]: 2025-11-29 07:20:53.785 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:53 np0005539504 nova_compute[187152]: 2025-11-29 07:20:53.790 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.801 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:d7:ea 10.100.0.3'], port_security=['fa:16:3e:f8:d7:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '01e083b9-6c94-4591-acac-3307cbb301c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3771f30a-11cb-4a94-8e85-de07aa43fda4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'be19c52d747e41d1b61aa112facb4cfd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d75dc1b-a737-4daf-aac3-38b44b0ec6f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac91e7c9-9f79-4a97-afac-696d8c3d4df6, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6a63e05b-f75c-43e1-a545-b83b9813f5f8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.803 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6a63e05b-f75c-43e1-a545-b83b9813f5f8 in datapath 3771f30a-11cb-4a94-8e85-de07aa43fda4 bound to our chassis
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.805 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3771f30a-11cb-4a94-8e85-de07aa43fda4
Nov 29 02:20:53 np0005539504 systemd-udevd[233729]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.818 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[957b5cf2-62a6-4e74-894e-cd689124a436]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.819 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3771f30a-11 in ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 02:20:53 np0005539504 systemd-machined[153423]: New machine qemu-54-instance-00000070.
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.822 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3771f30a-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.822 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50c73305-61fc-42bd-967e-4d72c2e56b2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 NetworkManager[55210]: <info>  [1764400853.8239] device (tap6a63e05b-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.823 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[48dd47b0-55ad-4198-8233-7b7abf9f5597]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 NetworkManager[55210]: <info>  [1764400853.8257] device (tap6a63e05b-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.836 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[543cb6b0-63ca-444a-b0da-de05156af03f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:53Z|00401|binding|INFO|Setting lport 6a63e05b-f75c-43e1-a545-b83b9813f5f8 ovn-installed in OVS
Nov 29 02:20:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:53Z|00402|binding|INFO|Setting lport 6a63e05b-f75c-43e1-a545-b83b9813f5f8 up in Southbound
Nov 29 02:20:53 np0005539504 nova_compute[187152]: 2025-11-29 07:20:53.846 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.848 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[22297843-ff4d-4b7f-aa5b-4140201b1a3e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 systemd[1]: Started Virtual Machine qemu-54-instance-00000070.
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.878 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5b9784-c7d0-4a9a-bc16-fda60e59a560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 NetworkManager[55210]: <info>  [1764400853.8859] manager: (tap3771f30a-10): new Veth device (/org/freedesktop/NetworkManager/Devices/184)
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.886 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b0121884-329c-4e74-bf2e-c0ec8943d449]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.916 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a80601-c53e-4fe1-ba40-f91c7e4293e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.919 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[41a2f4d1-0ee6-43fa-bad0-9acd813f0ffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 nova_compute[187152]: 2025-11-29 07:20:53.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:20:53 np0005539504 NetworkManager[55210]: <info>  [1764400853.9442] device (tap3771f30a-10): carrier: link connected
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.951 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdfbf3a-ce49-4afd-b938-07301be1f359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.972 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7db380-f00e-4540-bf61-b48e1c2123b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3771f30a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:88:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 631682, 'reachable_time': 31799, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233763, 'error': None, 'target': 'ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:53.987 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[11827e89-381c-43e7-a0c0-50ed16286112]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:881e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 631682, 'tstamp': 631682}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233764, 'error': None, 'target': 'ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.002 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8eac19ee-4ed3-4c98-bec5-d575cf5f5db8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3771f30a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:88:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 631682, 'reachable_time': 31799, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233765, 'error': None, 'target': 'ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.030 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[724aa839-9884-42ed-98e5-e470177de356]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.085 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8276b648-b192-4c0c-bf97-0f362a9ff3a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.086 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3771f30a-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.087 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.087 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3771f30a-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:20:54 np0005539504 nova_compute[187152]: 2025-11-29 07:20:54.131 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:54 np0005539504 kernel: tap3771f30a-10: entered promiscuous mode
Nov 29 02:20:54 np0005539504 NetworkManager[55210]: <info>  [1764400854.1370] manager: (tap3771f30a-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Nov 29 02:20:54 np0005539504 nova_compute[187152]: 2025-11-29 07:20:54.137 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.138 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3771f30a-10, col_values=(('external_ids', {'iface-id': '956b529c-8ead-437e-af1d-285a960c37d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:20:54 np0005539504 nova_compute[187152]: 2025-11-29 07:20:54.139 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:54 np0005539504 nova_compute[187152]: 2025-11-29 07:20:54.141 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.141 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3771f30a-11cb-4a94-8e85-de07aa43fda4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3771f30a-11cb-4a94-8e85-de07aa43fda4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 02:20:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:54Z|00403|binding|INFO|Releasing lport 956b529c-8ead-437e-af1d-285a960c37d4 from this chassis (sb_readonly=0)
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.142 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a568d8-b5c9-4b17-b138-61a0158d4636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.143 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-3771f30a-11cb-4a94-8e85-de07aa43fda4
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/3771f30a-11cb-4a94-8e85-de07aa43fda4.pid.haproxy
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 3771f30a-11cb-4a94-8e85-de07aa43fda4
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:20:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:54.144 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4', 'env', 'PROCESS_TAG=haproxy-3771f30a-11cb-4a94-8e85-de07aa43fda4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3771f30a-11cb-4a94-8e85-de07aa43fda4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:20:54 np0005539504 nova_compute[187152]: 2025-11-29 07:20:54.152 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:54 np0005539504 nova_compute[187152]: 2025-11-29 07:20:54.277 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400854.2768364, 01e083b9-6c94-4591-acac-3307cbb301c3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:54 np0005539504 nova_compute[187152]: 2025-11-29 07:20:54.278 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] VM Started (Lifecycle Event)#033[00m
Nov 29 02:20:54 np0005539504 podman[233802]: 2025-11-29 07:20:54.501357221 +0000 UTC m=+0.023553422 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:20:54 np0005539504 nova_compute[187152]: 2025-11-29 07:20:54.622 187156 DEBUG nova.network.neutron [req-e633d46d-7935-43da-8a24-03c6bc5274c2 req-157f74f7-8237-4d8a-a556-40c5e9b159d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Updated VIF entry in instance network info cache for port 6a63e05b-f75c-43e1-a545-b83b9813f5f8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:20:54 np0005539504 nova_compute[187152]: 2025-11-29 07:20:54.623 187156 DEBUG nova.network.neutron [req-e633d46d-7935-43da-8a24-03c6bc5274c2 req-157f74f7-8237-4d8a-a556-40c5e9b159d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Updating instance_info_cache with network_info: [{"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.509 187156 DEBUG nova.compute.manager [req-7fcbdc1c-da6d-4caf-8641-4c68fe9bff59 req-2c7e402f-b080-4ed8-abee-982b7dd02c46 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Received event network-vif-plugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.510 187156 DEBUG oslo_concurrency.lockutils [req-7fcbdc1c-da6d-4caf-8641-4c68fe9bff59 req-2c7e402f-b080-4ed8-abee-982b7dd02c46 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.511 187156 DEBUG oslo_concurrency.lockutils [req-7fcbdc1c-da6d-4caf-8641-4c68fe9bff59 req-2c7e402f-b080-4ed8-abee-982b7dd02c46 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.512 187156 DEBUG oslo_concurrency.lockutils [req-7fcbdc1c-da6d-4caf-8641-4c68fe9bff59 req-2c7e402f-b080-4ed8-abee-982b7dd02c46 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.512 187156 DEBUG nova.compute.manager [req-7fcbdc1c-da6d-4caf-8641-4c68fe9bff59 req-2c7e402f-b080-4ed8-abee-982b7dd02c46 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Processing event network-vif-plugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.514 187156 DEBUG nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.518 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.520 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.525 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.530 187156 INFO nova.virt.libvirt.driver [-] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Instance spawned successfully.#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.532 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:20:55 np0005539504 podman[233802]: 2025-11-29 07:20:55.731355956 +0000 UTC m=+1.253552177 container create 9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.737 187156 DEBUG oslo_concurrency.lockutils [req-e633d46d-7935-43da-8a24-03c6bc5274c2 req-157f74f7-8237-4d8a-a556-40c5e9b159d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-01e083b9-6c94-4591-acac-3307cbb301c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.747 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.747 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400854.2769961, 01e083b9-6c94-4591-acac-3307cbb301c3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.748 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.756 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.757 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.757 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.758 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.758 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.759 187156 DEBUG nova.virt.libvirt.driver [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.767 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.772 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400855.5187576, 01e083b9-6c94-4591-acac-3307cbb301c3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.772 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:20:55 np0005539504 systemd[1]: Started libpod-conmon-9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964.scope.
Nov 29 02:20:55 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:20:55 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84db81950332e4e1adb1e6b63e4e3780983b18cd8bc383053c4fe79da7e57299/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:20:55 np0005539504 nova_compute[187152]: 2025-11-29 07:20:55.877 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:56 np0005539504 nova_compute[187152]: 2025-11-29 07:20:56.000 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:56 np0005539504 nova_compute[187152]: 2025-11-29 07:20:56.005 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:20:56 np0005539504 podman[233802]: 2025-11-29 07:20:56.033560469 +0000 UTC m=+1.555756670 container init 9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:20:56 np0005539504 podman[233802]: 2025-11-29 07:20:56.04180267 +0000 UTC m=+1.563998871 container start 9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:20:56 np0005539504 neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4[233817]: [NOTICE]   (233821) : New worker (233823) forked
Nov 29 02:20:56 np0005539504 neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4[233817]: [NOTICE]   (233821) : Loading success.
Nov 29 02:20:56 np0005539504 nova_compute[187152]: 2025-11-29 07:20:56.115 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:20:56 np0005539504 nova_compute[187152]: 2025-11-29 07:20:56.449 187156 INFO nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Took 6.87 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:20:56 np0005539504 nova_compute[187152]: 2025-11-29 07:20:56.449 187156 DEBUG nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:57 np0005539504 nova_compute[187152]: 2025-11-29 07:20:57.131 187156 INFO nova.compute.manager [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Took 8.72 seconds to build instance.#033[00m
Nov 29 02:20:57 np0005539504 nova_compute[187152]: 2025-11-29 07:20:57.167 187156 DEBUG oslo_concurrency.lockutils [None req-a6dc5e1e-0f80-4f6e-8a6f-2d2a4c8bc37f fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.858s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:57 np0005539504 nova_compute[187152]: 2025-11-29 07:20:57.643 187156 DEBUG nova.compute.manager [req-9d3415da-6c7a-4dc3-961e-5e5f7c333dec req-a09e7bb5-c705-4fbb-9bca-9246606be303 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Received event network-vif-plugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:57 np0005539504 nova_compute[187152]: 2025-11-29 07:20:57.643 187156 DEBUG oslo_concurrency.lockutils [req-9d3415da-6c7a-4dc3-961e-5e5f7c333dec req-a09e7bb5-c705-4fbb-9bca-9246606be303 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:57 np0005539504 nova_compute[187152]: 2025-11-29 07:20:57.644 187156 DEBUG oslo_concurrency.lockutils [req-9d3415da-6c7a-4dc3-961e-5e5f7c333dec req-a09e7bb5-c705-4fbb-9bca-9246606be303 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:57 np0005539504 nova_compute[187152]: 2025-11-29 07:20:57.644 187156 DEBUG oslo_concurrency.lockutils [req-9d3415da-6c7a-4dc3-961e-5e5f7c333dec req-a09e7bb5-c705-4fbb-9bca-9246606be303 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:57 np0005539504 nova_compute[187152]: 2025-11-29 07:20:57.645 187156 DEBUG nova.compute.manager [req-9d3415da-6c7a-4dc3-961e-5e5f7c333dec req-a09e7bb5-c705-4fbb-9bca-9246606be303 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] No waiting events found dispatching network-vif-plugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:57 np0005539504 nova_compute[187152]: 2025-11-29 07:20:57.645 187156 WARNING nova.compute.manager [req-9d3415da-6c7a-4dc3-961e-5e5f7c333dec req-a09e7bb5-c705-4fbb-9bca-9246606be303 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Received unexpected event network-vif-plugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:20:57 np0005539504 nova_compute[187152]: 2025-11-29 07:20:57.653 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:57 np0005539504 podman[233832]: 2025-11-29 07:20:57.741859064 +0000 UTC m=+0.077839706 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.022 187156 DEBUG oslo_concurrency.lockutils [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "01e083b9-6c94-4591-acac-3307cbb301c3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.023 187156 DEBUG oslo_concurrency.lockutils [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.023 187156 DEBUG oslo_concurrency.lockutils [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.024 187156 DEBUG oslo_concurrency.lockutils [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.024 187156 DEBUG oslo_concurrency.lockutils [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.037 187156 INFO nova.compute.manager [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Terminating instance#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.051 187156 DEBUG nova.compute.manager [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:20:58 np0005539504 kernel: tap6a63e05b-f7 (unregistering): left promiscuous mode
Nov 29 02:20:58 np0005539504 NetworkManager[55210]: <info>  [1764400858.0717] device (tap6a63e05b-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.078 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:58Z|00404|binding|INFO|Releasing lport 6a63e05b-f75c-43e1-a545-b83b9813f5f8 from this chassis (sb_readonly=0)
Nov 29 02:20:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:58Z|00405|binding|INFO|Setting lport 6a63e05b-f75c-43e1-a545-b83b9813f5f8 down in Southbound
Nov 29 02:20:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:20:58Z|00406|binding|INFO|Removing iface tap6a63e05b-f7 ovn-installed in OVS
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.086 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:d7:ea 10.100.0.3'], port_security=['fa:16:3e:f8:d7:ea 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '01e083b9-6c94-4591-acac-3307cbb301c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3771f30a-11cb-4a94-8e85-de07aa43fda4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'be19c52d747e41d1b61aa112facb4cfd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d75dc1b-a737-4daf-aac3-38b44b0ec6f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac91e7c9-9f79-4a97-afac-696d8c3d4df6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6a63e05b-f75c-43e1-a545-b83b9813f5f8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.088 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6a63e05b-f75c-43e1-a545-b83b9813f5f8 in datapath 3771f30a-11cb-4a94-8e85-de07aa43fda4 unbound from our chassis#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.090 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3771f30a-11cb-4a94-8e85-de07aa43fda4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.091 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[60ee7d53-a21e-4b93-9e4c-85814557ff15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.092 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4 namespace which is not needed anymore#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.098 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539504 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000070.scope: Deactivated successfully.
Nov 29 02:20:58 np0005539504 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000070.scope: Consumed 2.979s CPU time.
Nov 29 02:20:58 np0005539504 systemd-machined[153423]: Machine qemu-54-instance-00000070 terminated.
Nov 29 02:20:58 np0005539504 neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4[233817]: [NOTICE]   (233821) : haproxy version is 2.8.14-c23fe91
Nov 29 02:20:58 np0005539504 neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4[233817]: [NOTICE]   (233821) : path to executable is /usr/sbin/haproxy
Nov 29 02:20:58 np0005539504 neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4[233817]: [WARNING]  (233821) : Exiting Master process...
Nov 29 02:20:58 np0005539504 neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4[233817]: [ALERT]    (233821) : Current worker (233823) exited with code 143 (Terminated)
Nov 29 02:20:58 np0005539504 neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4[233817]: [WARNING]  (233821) : All workers exited. Exiting... (0)
Nov 29 02:20:58 np0005539504 systemd[1]: libpod-9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964.scope: Deactivated successfully.
Nov 29 02:20:58 np0005539504 podman[233877]: 2025-11-29 07:20:58.227620954 +0000 UTC m=+0.049855366 container died 9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:20:58 np0005539504 systemd[1]: var-lib-containers-storage-overlay-84db81950332e4e1adb1e6b63e4e3780983b18cd8bc383053c4fe79da7e57299-merged.mount: Deactivated successfully.
Nov 29 02:20:58 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964-userdata-shm.mount: Deactivated successfully.
Nov 29 02:20:58 np0005539504 podman[233877]: 2025-11-29 07:20:58.26927169 +0000 UTC m=+0.091506112 container cleanup 9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:20:58 np0005539504 systemd[1]: libpod-conmon-9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964.scope: Deactivated successfully.
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.308 187156 INFO nova.virt.libvirt.driver [-] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Instance destroyed successfully.#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.310 187156 DEBUG nova.objects.instance [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lazy-loading 'resources' on Instance uuid 01e083b9-6c94-4591-acac-3307cbb301c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.325 187156 DEBUG nova.virt.libvirt.vif [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:20:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1892802691',display_name='tempest-ServerAddressesTestJSON-server-1892802691',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1892802691',id=112,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:20:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='be19c52d747e41d1b61aa112facb4cfd',ramdisk_id='',reservation_id='r-63gz2fnc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-807109868',owner_user_name='tempest-ServerAddressesTestJSON-807109868-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:20:57Z,user_data=None,user_id='fc775c91abe1474384c43851c4d778ac',uuid=01e083b9-6c94-4591-acac-3307cbb301c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.325 187156 DEBUG nova.network.os_vif_util [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Converting VIF {"id": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "address": "fa:16:3e:f8:d7:ea", "network": {"id": "3771f30a-11cb-4a94-8e85-de07aa43fda4", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-507831652-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "be19c52d747e41d1b61aa112facb4cfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a63e05b-f7", "ovs_interfaceid": "6a63e05b-f75c-43e1-a545-b83b9813f5f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.326 187156 DEBUG nova.network.os_vif_util [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d7:ea,bridge_name='br-int',has_traffic_filtering=True,id=6a63e05b-f75c-43e1-a545-b83b9813f5f8,network=Network(3771f30a-11cb-4a94-8e85-de07aa43fda4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a63e05b-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.326 187156 DEBUG os_vif [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d7:ea,bridge_name='br-int',has_traffic_filtering=True,id=6a63e05b-f75c-43e1-a545-b83b9813f5f8,network=Network(3771f30a-11cb-4a94-8e85-de07aa43fda4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a63e05b-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.328 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.329 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a63e05b-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.330 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.331 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.333 187156 INFO os_vif [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d7:ea,bridge_name='br-int',has_traffic_filtering=True,id=6a63e05b-f75c-43e1-a545-b83b9813f5f8,network=Network(3771f30a-11cb-4a94-8e85-de07aa43fda4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a63e05b-f7')#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.334 187156 INFO nova.virt.libvirt.driver [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Deleting instance files /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3_del#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.334 187156 INFO nova.virt.libvirt.driver [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Deletion of /var/lib/nova/instances/01e083b9-6c94-4591-acac-3307cbb301c3_del complete#033[00m
Nov 29 02:20:58 np0005539504 podman[233913]: 2025-11-29 07:20:58.339630574 +0000 UTC m=+0.046195528 container remove 9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.344 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2e88d9-56ab-4f44-a8b7-b468081fc950]: (4, ('Sat Nov 29 07:20:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4 (9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964)\n9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964\nSat Nov 29 07:20:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4 (9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964)\n9a99d9ef066def6f0524c87541048f30650f0d2201f33a347403899cfadfb964\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.346 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[80f0e5ac-67b1-46d8-8add-307b9b51c4e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.347 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3771f30a-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:58 np0005539504 kernel: tap3771f30a-10: left promiscuous mode
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.443 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.447 187156 INFO nova.compute.manager [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.448 187156 DEBUG oslo.service.loopingcall [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.448 187156 DEBUG nova.compute.manager [-] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.449 187156 DEBUG nova.network.neutron [-] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:20:58 np0005539504 nova_compute[187152]: 2025-11-29 07:20:58.457 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.459 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[21176cc8-f287-4bd2-bfef-ae74ae9494d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.473 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[82d4ef94-c6c6-4f5a-bede-cc2be49ea217]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.474 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[988c4481-e9b8-4101-b8b3-4701f4c42416]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.489 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5daaef2c-2867-4d42-a2df-60782d14cdd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 631675, 'reachable_time': 15769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233939, 'error': None, 'target': 'ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539504 systemd[1]: run-netns-ovnmeta\x2d3771f30a\x2d11cb\x2d4a94\x2d8e85\x2dde07aa43fda4.mount: Deactivated successfully.
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.492 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3771f30a-11cb-4a94-8e85-de07aa43fda4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:20:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:20:58.492 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[9091555f-bd4d-403d-8aab-6dc7a4b4c946]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.122 187156 DEBUG nova.network.neutron [-] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.150 187156 INFO nova.compute.manager [-] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Took 0.70 seconds to deallocate network for instance.#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.221 187156 DEBUG nova.compute.manager [req-0202f691-b0c0-47d7-a154-23d131c12457 req-3d90ac1f-ba2b-4b11-bac4-8c2a582cfead 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Received event network-vif-deleted-6a63e05b-f75c-43e1-a545-b83b9813f5f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.253 187156 DEBUG oslo_concurrency.lockutils [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.253 187156 DEBUG oslo_concurrency.lockutils [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.302 187156 DEBUG nova.compute.provider_tree [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.315 187156 DEBUG nova.scheduler.client.report [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.340 187156 DEBUG oslo_concurrency.lockutils [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.368 187156 INFO nova.scheduler.client.report [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Deleted allocations for instance 01e083b9-6c94-4591-acac-3307cbb301c3#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.475 187156 DEBUG oslo_concurrency.lockutils [None req-ccfe9a1f-7ba2-471a-aa25-90fed020c778 fc775c91abe1474384c43851c4d778ac be19c52d747e41d1b61aa112facb4cfd - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.758 187156 DEBUG nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Received event network-vif-unplugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.759 187156 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.759 187156 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.759 187156 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.759 187156 DEBUG nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] No waiting events found dispatching network-vif-unplugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.760 187156 WARNING nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Received unexpected event network-vif-unplugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.760 187156 DEBUG nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Received event network-vif-plugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.760 187156 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.760 187156 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.761 187156 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "01e083b9-6c94-4591-acac-3307cbb301c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.761 187156 DEBUG nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] No waiting events found dispatching network-vif-plugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:59 np0005539504 nova_compute[187152]: 2025-11-29 07:20:59.761 187156 WARNING nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Received unexpected event network-vif-plugged-6a63e05b-f75c-43e1-a545-b83b9813f5f8 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:21:00 np0005539504 nova_compute[187152]: 2025-11-29 07:21:00.880 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:02 np0005539504 nova_compute[187152]: 2025-11-29 07:21:02.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:03 np0005539504 nova_compute[187152]: 2025-11-29 07:21:03.114 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:03 np0005539504 nova_compute[187152]: 2025-11-29 07:21:03.330 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:03 np0005539504 nova_compute[187152]: 2025-11-29 07:21:03.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:05 np0005539504 nova_compute[187152]: 2025-11-29 07:21:05.881 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:07 np0005539504 nova_compute[187152]: 2025-11-29 07:21:07.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:07 np0005539504 nova_compute[187152]: 2025-11-29 07:21:07.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:07 np0005539504 nova_compute[187152]: 2025-11-29 07:21:07.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:07 np0005539504 nova_compute[187152]: 2025-11-29 07:21:07.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:21:08 np0005539504 nova_compute[187152]: 2025-11-29 07:21:08.333 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:10.570 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:21:10 np0005539504 nova_compute[187152]: 2025-11-29 07:21:10.570 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:10.572 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:21:10 np0005539504 podman[233941]: 2025-11-29 07:21:10.749970689 +0000 UTC m=+0.094251735 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:21:10 np0005539504 podman[233942]: 2025-11-29 07:21:10.765896905 +0000 UTC m=+0.094843120 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41)
Nov 29 02:21:10 np0005539504 podman[233948]: 2025-11-29 07:21:10.766657896 +0000 UTC m=+0.090493105 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:21:10 np0005539504 nova_compute[187152]: 2025-11-29 07:21:10.883 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:10 np0005539504 nova_compute[187152]: 2025-11-29 07:21:10.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:10 np0005539504 nova_compute[187152]: 2025-11-29 07:21:10.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:10 np0005539504 nova_compute[187152]: 2025-11-29 07:21:10.969 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:10 np0005539504 nova_compute[187152]: 2025-11-29 07:21:10.970 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:10 np0005539504 nova_compute[187152]: 2025-11-29 07:21:10.970 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:10 np0005539504 nova_compute[187152]: 2025-11-29 07:21:10.971 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.177 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.179 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5694MB free_disk=73.19254684448242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.179 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.179 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.294 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.294 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.321 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.347 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.374 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:21:11 np0005539504 nova_compute[187152]: 2025-11-29 07:21:11.375 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:13 np0005539504 nova_compute[187152]: 2025-11-29 07:21:13.308 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400858.3066332, 01e083b9-6c94-4591-acac-3307cbb301c3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:21:13 np0005539504 nova_compute[187152]: 2025-11-29 07:21:13.308 187156 INFO nova.compute.manager [-] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:21:13 np0005539504 nova_compute[187152]: 2025-11-29 07:21:13.337 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:13 np0005539504 nova_compute[187152]: 2025-11-29 07:21:13.368 187156 DEBUG nova.compute.manager [None req-81d0234d-8f10-4813-b0d5-42b9122bde32 - - - - - -] [instance: 01e083b9-6c94-4591-acac-3307cbb301c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:13 np0005539504 podman[234007]: 2025-11-29 07:21:13.713445923 +0000 UTC m=+0.056254379 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:21:13 np0005539504 podman[234008]: 2025-11-29 07:21:13.743266901 +0000 UTC m=+0.084075663 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:21:14 np0005539504 nova_compute[187152]: 2025-11-29 07:21:14.376 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:14 np0005539504 nova_compute[187152]: 2025-11-29 07:21:14.376 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:21:14 np0005539504 nova_compute[187152]: 2025-11-29 07:21:14.377 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:21:14 np0005539504 nova_compute[187152]: 2025-11-29 07:21:14.486 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:21:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:14.574 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:15 np0005539504 nova_compute[187152]: 2025-11-29 07:21:15.885 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:18 np0005539504 nova_compute[187152]: 2025-11-29 07:21:18.339 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:20 np0005539504 nova_compute[187152]: 2025-11-29 07:21:20.887 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:21 np0005539504 podman[234056]: 2025-11-29 07:21:21.737692782 +0000 UTC m=+0.080729303 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:21:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:22.960 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:22.960 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:22.960 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:23 np0005539504 nova_compute[187152]: 2025-11-29 07:21:23.341 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:25 np0005539504 nova_compute[187152]: 2025-11-29 07:21:25.890 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:28 np0005539504 nova_compute[187152]: 2025-11-29 07:21:28.344 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:28 np0005539504 podman[234077]: 2025-11-29 07:21:28.735639873 +0000 UTC m=+0.074069945 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 02:21:30 np0005539504 nova_compute[187152]: 2025-11-29 07:21:30.890 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:30 np0005539504 nova_compute[187152]: 2025-11-29 07:21:30.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:30 np0005539504 nova_compute[187152]: 2025-11-29 07:21:30.937 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:30 np0005539504 nova_compute[187152]: 2025-11-29 07:21:30.938 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:30 np0005539504 nova_compute[187152]: 2025-11-29 07:21:30.938 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:30 np0005539504 nova_compute[187152]: 2025-11-29 07:21:30.938 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:30 np0005539504 nova_compute[187152]: 2025-11-29 07:21:30.938 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:30 np0005539504 nova_compute[187152]: 2025-11-29 07:21:30.938 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.262 187156 DEBUG nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.263 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.263 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.263 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.264 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.264 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Removable base files: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3 /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.265 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.265 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.266 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.266 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.267 187156 DEBUG nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.267 187156 DEBUG nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Nov 29 02:21:32 np0005539504 nova_compute[187152]: 2025-11-29 07:21:32.268 187156 DEBUG nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Nov 29 02:21:33 np0005539504 nova_compute[187152]: 2025-11-29 07:21:33.348 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:35 np0005539504 nova_compute[187152]: 2025-11-29 07:21:35.892 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:38 np0005539504 nova_compute[187152]: 2025-11-29 07:21:38.351 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:40 np0005539504 ovn_controller[95182]: 2025-11-29T07:21:40Z|00407|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 29 02:21:40 np0005539504 nova_compute[187152]: 2025-11-29 07:21:40.893 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:41 np0005539504 podman[234097]: 2025-11-29 07:21:41.73489293 +0000 UTC m=+0.060291396 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:21:41 np0005539504 podman[234098]: 2025-11-29 07:21:41.735912628 +0000 UTC m=+0.062207938 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6)
Nov 29 02:21:41 np0005539504 podman[234099]: 2025-11-29 07:21:41.739286277 +0000 UTC m=+0.059920086 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:21:43 np0005539504 nova_compute[187152]: 2025-11-29 07:21:43.355 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.201 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.201 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.226 187156 DEBUG nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.378 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.379 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.389 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.390 187156 INFO nova.compute.claims [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.568 187156 DEBUG nova.compute.provider_tree [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.594 187156 DEBUG nova.scheduler.client.report [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.627 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.628 187156 DEBUG nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:21:44 np0005539504 podman[234156]: 2025-11-29 07:21:44.716139408 +0000 UTC m=+0.051305775 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.724 187156 DEBUG nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.725 187156 DEBUG nova.network.neutron [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:21:44 np0005539504 podman[234157]: 2025-11-29 07:21:44.753765666 +0000 UTC m=+0.084180486 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.753 187156 INFO nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.778 187156 DEBUG nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.951 187156 DEBUG nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.952 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.952 187156 INFO nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Creating image(s)#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.953 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.953 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.954 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:44 np0005539504 nova_compute[187152]: 2025-11-29 07:21:44.966 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.020 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.021 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.022 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.037 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.092 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.093 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.124 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.126 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.126 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.183 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.184 187156 DEBUG nova.virt.disk.api [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.184 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.242 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.243 187156 DEBUG nova.virt.disk.api [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.243 187156 DEBUG nova.objects.instance [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.286 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.286 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Ensure instance console log exists: /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.287 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.287 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.288 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.290 187156 DEBUG nova.policy [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:21:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:45.873 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:9b:10 10.100.0.2 2001:db8::f816:3eff:fe36:9b10'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe36:9b10/64', 'neutron:device_id': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=944fc855-be48-4f5c-ba58-0898fe543a04, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6897d2ce-b04d-4d85-9bb6-9da51e7d7f20) old=Port_Binding(mac=['fa:16:3e:36:9b:10 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:21:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:45.875 104164 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6897d2ce-b04d-4d85-9bb6-9da51e7d7f20 in datapath f75dc671-4e0c-40f1-8afd-c16b5e416d95 updated#033[00m
Nov 29 02:21:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:45.876 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f75dc671-4e0c-40f1-8afd-c16b5e416d95, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:21:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:45.877 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[590de107-91d0-4b0f-8136-ff9dbc600192]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:45 np0005539504 nova_compute[187152]: 2025-11-29 07:21:45.896 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:46 np0005539504 nova_compute[187152]: 2025-11-29 07:21:46.840 187156 DEBUG nova.network.neutron [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Successfully created port: 797bdee3-d774-413a-bebc-e4e84a4055d9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:21:48 np0005539504 nova_compute[187152]: 2025-11-29 07:21:48.357 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:48 np0005539504 nova_compute[187152]: 2025-11-29 07:21:48.634 187156 DEBUG nova.network.neutron [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Successfully updated port: 797bdee3-d774-413a-bebc-e4e84a4055d9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:21:48 np0005539504 nova_compute[187152]: 2025-11-29 07:21:48.654 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:21:48 np0005539504 nova_compute[187152]: 2025-11-29 07:21:48.654 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:21:48 np0005539504 nova_compute[187152]: 2025-11-29 07:21:48.655 187156 DEBUG nova.network.neutron [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:21:49 np0005539504 nova_compute[187152]: 2025-11-29 07:21:49.060 187156 DEBUG nova.compute.manager [req-8eea46fa-3809-434a-818f-bcf96624e313 req-2e91202c-b618-4121-a12e-d7b411e42237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-changed-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:49 np0005539504 nova_compute[187152]: 2025-11-29 07:21:49.061 187156 DEBUG nova.compute.manager [req-8eea46fa-3809-434a-818f-bcf96624e313 req-2e91202c-b618-4121-a12e-d7b411e42237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Refreshing instance network info cache due to event network-changed-797bdee3-d774-413a-bebc-e4e84a4055d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:21:49 np0005539504 nova_compute[187152]: 2025-11-29 07:21:49.061 187156 DEBUG oslo_concurrency.lockutils [req-8eea46fa-3809-434a-818f-bcf96624e313 req-2e91202c-b618-4121-a12e-d7b411e42237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:21:49 np0005539504 nova_compute[187152]: 2025-11-29 07:21:49.221 187156 DEBUG nova.network.neutron [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.744 187156 DEBUG nova.network.neutron [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updating instance_info_cache with network_info: [{"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.849 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.849 187156 DEBUG nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance network_info: |[{"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.850 187156 DEBUG oslo_concurrency.lockutils [req-8eea46fa-3809-434a-818f-bcf96624e313 req-2e91202c-b618-4121-a12e-d7b411e42237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.851 187156 DEBUG nova.network.neutron [req-8eea46fa-3809-434a-818f-bcf96624e313 req-2e91202c-b618-4121-a12e-d7b411e42237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Refreshing network info cache for port 797bdee3-d774-413a-bebc-e4e84a4055d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.855 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Start _get_guest_xml network_info=[{"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.861 187156 WARNING nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.868 187156 DEBUG nova.virt.libvirt.host [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.869 187156 DEBUG nova.virt.libvirt.host [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.882 187156 DEBUG nova.virt.libvirt.host [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.884 187156 DEBUG nova.virt.libvirt.host [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.885 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.886 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.886 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.886 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.887 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.887 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.887 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.887 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.887 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.888 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.888 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.888 187156 DEBUG nova.virt.hardware [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.892 187156 DEBUG nova.virt.libvirt.vif [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-996707846',display_name='tempest-TestNetworkAdvancedServerOps-server-996707846',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-996707846',id=114,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBErYDjaq5wSl2W+Prsdf52pa+BbyzQ8k9zEpFvcBMUmZof2lf5CEpW5zCB+o+jSzA6HcpcCjmGi63w6xdaVr0+PFvUpJaeSWrl18PWCMhc6ZJLP06Fdr+z+oANLaw/F/uQ==',key_name='tempest-TestNetworkAdvancedServerOps-1798919802',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-uj3m4a6l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:21:44Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=88218c9c-e4a5-41da-887b-0a5b34b34417,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.893 187156 DEBUG nova.network.os_vif_util [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.893 187156 DEBUG nova.network.os_vif_util [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.894 187156 DEBUG nova.objects.instance [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.899 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.941 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <uuid>88218c9c-e4a5-41da-887b-0a5b34b34417</uuid>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <name>instance-00000072</name>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-996707846</nova:name>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:21:50</nova:creationTime>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:        <nova:port uuid="797bdee3-d774-413a-bebc-e4e84a4055d9">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <entry name="serial">88218c9c-e4a5-41da-887b-0a5b34b34417</entry>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <entry name="uuid">88218c9c-e4a5-41da-887b-0a5b34b34417</entry>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.config"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:30:00:4c"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <target dev="tap797bdee3-d7"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/console.log" append="off"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:21:50 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:21:50 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:21:50 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:21:50 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.943 187156 DEBUG nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Preparing to wait for external event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.944 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.944 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.944 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.946 187156 DEBUG nova.virt.libvirt.vif [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-996707846',display_name='tempest-TestNetworkAdvancedServerOps-server-996707846',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-996707846',id=114,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBErYDjaq5wSl2W+Prsdf52pa+BbyzQ8k9zEpFvcBMUmZof2lf5CEpW5zCB+o+jSzA6HcpcCjmGi63w6xdaVr0+PFvUpJaeSWrl18PWCMhc6ZJLP06Fdr+z+oANLaw/F/uQ==',key_name='tempest-TestNetworkAdvancedServerOps-1798919802',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-uj3m4a6l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:21:44Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=88218c9c-e4a5-41da-887b-0a5b34b34417,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.946 187156 DEBUG nova.network.os_vif_util [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.947 187156 DEBUG nova.network.os_vif_util [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.949 187156 DEBUG os_vif [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.950 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.950 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.951 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.953 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.954 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap797bdee3-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.954 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap797bdee3-d7, col_values=(('external_ids', {'iface-id': '797bdee3-d774-413a-bebc-e4e84a4055d9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:00:4c', 'vm-uuid': '88218c9c-e4a5-41da-887b-0a5b34b34417'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.955 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:50 np0005539504 NetworkManager[55210]: <info>  [1764400910.9570] manager: (tap797bdee3-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.958 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.963 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:50 np0005539504 nova_compute[187152]: 2025-11-29 07:21:50.964 187156 INFO os_vif [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7')#033[00m
Nov 29 02:21:51 np0005539504 nova_compute[187152]: 2025-11-29 07:21:51.022 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:21:51 np0005539504 nova_compute[187152]: 2025-11-29 07:21:51.022 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:21:51 np0005539504 nova_compute[187152]: 2025-11-29 07:21:51.023 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:30:00:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:21:51 np0005539504 nova_compute[187152]: 2025-11-29 07:21:51.023 187156 INFO nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Using config drive#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.186 187156 INFO nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Creating config drive at /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.config#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.191 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps4xmgyz3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.324 187156 DEBUG oslo_concurrency.processutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps4xmgyz3" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:21:52 np0005539504 kernel: tap797bdee3-d7: entered promiscuous mode
Nov 29 02:21:52 np0005539504 NetworkManager[55210]: <info>  [1764400912.4060] manager: (tap797bdee3-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/187)
Nov 29 02:21:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:21:52Z|00408|binding|INFO|Claiming lport 797bdee3-d774-413a-bebc-e4e84a4055d9 for this chassis.
Nov 29 02:21:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:21:52Z|00409|binding|INFO|797bdee3-d774-413a-bebc-e4e84a4055d9: Claiming fa:16:3e:30:00:4c 10.100.0.5
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.411 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.425 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:00:4c 10.100.0.5'], port_security=['fa:16:3e:30:00:4c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '25cb26de-7b16-455f-92fc-990d6e904a22', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b23a8aea-d144-4706-9c3e-bfbf05a7ea08, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=797bdee3-d774-413a-bebc-e4e84a4055d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.426 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 797bdee3-d774-413a-bebc-e4e84a4055d9 in datapath 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 bound to our chassis#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.427 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.440 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[07622fb1-65b1-4297-9b8a-ec72e859bc9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.441 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8d6d63fd-d1 in ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:21:52 np0005539504 systemd-udevd[234254]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.443 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8d6d63fd-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.443 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a1a7dd-cd8e-4c51-a070-99d3caa669c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.444 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9e8f2f-b059-4f1b-8e74-cf6d958fa8c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 systemd-machined[153423]: New machine qemu-55-instance-00000072.
Nov 29 02:21:52 np0005539504 NetworkManager[55210]: <info>  [1764400912.4593] device (tap797bdee3-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:21:52 np0005539504 NetworkManager[55210]: <info>  [1764400912.4602] device (tap797bdee3-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.466 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd0b00f-b862-4d4a-8390-3b9ca19d459c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 systemd[1]: Started Virtual Machine qemu-55-instance-00000072.
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.470 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:21:52Z|00410|binding|INFO|Setting lport 797bdee3-d774-413a-bebc-e4e84a4055d9 ovn-installed in OVS
Nov 29 02:21:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:21:52Z|00411|binding|INFO|Setting lport 797bdee3-d774-413a-bebc-e4e84a4055d9 up in Southbound
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.475 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:52 np0005539504 podman[234232]: 2025-11-29 07:21:52.486337102 +0000 UTC m=+0.087604677 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.491 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c56d14da-467e-4c76-8981-64f95a27d7fb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.518 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2fa8c7-a3df-44e4-9d4c-7a1092c881db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.522 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[93b6b2ab-ee83-4fd6-a7ff-59605c839ae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 NetworkManager[55210]: <info>  [1764400912.5240] manager: (tap8d6d63fd-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/188)
Nov 29 02:21:52 np0005539504 systemd-udevd[234261]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.557 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[42ba6c30-a5ff-4684-8fb8-497045e3106a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.559 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[b6982fe7-b6ec-4c0b-bd16-fb3db7c20978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 NetworkManager[55210]: <info>  [1764400912.5820] device (tap8d6d63fd-d0): carrier: link connected
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.585 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[fe68b650-9f0c-4cc6-a39a-18ddb0bc7061]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.604 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7889eb-24fd-4aaf-b663-dfd9c785e647]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d6d63fd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:c8:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637546, 'reachable_time': 19032, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234292, 'error': None, 'target': 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.621 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d187f0b0-bd84-4132-a677-a42afcb66d41]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef2:c84d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637546, 'tstamp': 637546}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234293, 'error': None, 'target': 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.643 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[39230ff3-c9dc-484f-a896-777d6b580ca6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d6d63fd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:c8:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637546, 'reachable_time': 19032, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234296, 'error': None, 'target': 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.683 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4faf56c4-54ab-41cb-9aad-4631a72a4933]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.716 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400912.7154279, 88218c9c-e4a5-41da-887b-0a5b34b34417 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.716 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] VM Started (Lifecycle Event)#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.749 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[91711e22-4f3e-4aaf-8fbd-d56e3d809cb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.751 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d6d63fd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.751 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.751 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d6d63fd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:52 np0005539504 NetworkManager[55210]: <info>  [1764400912.7540] manager: (tap8d6d63fd-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Nov 29 02:21:52 np0005539504 kernel: tap8d6d63fd-d0: entered promiscuous mode
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.755 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.757 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d6d63fd-d0, col_values=(('external_ids', {'iface-id': 'b0033b8e-2fd2-421b-afcc-3340d6ac4b36'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.758 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.759 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:21:52Z|00412|binding|INFO|Releasing lport b0033b8e-2fd2-421b-afcc-3340d6ac4b36 from this chassis (sb_readonly=0)
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.760 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.761 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ea865418-54ed-4e6b-8c43-db83c4f55474]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.762 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.pid.haproxy
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:21:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:52.763 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'env', 'PROCESS_TAG=haproxy-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.764 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400912.7156057, 88218c9c-e4a5-41da-887b-0a5b34b34417 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.764 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.771 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.831 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.835 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:21:52 np0005539504 nova_compute[187152]: 2025-11-29 07:21:52.863 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:21:53 np0005539504 podman[234333]: 2025-11-29 07:21:53.159957644 +0000 UTC m=+0.069001879 container create 9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:21:53 np0005539504 systemd[1]: Started libpod-conmon-9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978.scope.
Nov 29 02:21:53 np0005539504 podman[234333]: 2025-11-29 07:21:53.129736724 +0000 UTC m=+0.038780989 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:21:53 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:21:53 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ae8e80682ccd2b830d0fbdff43ba3d2f68d678b74b88e655c2b42cca222face/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:21:53 np0005539504 podman[234333]: 2025-11-29 07:21:53.251001663 +0000 UTC m=+0.160045928 container init 9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:21:53 np0005539504 podman[234333]: 2025-11-29 07:21:53.261899065 +0000 UTC m=+0.170943330 container start 9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.306 187156 DEBUG nova.compute.manager [req-82da9525-a02f-4e05-9f26-d7b8603287e5 req-63d8bd54-d6e8-4741-84dc-a1ce5f2488b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.308 187156 DEBUG oslo_concurrency.lockutils [req-82da9525-a02f-4e05-9f26-d7b8603287e5 req-63d8bd54-d6e8-4741-84dc-a1ce5f2488b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.309 187156 DEBUG oslo_concurrency.lockutils [req-82da9525-a02f-4e05-9f26-d7b8603287e5 req-63d8bd54-d6e8-4741-84dc-a1ce5f2488b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.310 187156 DEBUG oslo_concurrency.lockutils [req-82da9525-a02f-4e05-9f26-d7b8603287e5 req-63d8bd54-d6e8-4741-84dc-a1ce5f2488b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.310 187156 DEBUG nova.compute.manager [req-82da9525-a02f-4e05-9f26-d7b8603287e5 req-63d8bd54-d6e8-4741-84dc-a1ce5f2488b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Processing event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.312 187156 DEBUG nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.318 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400913.31781, 88218c9c-e4a5-41da-887b-0a5b34b34417 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:21:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:21:53.318 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.318 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:21:53 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234348]: [NOTICE]   (234352) : New worker (234354) forked
Nov 29 02:21:53 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234348]: [NOTICE]   (234352) : Loading success.
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.321 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.325 187156 INFO nova.virt.libvirt.driver [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance spawned successfully.#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.326 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.362 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.368 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.373 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.374 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.374 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.375 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.375 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.376 187156 DEBUG nova.virt.libvirt.driver [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.410 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.478 187156 INFO nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Took 8.53 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.479 187156 DEBUG nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.632 187156 INFO nova.compute.manager [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Took 9.31 seconds to build instance.#033[00m
Nov 29 02:21:53 np0005539504 nova_compute[187152]: 2025-11-29 07:21:53.663 187156 DEBUG oslo_concurrency.lockutils [None req-6cc4c9c0-8daa-4c2a-ad3f-7ad70d334c17 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.536 187156 DEBUG nova.compute.manager [req-d10dd927-99b2-463e-a984-4693a6a3548c req-79b180f1-cae3-4098-ae92-b95da4e045fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.536 187156 DEBUG oslo_concurrency.lockutils [req-d10dd927-99b2-463e-a984-4693a6a3548c req-79b180f1-cae3-4098-ae92-b95da4e045fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.537 187156 DEBUG oslo_concurrency.lockutils [req-d10dd927-99b2-463e-a984-4693a6a3548c req-79b180f1-cae3-4098-ae92-b95da4e045fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.537 187156 DEBUG oslo_concurrency.lockutils [req-d10dd927-99b2-463e-a984-4693a6a3548c req-79b180f1-cae3-4098-ae92-b95da4e045fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.537 187156 DEBUG nova.compute.manager [req-d10dd927-99b2-463e-a984-4693a6a3548c req-79b180f1-cae3-4098-ae92-b95da4e045fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] No waiting events found dispatching network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.537 187156 WARNING nova.compute.manager [req-d10dd927-99b2-463e-a984-4693a6a3548c req-79b180f1-cae3-4098-ae92-b95da4e045fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received unexpected event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.784 187156 DEBUG nova.network.neutron [req-8eea46fa-3809-434a-818f-bcf96624e313 req-2e91202c-b618-4121-a12e-d7b411e42237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updated VIF entry in instance network info cache for port 797bdee3-d774-413a-bebc-e4e84a4055d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.785 187156 DEBUG nova.network.neutron [req-8eea46fa-3809-434a-818f-bcf96624e313 req-2e91202c-b618-4121-a12e-d7b411e42237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updating instance_info_cache with network_info: [{"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.806 187156 DEBUG oslo_concurrency.lockutils [req-8eea46fa-3809-434a-818f-bcf96624e313 req-2e91202c-b618-4121-a12e-d7b411e42237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.901 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:55 np0005539504 nova_compute[187152]: 2025-11-29 07:21:55.956 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:57 np0005539504 nova_compute[187152]: 2025-11-29 07:21:57.269 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:59 np0005539504 NetworkManager[55210]: <info>  [1764400919.3423] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Nov 29 02:21:59 np0005539504 NetworkManager[55210]: <info>  [1764400919.3437] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Nov 29 02:21:59 np0005539504 nova_compute[187152]: 2025-11-29 07:21:59.342 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:59 np0005539504 nova_compute[187152]: 2025-11-29 07:21:59.518 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:59 np0005539504 ovn_controller[95182]: 2025-11-29T07:21:59Z|00413|binding|INFO|Releasing lport b0033b8e-2fd2-421b-afcc-3340d6ac4b36 from this chassis (sb_readonly=0)
Nov 29 02:21:59 np0005539504 nova_compute[187152]: 2025-11-29 07:21:59.537 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:59 np0005539504 podman[234364]: 2025-11-29 07:21:59.725899535 +0000 UTC m=+0.064475388 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:22:00 np0005539504 nova_compute[187152]: 2025-11-29 07:22:00.903 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:00 np0005539504 nova_compute[187152]: 2025-11-29 07:22:00.959 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:02 np0005539504 nova_compute[187152]: 2025-11-29 07:22:02.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:03 np0005539504 nova_compute[187152]: 2025-11-29 07:22:03.449 187156 DEBUG nova.compute.manager [req-74957bfa-0d90-4ef1-ba54-0d9b48d67477 req-17e9bfa4-4a4b-4e64-9f1a-33d0864037a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-changed-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:03 np0005539504 nova_compute[187152]: 2025-11-29 07:22:03.450 187156 DEBUG nova.compute.manager [req-74957bfa-0d90-4ef1-ba54-0d9b48d67477 req-17e9bfa4-4a4b-4e64-9f1a-33d0864037a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Refreshing instance network info cache due to event network-changed-797bdee3-d774-413a-bebc-e4e84a4055d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:22:03 np0005539504 nova_compute[187152]: 2025-11-29 07:22:03.450 187156 DEBUG oslo_concurrency.lockutils [req-74957bfa-0d90-4ef1-ba54-0d9b48d67477 req-17e9bfa4-4a4b-4e64-9f1a-33d0864037a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:22:03 np0005539504 nova_compute[187152]: 2025-11-29 07:22:03.451 187156 DEBUG oslo_concurrency.lockutils [req-74957bfa-0d90-4ef1-ba54-0d9b48d67477 req-17e9bfa4-4a4b-4e64-9f1a-33d0864037a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:22:03 np0005539504 nova_compute[187152]: 2025-11-29 07:22:03.451 187156 DEBUG nova.network.neutron [req-74957bfa-0d90-4ef1-ba54-0d9b48d67477 req-17e9bfa4-4a4b-4e64-9f1a-33d0864037a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Refreshing network info cache for port 797bdee3-d774-413a-bebc-e4e84a4055d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:22:05 np0005539504 nova_compute[187152]: 2025-11-29 07:22:05.905 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:05 np0005539504 nova_compute[187152]: 2025-11-29 07:22:05.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:05 np0005539504 nova_compute[187152]: 2025-11-29 07:22:05.960 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:06 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:06Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:00:4c 10.100.0.5
Nov 29 02:22:06 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:06Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:00:4c 10.100.0.5
Nov 29 02:22:07 np0005539504 nova_compute[187152]: 2025-11-29 07:22:07.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:07 np0005539504 nova_compute[187152]: 2025-11-29 07:22:07.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:22:08 np0005539504 nova_compute[187152]: 2025-11-29 07:22:08.234 187156 DEBUG nova.network.neutron [req-74957bfa-0d90-4ef1-ba54-0d9b48d67477 req-17e9bfa4-4a4b-4e64-9f1a-33d0864037a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updated VIF entry in instance network info cache for port 797bdee3-d774-413a-bebc-e4e84a4055d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:22:08 np0005539504 nova_compute[187152]: 2025-11-29 07:22:08.234 187156 DEBUG nova.network.neutron [req-74957bfa-0d90-4ef1-ba54-0d9b48d67477 req-17e9bfa4-4a4b-4e64-9f1a-33d0864037a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updating instance_info_cache with network_info: [{"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:22:08 np0005539504 nova_compute[187152]: 2025-11-29 07:22:08.260 187156 DEBUG oslo_concurrency.lockutils [req-74957bfa-0d90-4ef1-ba54-0d9b48d67477 req-17e9bfa4-4a4b-4e64-9f1a-33d0864037a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:22:08 np0005539504 nova_compute[187152]: 2025-11-29 07:22:08.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:09 np0005539504 nova_compute[187152]: 2025-11-29 07:22:09.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:10 np0005539504 nova_compute[187152]: 2025-11-29 07:22:10.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:10 np0005539504 nova_compute[187152]: 2025-11-29 07:22:10.956 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:10 np0005539504 nova_compute[187152]: 2025-11-29 07:22:10.962 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.499 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.500 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.501 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.501 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.503 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.579 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:12 np0005539504 podman[234399]: 2025-11-29 07:22:12.594206384 +0000 UTC m=+0.061689823 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:22:12 np0005539504 podman[234401]: 2025-11-29 07:22:12.598195652 +0000 UTC m=+0.059452004 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.603 187156 INFO nova.compute.manager [None req-b215edaf-b32c-473a-a70d-00b54e9a98de bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Get console output#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.610 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:22:12 np0005539504 podman[234400]: 2025-11-29 07:22:12.619349328 +0000 UTC m=+0.086611051 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.650 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.651 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.705 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.859 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.861 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5542MB free_disk=73.16384887695312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.861 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:12 np0005539504 nova_compute[187152]: 2025-11-29 07:22:12.862 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:13 np0005539504 nova_compute[187152]: 2025-11-29 07:22:13.091 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 88218c9c-e4a5-41da-887b-0a5b34b34417 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:22:13 np0005539504 nova_compute[187152]: 2025-11-29 07:22:13.092 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:22:13 np0005539504 nova_compute[187152]: 2025-11-29 07:22:13.092 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:22:13 np0005539504 nova_compute[187152]: 2025-11-29 07:22:13.331 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:22:13 np0005539504 nova_compute[187152]: 2025-11-29 07:22:13.369 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:22:13 np0005539504 nova_compute[187152]: 2025-11-29 07:22:13.424 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:22:13 np0005539504 nova_compute[187152]: 2025-11-29 07:22:13.425 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:14 np0005539504 nova_compute[187152]: 2025-11-29 07:22:14.232 187156 INFO nova.compute.manager [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Rebuilding instance#033[00m
Nov 29 02:22:14 np0005539504 nova_compute[187152]: 2025-11-29 07:22:14.646 187156 DEBUG nova.compute.manager [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:22:14 np0005539504 nova_compute[187152]: 2025-11-29 07:22:14.721 187156 DEBUG nova.objects.instance [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_requests' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:14 np0005539504 nova_compute[187152]: 2025-11-29 07:22:14.740 187156 DEBUG nova.objects.instance [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:14 np0005539504 nova_compute[187152]: 2025-11-29 07:22:14.794 187156 DEBUG nova.objects.instance [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:14 np0005539504 nova_compute[187152]: 2025-11-29 07:22:14.818 187156 DEBUG nova.objects.instance [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:14 np0005539504 nova_compute[187152]: 2025-11-29 07:22:14.841 187156 DEBUG nova.objects.instance [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:22:14 np0005539504 nova_compute[187152]: 2025-11-29 07:22:14.845 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:22:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:15.285 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:22:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:15.286 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.353 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.425 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.624 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.624 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.624 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.638 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.638 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.639 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.639 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:15 np0005539504 podman[234470]: 2025-11-29 07:22:15.746122834 +0000 UTC m=+0.080113576 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:22:15 np0005539504 podman[234469]: 2025-11-29 07:22:15.751502958 +0000 UTC m=+0.086499088 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.959 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:15 np0005539504 nova_compute[187152]: 2025-11-29 07:22:15.963 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.289 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:17 np0005539504 kernel: tap797bdee3-d7 (unregistering): left promiscuous mode
Nov 29 02:22:17 np0005539504 NetworkManager[55210]: <info>  [1764400937.4081] device (tap797bdee3-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.415 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:17Z|00414|binding|INFO|Releasing lport 797bdee3-d774-413a-bebc-e4e84a4055d9 from this chassis (sb_readonly=0)
Nov 29 02:22:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:17Z|00415|binding|INFO|Setting lport 797bdee3-d774-413a-bebc-e4e84a4055d9 down in Southbound
Nov 29 02:22:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:17Z|00416|binding|INFO|Removing iface tap797bdee3-d7 ovn-installed in OVS
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.417 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.429 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:00:4c 10.100.0.5'], port_security=['fa:16:3e:30:00:4c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '4', 'neutron:security_group_ids': '25cb26de-7b16-455f-92fc-990d6e904a22', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b23a8aea-d144-4706-9c3e-bfbf05a7ea08, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=797bdee3-d774-413a-bebc-e4e84a4055d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.432 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 797bdee3-d774-413a-bebc-e4e84a4055d9 in datapath 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 unbound from our chassis#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.434 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.435 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.436 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae6844b-0c57-463a-a98b-a63e106117c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.437 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 namespace which is not needed anymore#033[00m
Nov 29 02:22:17 np0005539504 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000072.scope: Deactivated successfully.
Nov 29 02:22:17 np0005539504 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000072.scope: Consumed 13.064s CPU time.
Nov 29 02:22:17 np0005539504 systemd-machined[153423]: Machine qemu-55-instance-00000072 terminated.
Nov 29 02:22:17 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234348]: [NOTICE]   (234352) : haproxy version is 2.8.14-c23fe91
Nov 29 02:22:17 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234348]: [NOTICE]   (234352) : path to executable is /usr/sbin/haproxy
Nov 29 02:22:17 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234348]: [WARNING]  (234352) : Exiting Master process...
Nov 29 02:22:17 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234348]: [ALERT]    (234352) : Current worker (234354) exited with code 143 (Terminated)
Nov 29 02:22:17 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234348]: [WARNING]  (234352) : All workers exited. Exiting... (0)
Nov 29 02:22:17 np0005539504 systemd[1]: libpod-9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978.scope: Deactivated successfully.
Nov 29 02:22:17 np0005539504 podman[234541]: 2025-11-29 07:22:17.581756769 +0000 UTC m=+0.057300766 container died 9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:22:17 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978-userdata-shm.mount: Deactivated successfully.
Nov 29 02:22:17 np0005539504 systemd[1]: var-lib-containers-storage-overlay-3ae8e80682ccd2b830d0fbdff43ba3d2f68d678b74b88e655c2b42cca222face-merged.mount: Deactivated successfully.
Nov 29 02:22:17 np0005539504 podman[234541]: 2025-11-29 07:22:17.619408678 +0000 UTC m=+0.094952675 container cleanup 9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 02:22:17 np0005539504 systemd[1]: libpod-conmon-9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978.scope: Deactivated successfully.
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.652 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.657 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 podman[234571]: 2025-11-29 07:22:17.697779987 +0000 UTC m=+0.054701016 container remove 9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.703 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0635cae6-207f-440c-be05-4a01d671d643]: (4, ('Sat Nov 29 07:22:17 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 (9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978)\n9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978\nSat Nov 29 07:22:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 (9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978)\n9a32a96d71823b186f01a5ed75d22cb63f7ae44eeb93e128f712e4aaf3c75978\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.705 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9843535b-8536-48cd-aa51-01ea8d7bf93c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.706 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d6d63fd-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.708 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 kernel: tap8d6d63fd-d0: left promiscuous mode
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.723 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updating instance_info_cache with network_info: [{"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.724 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.729 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d3fc91d0-4a1b-4dc5-b8a6-a2dce10e1e72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.752 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a4312143-087a-46a6-a3be-c973c7630d10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.753 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[633cdd34-c69c-4eda-994e-9b0f7b55c1a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.762 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.763 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.763 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.770 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[97858052-a111-4526-b97b-fd5db88f9348]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637539, 'reachable_time': 41742, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234603, 'error': None, 'target': 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:17 np0005539504 systemd[1]: run-netns-ovnmeta\x2d8d6d63fd\x2ddd9c\x2d49a1\x2dae26\x2d5b06d08155e2.mount: Deactivated successfully.
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.775 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:22:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:17.775 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[dd251991-bc3e-4170-bd8e-309fe889a319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.861 187156 INFO nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.867 187156 INFO nova.virt.libvirt.driver [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance destroyed successfully.#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.871 187156 INFO nova.virt.libvirt.driver [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance destroyed successfully.#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.872 187156 DEBUG nova.virt.libvirt.vif [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-996707846',display_name='tempest-TestNetworkAdvancedServerOps-server-996707846',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-996707846',id=114,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBErYDjaq5wSl2W+Prsdf52pa+BbyzQ8k9zEpFvcBMUmZof2lf5CEpW5zCB+o+jSzA6HcpcCjmGi63w6xdaVr0+PFvUpJaeSWrl18PWCMhc6ZJLP06Fdr+z+oANLaw/F/uQ==',key_name='tempest-TestNetworkAdvancedServerOps-1798919802',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:21:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-uj3m4a6l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:22:13Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=88218c9c-e4a5-41da-887b-0a5b34b34417,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 
4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.872 187156 DEBUG nova.network.os_vif_util [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.873 187156 DEBUG nova.network.os_vif_util [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.873 187156 DEBUG os_vif [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.875 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.876 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap797bdee3-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.878 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.880 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.884 187156 INFO os_vif [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7')#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.884 187156 INFO nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Deleting instance files /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417_del#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.885 187156 INFO nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Deletion of /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417_del complete#033[00m
Nov 29 02:22:17 np0005539504 nova_compute[187152]: 2025-11-29 07:22:17.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.192 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.193 187156 INFO nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Creating image(s)#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.194 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.194 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.195 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.211 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.272 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.273 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.274 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.292 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.356 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.357 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.394 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.395 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.396 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.453 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.454 187156 DEBUG nova.virt.disk.api [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.455 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.520 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.521 187156 DEBUG nova.virt.disk.api [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.522 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.522 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Ensure instance console log exists: /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.523 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.523 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.523 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.526 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Start _get_guest_xml network_info=[{"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.530 187156 WARNING nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.535 187156 DEBUG nova.virt.libvirt.host [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.536 187156 DEBUG nova.virt.libvirt.host [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.539 187156 DEBUG nova.virt.libvirt.host [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.540 187156 DEBUG nova.virt.libvirt.host [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.541 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.542 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.542 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.542 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.543 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.543 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.543 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.543 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.543 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.544 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.544 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.544 187156 DEBUG nova.virt.hardware [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.544 187156 DEBUG nova.objects.instance [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.568 187156 DEBUG nova.virt.libvirt.vif [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-996707846',display_name='tempest-TestNetworkAdvancedServerOps-server-996707846',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-996707846',id=114,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBErYDjaq5wSl2W+Prsdf52pa+BbyzQ8k9zEpFvcBMUmZof2lf5CEpW5zCB+o+jSzA6HcpcCjmGi63w6xdaVr0+PFvUpJaeSWrl18PWCMhc6ZJLP06Fdr+z+oANLaw/F/uQ==',key_name='tempest-TestNetworkAdvancedServerOps-1798919802',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:21:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-uj3m4a6l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:22:18Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=88218c9c-e4a5-41da-887b-0a5b34b34417,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.569 187156 DEBUG nova.network.os_vif_util [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.570 187156 DEBUG nova.network.os_vif_util [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.572 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <uuid>88218c9c-e4a5-41da-887b-0a5b34b34417</uuid>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <name>instance-00000072</name>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-996707846</nova:name>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:22:18</nova:creationTime>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:        <nova:port uuid="797bdee3-d774-413a-bebc-e4e84a4055d9">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <entry name="serial">88218c9c-e4a5-41da-887b-0a5b34b34417</entry>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <entry name="uuid">88218c9c-e4a5-41da-887b-0a5b34b34417</entry>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:22:18 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.config"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:30:00:4c"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <target dev="tap797bdee3-d7"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/console.log" append="off"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:22:18 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:22:18 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:22:18 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:22:18 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.572 187156 DEBUG nova.virt.libvirt.vif [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-996707846',display_name='tempest-TestNetworkAdvancedServerOps-server-996707846',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-996707846',id=114,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBErYDjaq5wSl2W+Prsdf52pa+BbyzQ8k9zEpFvcBMUmZof2lf5CEpW5zCB+o+jSzA6HcpcCjmGi63w6xdaVr0+PFvUpJaeSWrl18PWCMhc6ZJLP06Fdr+z+oANLaw/F/uQ==',key_name='tempest-TestNetworkAdvancedServerOps-1798919802',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:21:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-uj3m4a6l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:22:18Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=88218c9c-e4a5-41da-887b-0a5b34b34417,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.573 187156 DEBUG nova.network.os_vif_util [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.573 187156 DEBUG nova.network.os_vif_util [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.573 187156 DEBUG os_vif [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.574 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.574 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.575 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.577 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.578 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap797bdee3-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.578 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap797bdee3-d7, col_values=(('external_ids', {'iface-id': '797bdee3-d774-413a-bebc-e4e84a4055d9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:00:4c', 'vm-uuid': '88218c9c-e4a5-41da-887b-0a5b34b34417'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.579 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:22:18 np0005539504 NetworkManager[55210]: <info>  [1764400938.5806] manager: (tap797bdee3-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.582 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:22:18 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.585 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.586 187156 INFO os_vif [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7')#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.650 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.651 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.651 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:30:00:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.651 187156 INFO nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Using config drive#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.684 187156 DEBUG nova.objects.instance [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.720 187156 DEBUG nova.objects.instance [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'keypairs' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.760 187156 DEBUG nova.compute.manager [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-unplugged-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.760 187156 DEBUG oslo_concurrency.lockutils [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.760 187156 DEBUG oslo_concurrency.lockutils [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.761 187156 DEBUG oslo_concurrency.lockutils [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.761 187156 DEBUG nova.compute.manager [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] No waiting events found dispatching network-vif-unplugged-797bdee3-d774-413a-bebc-e4e84a4055d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.761 187156 WARNING nova.compute.manager [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received unexpected event network-vif-unplugged-797bdee3-d774-413a-bebc-e4e84a4055d9 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.761 187156 DEBUG nova.compute.manager [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.761 187156 DEBUG oslo_concurrency.lockutils [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.762 187156 DEBUG oslo_concurrency.lockutils [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.762 187156 DEBUG oslo_concurrency.lockutils [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.762 187156 DEBUG nova.compute.manager [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] No waiting events found dispatching network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:22:18 np0005539504 nova_compute[187152]: 2025-11-29 07:22:18.762 187156 WARNING nova.compute.manager [req-8c351691-7832-4af7-9ca7-a59e919d3abc req-a6e2d354-3161-492c-8ee6-a57c94a16590 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received unexpected event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:22:20 np0005539504 nova_compute[187152]: 2025-11-29 07:22:20.962 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.246 187156 INFO nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Creating config drive at /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.config#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.256 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9qdhb2r_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.386 187156 DEBUG oslo_concurrency.processutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9qdhb2r_" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:21 np0005539504 kernel: tap797bdee3-d7: entered promiscuous mode
Nov 29 02:22:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:21Z|00417|binding|INFO|Claiming lport 797bdee3-d774-413a-bebc-e4e84a4055d9 for this chassis.
Nov 29 02:22:21 np0005539504 NetworkManager[55210]: <info>  [1764400941.5024] manager: (tap797bdee3-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/193)
Nov 29 02:22:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:21Z|00418|binding|INFO|797bdee3-d774-413a-bebc-e4e84a4055d9: Claiming fa:16:3e:30:00:4c 10.100.0.5
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.500 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:21Z|00419|binding|INFO|Setting lport 797bdee3-d774-413a-bebc-e4e84a4055d9 ovn-installed in OVS
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.521 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:21Z|00420|binding|INFO|Setting lport 797bdee3-d774-413a-bebc-e4e84a4055d9 up in Southbound
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.524 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:00:4c 10.100.0.5'], port_security=['fa:16:3e:30:00:4c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '5', 'neutron:security_group_ids': '25cb26de-7b16-455f-92fc-990d6e904a22', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b23a8aea-d144-4706-9c3e-bfbf05a7ea08, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=797bdee3-d774-413a-bebc-e4e84a4055d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:22:21 np0005539504 systemd-udevd[234637]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.525 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 797bdee3-d774-413a-bebc-e4e84a4055d9 in datapath 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 bound to our chassis#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.528 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.535 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:21 np0005539504 NetworkManager[55210]: <info>  [1764400941.5422] device (tap797bdee3-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:22:21 np0005539504 NetworkManager[55210]: <info>  [1764400941.5432] device (tap797bdee3-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.542 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[634517b8-775a-4553-91cf-37e6a81da569]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.543 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8d6d63fd-d1 in ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.545 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8d6d63fd-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.545 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9a5b35-361a-481b-afa9-187dd4c8bb86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.546 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1e00fcff-4041-4878-8935-92008d9006ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 systemd-machined[153423]: New machine qemu-56-instance-00000072.
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.558 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[dbbc2ff8-39aa-4282-8a82-7907ec43bfd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 systemd[1]: Started Virtual Machine qemu-56-instance-00000072.
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.580 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e5176b3b-ed35-474d-accf-4fdf5d4ad20f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.626 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[49dd8ec9-8a25-4171-8d11-b4bcf587d85c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.633 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[60a5606a-3b61-4f10-b4c7-cbab12541da0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 NetworkManager[55210]: <info>  [1764400941.6343] manager: (tap8d6d63fd-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/194)
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.672 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc091f9-9aae-4551-8d13-5e69a6d84580]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.675 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2f35b1-18f9-4ad3-8c46-a4a18d2062ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 NetworkManager[55210]: <info>  [1764400941.6991] device (tap8d6d63fd-d0): carrier: link connected
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.703 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[a54b855a-9ccf-4b77-b423-a5d982b01100]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.727 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4745be-41ff-4c7f-88f8-245904d9a70a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d6d63fd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:c8:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640457, 'reachable_time': 29691, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234673, 'error': None, 'target': 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.749 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bd040180-f68f-441b-9420-96809c5ce394]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef2:c84d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 640457, 'tstamp': 640457}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234674, 'error': None, 'target': 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.771 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bd22be7d-3fec-4b91-8c3d-4bf77bffc931]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d6d63fd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:c8:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640457, 'reachable_time': 29691, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234677, 'error': None, 'target': 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.814 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c73bc0cf-8571-43bf-b47e-e4580c2d0dbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.880 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 88218c9c-e4a5-41da-887b-0a5b34b34417 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.882 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400941.8795836, 88218c9c-e4a5-41da-887b-0a5b34b34417 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.882 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.888 187156 DEBUG nova.compute.manager [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.889 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.896 187156 INFO nova.virt.libvirt.driver [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance spawned successfully.#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.897 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.902 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[06a584ee-738a-4c78-9789-300af7e999fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.904 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d6d63fd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.904 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.905 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d6d63fd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.908 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:21 np0005539504 kernel: tap8d6d63fd-d0: entered promiscuous mode
Nov 29 02:22:21 np0005539504 NetworkManager[55210]: <info>  [1764400941.9093] manager: (tap8d6d63fd-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.912 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.914 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d6d63fd-d0, col_values=(('external_ids', {'iface-id': 'b0033b8e-2fd2-421b-afcc-3340d6ac4b36'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.915 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:21Z|00421|binding|INFO|Releasing lport b0033b8e-2fd2-421b-afcc-3340d6ac4b36 from this chassis (sb_readonly=0)
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.940 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.942 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.943 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6faa155d-39ef-45e3-bc40-944015ff26ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.945 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.pid.haproxy
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:22:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:21.945 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'env', 'PROCESS_TAG=haproxy-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8d6d63fd-dd9c-49a1-ae26-5b06d08155e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.963 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:21 np0005539504 nova_compute[187152]: 2025-11-29 07:22:21.964 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.041 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.051 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.055 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.056 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.056 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.057 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.057 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.057 187156 DEBUG nova.virt.libvirt.driver [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.269 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.271 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400941.8812077, 88218c9c-e4a5-41da-887b-0a5b34b34417 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.271 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] VM Started (Lifecycle Event)#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.442 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.447 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:22:22 np0005539504 podman[234713]: 2025-11-29 07:22:22.347049692 +0000 UTC m=+0.021967050 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:22:22 np0005539504 podman[234713]: 2025-11-29 07:22:22.513175931 +0000 UTC m=+0.188093289 container create 8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.516 187156 DEBUG nova.compute.manager [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.534 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:22:22 np0005539504 systemd[1]: Started libpod-conmon-8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b.scope.
Nov 29 02:22:22 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.605 187156 DEBUG nova.compute.manager [req-2c4b8052-d9c7-4127-9735-2d3de54835e5 req-9d926f00-0a52-4dd4-99a1-c04d0fd6b78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.605 187156 DEBUG oslo_concurrency.lockutils [req-2c4b8052-d9c7-4127-9735-2d3de54835e5 req-9d926f00-0a52-4dd4-99a1-c04d0fd6b78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.606 187156 DEBUG oslo_concurrency.lockutils [req-2c4b8052-d9c7-4127-9735-2d3de54835e5 req-9d926f00-0a52-4dd4-99a1-c04d0fd6b78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.606 187156 DEBUG oslo_concurrency.lockutils [req-2c4b8052-d9c7-4127-9735-2d3de54835e5 req-9d926f00-0a52-4dd4-99a1-c04d0fd6b78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.606 187156 DEBUG nova.compute.manager [req-2c4b8052-d9c7-4127-9735-2d3de54835e5 req-9d926f00-0a52-4dd4-99a1-c04d0fd6b78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] No waiting events found dispatching network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.606 187156 WARNING nova.compute.manager [req-2c4b8052-d9c7-4127-9735-2d3de54835e5 req-9d926f00-0a52-4dd4-99a1-c04d0fd6b78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received unexpected event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:22:22 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cd6dd9dc42dffbf39faaa216071198747663759defc95083961da8c778ea506/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:22:22 np0005539504 podman[234713]: 2025-11-29 07:22:22.632730254 +0000 UTC m=+0.307647602 container init 8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:22:22 np0005539504 podman[234726]: 2025-11-29 07:22:22.63408127 +0000 UTC m=+0.074113446 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Nov 29 02:22:22 np0005539504 podman[234713]: 2025-11-29 07:22:22.638506238 +0000 UTC m=+0.313423566 container start 8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:22:22 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234740]: [NOTICE]   (234751) : New worker (234753) forked
Nov 29 02:22:22 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234740]: [NOTICE]   (234751) : Loading success.
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.677 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.677 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.677 187156 DEBUG nova.objects.instance [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:22:22 np0005539504 nova_compute[187152]: 2025-11-29 07:22:22.783 187156 DEBUG oslo_concurrency.lockutils [None req-effb5ca2-f893-4da0-80aa-6ca29ec5b3b4 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:22.961 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:22.962 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:22.962 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:23 np0005539504 nova_compute[187152]: 2025-11-29 07:22:23.633 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:24 np0005539504 nova_compute[187152]: 2025-11-29 07:22:24.741 187156 DEBUG nova.compute.manager [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:24 np0005539504 nova_compute[187152]: 2025-11-29 07:22:24.742 187156 DEBUG oslo_concurrency.lockutils [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:24 np0005539504 nova_compute[187152]: 2025-11-29 07:22:24.742 187156 DEBUG oslo_concurrency.lockutils [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:24 np0005539504 nova_compute[187152]: 2025-11-29 07:22:24.742 187156 DEBUG oslo_concurrency.lockutils [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:24 np0005539504 nova_compute[187152]: 2025-11-29 07:22:24.742 187156 DEBUG nova.compute.manager [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] No waiting events found dispatching network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:22:24 np0005539504 nova_compute[187152]: 2025-11-29 07:22:24.743 187156 WARNING nova.compute.manager [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received unexpected event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:22:25 np0005539504 nova_compute[187152]: 2025-11-29 07:22:25.964 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:28 np0005539504 nova_compute[187152]: 2025-11-29 07:22:28.641 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:30 np0005539504 podman[234762]: 2025-11-29 07:22:30.748499344 +0000 UTC m=+0.079276405 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:22:30 np0005539504 nova_compute[187152]: 2025-11-29 07:22:30.969 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:33 np0005539504 nova_compute[187152]: 2025-11-29 07:22:33.646 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:34 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:34Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:00:4c 10.100.0.5
Nov 29 02:22:34 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:34Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:00:4c 10.100.0.5
Nov 29 02:22:35 np0005539504 nova_compute[187152]: 2025-11-29 07:22:35.971 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:38 np0005539504 nova_compute[187152]: 2025-11-29 07:22:38.649 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:40 np0005539504 nova_compute[187152]: 2025-11-29 07:22:40.974 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:41 np0005539504 nova_compute[187152]: 2025-11-29 07:22:41.005 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:41 np0005539504 nova_compute[187152]: 2025-11-29 07:22:41.006 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:22:42 np0005539504 podman[234808]: 2025-11-29 07:22:42.72646721 +0000 UTC m=+0.057343087 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:22:42 np0005539504 podman[234806]: 2025-11-29 07:22:42.736411156 +0000 UTC m=+0.071160027 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:22:42 np0005539504 podman[234807]: 2025-11-29 07:22:42.739793617 +0000 UTC m=+0.069996956 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter)
Nov 29 02:22:43 np0005539504 nova_compute[187152]: 2025-11-29 07:22:43.653 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:45 np0005539504 nova_compute[187152]: 2025-11-29 07:22:45.483 187156 INFO nova.compute.manager [None req-52005680-4b27-4c83-a806-52f52c8914cd bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Get console output#033[00m
Nov 29 02:22:45 np0005539504 nova_compute[187152]: 2025-11-29 07:22:45.489 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:22:45 np0005539504 nova_compute[187152]: 2025-11-29 07:22:45.618 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:22:45 np0005539504 nova_compute[187152]: 2025-11-29 07:22:45.978 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:46 np0005539504 podman[234868]: 2025-11-29 07:22:46.719778006 +0000 UTC m=+0.061690453 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:22:46 np0005539504 podman[234869]: 2025-11-29 07:22:46.754506656 +0000 UTC m=+0.094350988 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 02:22:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:47.975 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000072', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c231e63624d44fc19e0989abfb1afb22', 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'hostId': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:22:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:47.976 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.006 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.read.latency volume: 578523626 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.007 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.read.latency volume: 31284144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4465ef65-5c48-46df-ab0c-c10b1bc61a7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 578523626, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-vda', 'timestamp': '2025-11-29T07:22:47.977100', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34e6359c-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': 'e22d5878a74ada081e7b789b3a813e54ceb725c519bf2a57bf508307952c8d02'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31284144, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 
'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-sda', 'timestamp': '2025-11-29T07:22:47.977100', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34e64f78-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': 'd66b3ed03e0c98bdb0c0f74ef02b2eb7c524dc379000eaf10a1787e54582af4d'}]}, 'timestamp': '2025-11-29 07:22:48.008266', '_unique_id': '4cb224c6143f427eb212410439287a9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.010 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.013 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.013 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-996707846>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-996707846>]
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.014 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.write.latency volume: 5214718069 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.014 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3c1dfcd-6206-44c3-abac-95ec7da6c53f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5214718069, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-vda', 'timestamp': '2025-11-29T07:22:48.014150', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34e74f22-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': 'a4bc0c3363e9d532b50852d981de9a35f16154d7b87c4a572f42cbfd370c84dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 
'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-sda', 'timestamp': '2025-11-29T07:22:48.014150', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34e75d6e-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': '0d5f33482687f085cf3c90a2f2882fb5b6b360a17298dbf09e7bb8d6312ff334'}]}, 'timestamp': '2025-11-29 07:22:48.015127', '_unique_id': '4a5fffd022224607ab8e585d1adf1329'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.015 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.017 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.017 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '064f52b5-2d3b-439d-8d6c-3a2032a5a0a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-vda', 'timestamp': '2025-11-29T07:22:48.017080', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34e7b5de-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': 'd461f6d1130c9560b493437a0728e7acd63a6bd649bb3fe57fb24da7f53bbb0a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 
'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-sda', 'timestamp': '2025-11-29T07:22:48.017080', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34e7c3bc-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': 'e8535309614cc60753a949f23ce13948b79ab41662dd9dee63558245b8d21aa8'}]}, 'timestamp': '2025-11-29 07:22:48.017743', '_unique_id': '124cb9ac0a204cb9bf2cfa196bd8f88f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.018 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.019 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.020 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.read.bytes volume: 30697984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.021 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb7af144-a5a4-4d32-85f2-3a227e0adcef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30697984, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-vda', 'timestamp': '2025-11-29T07:22:48.020694', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34e84ce2-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': '5624330555dd85c22c5076516834b444ca81221989b3657db74eeaf237a98f78'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': 
None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-sda', 'timestamp': '2025-11-29T07:22:48.020694', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34e8665a-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': '9a996be58ec0f9e2560a720ba4a8b38b6d995d00f6f9b45350a9782f4d1f2215'}]}, 'timestamp': '2025-11-29 07:22:48.021999', '_unique_id': 'f2290c734aa245578fd05dd965721d09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.023 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.026 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.026 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.026 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-996707846>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-996707846>]
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.041 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.042 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46cb1909-30a2-404e-85bc-3739e7e375b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-vda', 'timestamp': '2025-11-29T07:22:48.027480', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34eb8272-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.962213064, 'message_signature': '63c8955430150a7f4eab718a3a321d6b26a5ad984aace411a4cb35df9cf3568e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-sda', 'timestamp': '2025-11-29T07:22:48.027480', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34eb9dde-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.962213064, 'message_signature': '5ced4aa5c1219f1b506d7604df7edff605f4a762fbeb64fe58b7b19ad836bac2'}]}, 'timestamp': '2025-11-29 07:22:48.043082', '_unique_id': 'b1f1d9b3ed3d4fe9b52af845a758adbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.046 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.047 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.047 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-996707846>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-996707846>]
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.047 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.049 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 88218c9c-e4a5-41da-887b-0a5b34b34417 / tap797bdee3-d7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.050 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6273d5e-cccb-47f0-b739-bf03eef6596a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.047368', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34ecc6c8-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': '178c92b62f7fda297a0670ceb2e29e0d9be022e98370a653e3ed0075f4f8399b'}]}, 'timestamp': '2025-11-29 07:22:48.050905', '_unique_id': 'd7710669562f46218552114a3417a0e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.053 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.053 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.incoming.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42bd039c-24a6-41af-a9f1-ef9fd34c1d6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.053777', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34ed5214-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': '96f5eba72b0f27fc725da383f5c8bd3c1bc238c0e7def03d2d21f5d699a90250'}]}, 'timestamp': '2025-11-29 07:22:48.054256', '_unique_id': '1a641c8cd39f46e1ac5207e47e7213c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.055 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.056 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.incoming.bytes volume: 4781 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab734c60-0a43-4645-92e5-401634179707', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4781, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.056650', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34edc24e-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': '7596c0caa9ad7c0e59ef3f7c10b04872679b912f8c500b55f9a92c167388866b'}]}, 'timestamp': '2025-11-29 07:22:48.057131', '_unique_id': 'e5e7a6d86b8e4f1fa81693015b37352e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.059 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.059 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.059 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba91b276-20f8-429c-a38a-14dfed25b32a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-vda', 'timestamp': '2025-11-29T07:22:48.059504', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34ee31ca-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.962213064, 'message_signature': 'd2448dbbb108c161f2335d8bf234c19fd9c83d743a1d7eb48f36cb75dc69acdc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-sda', 'timestamp': '2025-11-29T07:22:48.059504', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34ee4232-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.962213064, 'message_signature': 'a0e15c6a835598c80197dd9a0a0b9ab9dae3b0994500436317912b5f8708555c'}]}, 'timestamp': '2025-11-29 07:22:48.060366', '_unique_id': '84c03a0ef067493f9fc52e040ac6757b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.062 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.063 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b1667c7-824b-4d4e-92db-bbe7bfc0a3bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.063019', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34eebabe-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': '6c6e905d8cc12018d2ab2f30a5f0d6cb6f87cb5151c08fedbbad8261001d89aa'}]}, 'timestamp': '2025-11-29 07:22:48.063521', '_unique_id': '13bddede651e493c93a527ccdf33ae36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.064 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.066 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.066 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.write.requests volume: 302 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.067 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2528264e-a90f-4fdc-96ac-de71beb6ea48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 302, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-vda', 'timestamp': '2025-11-29T07:22:48.066514', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34ef4894-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': 'fa1f7c1a7ff4695b1007e4679bc6a9db438aee783f680705e005256f32677972'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-sda', 'timestamp': '2025-11-29T07:22:48.066514', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34ef5b04-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': '588e9d0a22787b1c582401440e6ee0f27b85653f5807542a25220d387b9775e4'}]}, 'timestamp': '2025-11-29 07:22:48.067585', '_unique_id': '984da8b36ba642d796d897860ca75b51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.070 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.070 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.070 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-996707846>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-996707846>]
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.071 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.071 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.072 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40b4492d-d621-4742-9514-2af234c4ef27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-vda', 'timestamp': '2025-11-29T07:22:48.071514', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34f009a0-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.962213064, 'message_signature': 'dbb7baad6336e3483e129001fccdc7ad00978205a4d94a0786294e6f03f1248c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-sda', 'timestamp': '2025-11-29T07:22:48.071514', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34f01cd8-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.962213064, 'message_signature': 'c03b64d6e353e45abc90f13fa019e3633d4104ea04075a2653ec44d674419f1f'}]}, 'timestamp': '2025-11-29 07:22:48.072546', '_unique_id': '272d4233de904c0884152dc2866890e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.075 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a3dd15b-bd58-40cb-bf3f-8245ea53d8d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.075594', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34f0a8f6-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': '3bc714a7244c62fc78bef16000b469f9a5169b22c3e8daca07f29e6dfa08416d'}]}, 'timestamp': '2025-11-29 07:22:48.076130', '_unique_id': 'f47016822e554e848cb647aa2052694c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.077 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.078 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.079 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed5d008d-3c7b-4236-9de1-3c58c7010114', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.078964', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34f12d08-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': '556b4af12901330763bd0c19a5ddc0c9e77c4ccf803d73ee68f916d40cf09c83'}]}, 'timestamp': '2025-11-29 07:22:48.079534', '_unique_id': '22644342cd5d4ad1a064718ff21f1f01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.084 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c2abeac-ffce-4e75-bf51-fbfb70c4fdaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.084665', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34f20b92-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': 'bcf99c461cb8f8a5332997bc7e3fac572d0dc5a207d5c20c79fbbe98b99d993f'}]}, 'timestamp': '2025-11-29 07:22:48.085214', '_unique_id': '07501398586a43feb84c7188b5fe84f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.087 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.104 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/cpu volume: 12240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc177aa3-e65e-4252-8f94-f038f06f86c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12240000000, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'timestamp': '2025-11-29T07:22:48.088104', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '34f50478-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6431.038621281, 'message_signature': 'e90c5d84182a0d8f7f17ebe93b834c5d06ad96c69f186480cc222957816f3664'}]}, 'timestamp': '2025-11-29 07:22:48.104680', '_unique_id': '749f5f3e5f864aec864758d728da0f65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.107 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.outgoing.bytes volume: 3418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7599b8c2-29ba-4f24-a8c1-e5bca3d5eba0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3418, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.107264', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34f57bd8-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': 'c2bf1a0a3a1b26d28ea89b8d4fb656b8bf79ca77d1fdf4ffd8c1dbf6dd287f57'}]}, 'timestamp': '2025-11-29 07:22:48.107680', '_unique_id': '9b4df7db696b457490fddb61f4a9fef5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.109 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.109 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/memory.usage volume: 46.90234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c81dcfad-2bae-4ae8-84de-da5560d16d13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.90234375, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'timestamp': '2025-11-29T07:22:48.109706', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '34f5db5a-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6431.038621281, 'message_signature': 'ee348477aac1122bcb235fdca97c33b87c43f86602ec3e754262ea8b167e6e32'}]}, 'timestamp': '2025-11-29 07:22:48.110114', '_unique_id': 'c32a26ab2ed540059945a3ec33cb480b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.119 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54a9ef7a-bd63-494b-9733-3bfd336c1d71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.119110', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34f74cb0-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': 'cc36c7c2139b54070714c8ab58b154b6297c6a8c06f7330b3ea287a6af4326a4'}]}, 'timestamp': '2025-11-29 07:22:48.119611', '_unique_id': '3fc25a4503f24d3dbfe265ac38148310'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.121 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.read.requests volume: 1111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.122 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dda0f45-fc72-4c49-8a4e-bd9a4165276c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1111, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-vda', 'timestamp': '2025-11-29T07:22:48.121777', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34f7b15a-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': 'f54cbb08869c95522f28ca67b5d58530310a861953e53b3fbe30d3910d7bfad8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '88218c9c-e4a5-41da-887b-0a5b34b34417-sda', 'timestamp': '2025-11-29T07:22:48.121777', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'instance-00000072', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34f7be98-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.911805473, 'message_signature': 'ed54ba5695887ff2929a9d8bf7a86fd47adabc8c1747d224244899143eef1732'}]}, 'timestamp': '2025-11-29 07:22:48.122495', '_unique_id': '0fd22cdd82674000a5a37fe1b313e6ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.124 12 DEBUG ceilometer.compute.pollsters [-] 88218c9c-e4a5-41da-887b-0a5b34b34417/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '616a92af-c0d1-4e15-809a-e31a802d042c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000072-88218c9c-e4a5-41da-887b-0a5b34b34417-tap797bdee3-d7', 'timestamp': '2025-11-29T07:22:48.124578', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-996707846', 'name': 'tap797bdee3-d7', 'instance_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'instance_type': 'm1.nano', 'host': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:30:00:4c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap797bdee3-d7'}, 'message_id': '34f81f32-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6430.982103116, 'message_signature': 'e5f9448724be30048631416a8e44a464fc1fe6d884dac3dd47a88260b0a17873'}]}, 'timestamp': '2025-11-29 07:22:48.124965', '_unique_id': 'eae54df2c53b4fe2b421f04bd89c5627'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:22:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:22:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:22:48 np0005539504 nova_compute[187152]: 2025-11-29 07:22:48.657 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:50 np0005539504 nova_compute[187152]: 2025-11-29 07:22:50.980 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:53 np0005539504 nova_compute[187152]: 2025-11-29 07:22:53.660 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:53 np0005539504 podman[234918]: 2025-11-29 07:22:53.755401166 +0000 UTC m=+0.084370751 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:22:55 np0005539504 nova_compute[187152]: 2025-11-29 07:22:55.983 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.709 187156 DEBUG oslo_concurrency.lockutils [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.710 187156 DEBUG oslo_concurrency.lockutils [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.710 187156 DEBUG oslo_concurrency.lockutils [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.711 187156 DEBUG oslo_concurrency.lockutils [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.711 187156 DEBUG oslo_concurrency.lockutils [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.730 187156 INFO nova.compute.manager [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Terminating instance#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.747 187156 DEBUG nova.compute.manager [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:22:56 np0005539504 kernel: tap797bdee3-d7 (unregistering): left promiscuous mode
Nov 29 02:22:56 np0005539504 NetworkManager[55210]: <info>  [1764400976.7869] device (tap797bdee3-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:22:56 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:56Z|00422|binding|INFO|Releasing lport 797bdee3-d774-413a-bebc-e4e84a4055d9 from this chassis (sb_readonly=0)
Nov 29 02:22:56 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:56Z|00423|binding|INFO|Setting lport 797bdee3-d774-413a-bebc-e4e84a4055d9 down in Southbound
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.793 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:56 np0005539504 ovn_controller[95182]: 2025-11-29T07:22:56Z|00424|binding|INFO|Removing iface tap797bdee3-d7 ovn-installed in OVS
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.811 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:56.811 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:00:4c 10.100.0.5'], port_security=['fa:16:3e:30:00:4c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '88218c9c-e4a5-41da-887b-0a5b34b34417', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '6', 'neutron:security_group_ids': '25cb26de-7b16-455f-92fc-990d6e904a22', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b23a8aea-d144-4706-9c3e-bfbf05a7ea08, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=797bdee3-d774-413a-bebc-e4e84a4055d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:22:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:56.814 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 797bdee3-d774-413a-bebc-e4e84a4055d9 in datapath 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 unbound from our chassis#033[00m
Nov 29 02:22:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:56.816 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:22:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:56.819 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a60d1b33-a7f1-46a5-bcd6-ab9c4b4ce081]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:56.821 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 namespace which is not needed anymore#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.827 187156 DEBUG nova.compute.manager [req-48a3d04e-d31c-48df-9c10-a9578c2c830a req-9c306b79-22bc-4d79-9461-45271ce46d77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-changed-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.827 187156 DEBUG nova.compute.manager [req-48a3d04e-d31c-48df-9c10-a9578c2c830a req-9c306b79-22bc-4d79-9461-45271ce46d77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Refreshing instance network info cache due to event network-changed-797bdee3-d774-413a-bebc-e4e84a4055d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.828 187156 DEBUG oslo_concurrency.lockutils [req-48a3d04e-d31c-48df-9c10-a9578c2c830a req-9c306b79-22bc-4d79-9461-45271ce46d77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.828 187156 DEBUG oslo_concurrency.lockutils [req-48a3d04e-d31c-48df-9c10-a9578c2c830a req-9c306b79-22bc-4d79-9461-45271ce46d77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:22:56 np0005539504 nova_compute[187152]: 2025-11-29 07:22:56.828 187156 DEBUG nova.network.neutron [req-48a3d04e-d31c-48df-9c10-a9578c2c830a req-9c306b79-22bc-4d79-9461-45271ce46d77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Refreshing network info cache for port 797bdee3-d774-413a-bebc-e4e84a4055d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:22:56 np0005539504 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000072.scope: Deactivated successfully.
Nov 29 02:22:56 np0005539504 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000072.scope: Consumed 14.282s CPU time.
Nov 29 02:22:56 np0005539504 systemd-machined[153423]: Machine qemu-56-instance-00000072 terminated.
Nov 29 02:22:56 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234740]: [NOTICE]   (234751) : haproxy version is 2.8.14-c23fe91
Nov 29 02:22:56 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234740]: [NOTICE]   (234751) : path to executable is /usr/sbin/haproxy
Nov 29 02:22:56 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234740]: [WARNING]  (234751) : Exiting Master process...
Nov 29 02:22:56 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234740]: [WARNING]  (234751) : Exiting Master process...
Nov 29 02:22:56 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234740]: [ALERT]    (234751) : Current worker (234753) exited with code 143 (Terminated)
Nov 29 02:22:56 np0005539504 neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2[234740]: [WARNING]  (234751) : All workers exited. Exiting... (0)
Nov 29 02:22:56 np0005539504 systemd[1]: libpod-8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b.scope: Deactivated successfully.
Nov 29 02:22:56 np0005539504 podman[234961]: 2025-11-29 07:22:56.989470176 +0000 UTC m=+0.056847434 container died 8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 02:22:57 np0005539504 systemd[1]: var-lib-containers-storage-overlay-9cd6dd9dc42dffbf39faaa216071198747663759defc95083961da8c778ea506-merged.mount: Deactivated successfully.
Nov 29 02:22:57 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b-userdata-shm.mount: Deactivated successfully.
Nov 29 02:22:57 np0005539504 nova_compute[187152]: 2025-11-29 07:22:57.029 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "747a4028-af61-495f-9c7d-c5ac869967ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:57 np0005539504 nova_compute[187152]: 2025-11-29 07:22:57.029 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:57 np0005539504 podman[234961]: 2025-11-29 07:22:57.0310686 +0000 UTC m=+0.098445828 container cleanup 8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:22:57 np0005539504 nova_compute[187152]: 2025-11-29 07:22:57.033 187156 INFO nova.virt.libvirt.driver [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Instance destroyed successfully.#033[00m
Nov 29 02:22:57 np0005539504 nova_compute[187152]: 2025-11-29 07:22:57.033 187156 DEBUG nova.objects.instance [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 88218c9c-e4a5-41da-887b-0a5b34b34417 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:57 np0005539504 systemd[1]: libpod-conmon-8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b.scope: Deactivated successfully.
Nov 29 02:22:57 np0005539504 podman[235005]: 2025-11-29 07:22:57.119887339 +0000 UTC m=+0.070461189 container remove 8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:22:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:57.127 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[452177a1-6cc7-49e9-a4ac-d622d460cfdd]: (4, ('Sat Nov 29 07:22:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 (8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b)\n8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b\nSat Nov 29 07:22:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 (8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b)\n8d8a439431afb2612eeb47246310b93e78cbf538e225c304d14d57d5a719e73b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:57.128 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[43360728-9cb2-4aa6-a753-7835ff4afb54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:57.129 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d6d63fd-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:57 np0005539504 nova_compute[187152]: 2025-11-29 07:22:57.131 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:57 np0005539504 kernel: tap8d6d63fd-d0: left promiscuous mode
Nov 29 02:22:57 np0005539504 nova_compute[187152]: 2025-11-29 07:22:57.150 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:57.154 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f774a9fa-dadf-4203-9d16-26c461512b62]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:57.170 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[435050c5-d2b6-406e-8e03-725957bfa5eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:57.172 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8990ba50-b046-439d-a3a2-55a357fdb3ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:57.189 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e80e83-08b5-4a89-913d-e77eb3c2e135]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 640449, 'reachable_time': 21052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235024, 'error': None, 'target': 'ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:57 np0005539504 systemd[1]: run-netns-ovnmeta\x2d8d6d63fd\x2ddd9c\x2d49a1\x2dae26\x2d5b06d08155e2.mount: Deactivated successfully.
Nov 29 02:22:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:57.192 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8d6d63fd-dd9c-49a1-ae26-5b06d08155e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:22:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:22:57.193 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[b44cb8f6-21cd-4651-85bb-9190451a6156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:58 np0005539504 nova_compute[187152]: 2025-11-29 07:22:58.665 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.203 187156 DEBUG nova.virt.libvirt.vif [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-996707846',display_name='tempest-TestNetworkAdvancedServerOps-server-996707846',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-996707846',id=114,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBErYDjaq5wSl2W+Prsdf52pa+BbyzQ8k9zEpFvcBMUmZof2lf5CEpW5zCB+o+jSzA6HcpcCjmGi63w6xdaVr0+PFvUpJaeSWrl18PWCMhc6ZJLP06Fdr+z+oANLaw/F/uQ==',key_name='tempest-TestNetworkAdvancedServerOps-1798919802',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:22:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-uj3m4a6l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:22:22Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=88218c9c-e4a5-41da-887b-0a5b34b34417,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.204 187156 DEBUG nova.network.os_vif_util [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.205 187156 DEBUG nova.network.os_vif_util [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.205 187156 DEBUG os_vif [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.206 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.207 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap797bdee3-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.209 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.210 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.212 187156 INFO os_vif [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:00:4c,bridge_name='br-int',has_traffic_filtering=True,id=797bdee3-d774-413a-bebc-e4e84a4055d9,network=Network(8d6d63fd-dd9c-49a1-ae26-5b06d08155e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap797bdee3-d7')#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.213 187156 INFO nova.virt.libvirt.driver [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Deleting instance files /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417_del#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.214 187156 INFO nova.virt.libvirt.driver [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Deletion of /var/lib/nova/instances/88218c9c-e4a5-41da-887b-0a5b34b34417_del complete#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.246 187156 DEBUG nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.394 187156 INFO nova.compute.manager [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Took 3.65 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.395 187156 DEBUG oslo.service.loopingcall [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.395 187156 DEBUG nova.compute.manager [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.395 187156 DEBUG nova.network.neutron [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.409 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.409 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.417 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.417 187156 INFO nova.compute.claims [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.524 187156 DEBUG nova.compute.manager [req-cd0c589e-aea6-4294-91cd-7a49e84c3d0b req-1eae1f19-e658-4c6a-8aa8-ee6e716cb6f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-unplugged-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.525 187156 DEBUG oslo_concurrency.lockutils [req-cd0c589e-aea6-4294-91cd-7a49e84c3d0b req-1eae1f19-e658-4c6a-8aa8-ee6e716cb6f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.525 187156 DEBUG oslo_concurrency.lockutils [req-cd0c589e-aea6-4294-91cd-7a49e84c3d0b req-1eae1f19-e658-4c6a-8aa8-ee6e716cb6f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.525 187156 DEBUG oslo_concurrency.lockutils [req-cd0c589e-aea6-4294-91cd-7a49e84c3d0b req-1eae1f19-e658-4c6a-8aa8-ee6e716cb6f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.525 187156 DEBUG nova.compute.manager [req-cd0c589e-aea6-4294-91cd-7a49e84c3d0b req-1eae1f19-e658-4c6a-8aa8-ee6e716cb6f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] No waiting events found dispatching network-vif-unplugged-797bdee3-d774-413a-bebc-e4e84a4055d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.525 187156 DEBUG nova.compute.manager [req-cd0c589e-aea6-4294-91cd-7a49e84c3d0b req-1eae1f19-e658-4c6a-8aa8-ee6e716cb6f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-unplugged-797bdee3-d774-413a-bebc-e4e84a4055d9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.550 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.756 187156 DEBUG nova.compute.provider_tree [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.782 187156 DEBUG nova.scheduler.client.report [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.805 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.805 187156 DEBUG nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.824 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.825 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.913 187156 DEBUG nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.956 187156 DEBUG nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.956 187156 DEBUG nova.network.neutron [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.985 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:00 np0005539504 nova_compute[187152]: 2025-11-29 07:23:00.999 187156 INFO nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.030 187156 DEBUG nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.059 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.060 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.067 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.068 187156 INFO nova.compute.claims [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.222 187156 DEBUG nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.223 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.224 187156 INFO nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Creating image(s)#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.225 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.225 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.226 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.244 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.292 187156 DEBUG nova.policy [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.312 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.314 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.315 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.331 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.394 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.395 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.434 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.436 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.437 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.507 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.508 187156 DEBUG nova.virt.disk.api [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.509 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.564 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.565 187156 DEBUG nova.virt.disk.api [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.566 187156 DEBUG nova.objects.instance [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 747a4028-af61-495f-9c7d-c5ac869967ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.607 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.607 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Ensure instance console log exists: /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.608 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.609 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.609 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.683 187156 DEBUG nova.compute.provider_tree [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.730 187156 DEBUG nova.scheduler.client.report [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:23:01 np0005539504 podman[235040]: 2025-11-29 07:23:01.732530342 +0000 UTC m=+0.069260745 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true)
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.813 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:23:01 np0005539504 nova_compute[187152]: 2025-11-29 07:23:01.813 187156 DEBUG nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 02:23:02 np0005539504 nova_compute[187152]: 2025-11-29 07:23:02.462 187156 DEBUG nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 02:23:02 np0005539504 nova_compute[187152]: 2025-11-29 07:23:02.463 187156 DEBUG nova.network.neutron [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 02:23:02 np0005539504 nova_compute[187152]: 2025-11-29 07:23:02.486 187156 INFO nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 02:23:02 np0005539504 nova_compute[187152]: 2025-11-29 07:23:02.508 187156 DEBUG nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 02:23:03 np0005539504 nova_compute[187152]: 2025-11-29 07:23:03.341 187156 DEBUG nova.network.neutron [req-48a3d04e-d31c-48df-9c10-a9578c2c830a req-9c306b79-22bc-4d79-9461-45271ce46d77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updated VIF entry in instance network info cache for port 797bdee3-d774-413a-bebc-e4e84a4055d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:23:03 np0005539504 nova_compute[187152]: 2025-11-29 07:23:03.342 187156 DEBUG nova.network.neutron [req-48a3d04e-d31c-48df-9c10-a9578c2c830a req-9c306b79-22bc-4d79-9461-45271ce46d77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updating instance_info_cache with network_info: [{"id": "797bdee3-d774-413a-bebc-e4e84a4055d9", "address": "fa:16:3e:30:00:4c", "network": {"id": "8d6d63fd-dd9c-49a1-ae26-5b06d08155e2", "bridge": "br-int", "label": "tempest-network-smoke--1372618075", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap797bdee3-d7", "ovs_interfaceid": "797bdee3-d774-413a-bebc-e4e84a4055d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:23:03 np0005539504 nova_compute[187152]: 2025-11-29 07:23:03.375 187156 DEBUG nova.policy [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 02:23:04 np0005539504 nova_compute[187152]: 2025-11-29 07:23:04.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.209 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.799 187156 DEBUG nova.compute.manager [req-08f2b8d7-1725-4459-afe0-5b81ac8cc564 req-828f0b40-2a17-4e03-bad7-5adeb8da9309 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.800 187156 DEBUG oslo_concurrency.lockutils [req-08f2b8d7-1725-4459-afe0-5b81ac8cc564 req-828f0b40-2a17-4e03-bad7-5adeb8da9309 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.801 187156 DEBUG oslo_concurrency.lockutils [req-08f2b8d7-1725-4459-afe0-5b81ac8cc564 req-828f0b40-2a17-4e03-bad7-5adeb8da9309 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.801 187156 DEBUG oslo_concurrency.lockutils [req-08f2b8d7-1725-4459-afe0-5b81ac8cc564 req-828f0b40-2a17-4e03-bad7-5adeb8da9309 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.802 187156 DEBUG nova.compute.manager [req-08f2b8d7-1725-4459-afe0-5b81ac8cc564 req-828f0b40-2a17-4e03-bad7-5adeb8da9309 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] No waiting events found dispatching network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.802 187156 WARNING nova.compute.manager [req-08f2b8d7-1725-4459-afe0-5b81ac8cc564 req-828f0b40-2a17-4e03-bad7-5adeb8da9309 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received unexpected event network-vif-plugged-797bdee3-d774-413a-bebc-e4e84a4055d9 for instance with vm_state active and task_state deleting.
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.871 187156 DEBUG oslo_concurrency.lockutils [req-48a3d04e-d31c-48df-9c10-a9578c2c830a req-9c306b79-22bc-4d79-9461-45271ce46d77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-88218c9c-e4a5-41da-887b-0a5b34b34417" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.885 187156 DEBUG nova.network.neutron [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.927 187156 INFO nova.compute.manager [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Took 5.53 seconds to deallocate network for instance.
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.933 187156 DEBUG nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.935 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.936 187156 INFO nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Creating image(s)
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.937 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.937 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.938 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.966 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.988 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:23:05 np0005539504 nova_compute[187152]: 2025-11-29 07:23:05.993 187156 DEBUG nova.compute.manager [req-7593d877-981e-4c8b-939e-98c453ac9fd6 req-a75f3656-baaf-4cc2-bf4e-bb2cbecec079 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Received event network-vif-deleted-797bdee3-d774-413a-bebc-e4e84a4055d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.028 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.028 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.029 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.046 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.074 187156 DEBUG oslo_concurrency.lockutils [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.075 187156 DEBUG oslo_concurrency.lockutils [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.101 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.102 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.138 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.140 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.141 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.198 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.199 187156 DEBUG nova.virt.disk.api [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Checking if we can resize image /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.200 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.259 187156 DEBUG nova.compute.provider_tree [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.267 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.269 187156 DEBUG nova.virt.disk.api [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Cannot resize image /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.269 187156 DEBUG nova.objects.instance [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'migration_context' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.300 187156 DEBUG nova.scheduler.client.report [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.308 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.309 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Ensure instance console log exists: /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.310 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.310 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.311 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.340 187156 DEBUG oslo_concurrency.lockutils [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.394 187156 INFO nova.scheduler.client.report [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocations for instance 88218c9c-e4a5-41da-887b-0a5b34b34417#033[00m
Nov 29 02:23:06 np0005539504 nova_compute[187152]: 2025-11-29 07:23:06.584 187156 DEBUG oslo_concurrency.lockutils [None req-6b5a2e83-db0f-43de-a675-82562363d287 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "88218c9c-e4a5-41da-887b-0a5b34b34417" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:08 np0005539504 nova_compute[187152]: 2025-11-29 07:23:08.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:08 np0005539504 nova_compute[187152]: 2025-11-29 07:23:08.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:23:09 np0005539504 nova_compute[187152]: 2025-11-29 07:23:09.139 187156 DEBUG nova.network.neutron [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Successfully created port: 34be3fe8-368a-49e5-b6b6-2f650c642037 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:23:09 np0005539504 nova_compute[187152]: 2025-11-29 07:23:09.194 187156 DEBUG nova.network.neutron [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Successfully created port: 33409140-d169-4701-8e17-6eacddd88f23 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:23:10 np0005539504 nova_compute[187152]: 2025-11-29 07:23:10.211 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:10 np0005539504 nova_compute[187152]: 2025-11-29 07:23:10.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:10 np0005539504 nova_compute[187152]: 2025-11-29 07:23:10.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:10 np0005539504 nova_compute[187152]: 2025-11-29 07:23:10.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.024 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400977.0235672, 88218c9c-e4a5-41da-887b-0a5b34b34417 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.025 187156 INFO nova.compute.manager [-] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.364 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.365 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.366 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.366 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.368 187156 DEBUG nova.compute.manager [None req-ed6b4402-d5f3-4e3b-8d46-d3a59db681f4 - - - - - -] [instance: 88218c9c-e4a5-41da-887b-0a5b34b34417] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.559 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.560 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5710MB free_disk=73.19210815429688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.560 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.561 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.676 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 747a4028-af61-495f-9c7d-c5ac869967ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.676 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 22632f96-1108-42eb-a410-f31138f282ea actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.676 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.676 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.771 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.788 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.831 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:23:12 np0005539504 nova_compute[187152]: 2025-11-29 07:23:12.832 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.530 187156 DEBUG nova.network.neutron [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Successfully updated port: 34be3fe8-368a-49e5-b6b6-2f650c642037 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.569 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.569 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquired lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.569 187156 DEBUG nova.network.neutron [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.572 187156 DEBUG nova.network.neutron [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Successfully updated port: 33409140-d169-4701-8e17-6eacddd88f23 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.587 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.587 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.587 187156 DEBUG nova.network.neutron [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:23:13 np0005539504 podman[235080]: 2025-11-29 07:23:13.747071734 +0000 UTC m=+0.077115747 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:23:13 np0005539504 podman[235082]: 2025-11-29 07:23:13.753722512 +0000 UTC m=+0.078627867 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:23:13 np0005539504 podman[235081]: 2025-11-29 07:23:13.778280339 +0000 UTC m=+0.102423735 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.831 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.843 187156 DEBUG nova.network.neutron [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.850 187156 DEBUG nova.network.neutron [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:23:13 np0005539504 nova_compute[187152]: 2025-11-29 07:23:13.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:14 np0005539504 nova_compute[187152]: 2025-11-29 07:23:14.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:14 np0005539504 nova_compute[187152]: 2025-11-29 07:23:14.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:23:14 np0005539504 nova_compute[187152]: 2025-11-29 07:23:14.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:23:14 np0005539504 nova_compute[187152]: 2025-11-29 07:23:14.969 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:23:14 np0005539504 nova_compute[187152]: 2025-11-29 07:23:14.970 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:23:14 np0005539504 nova_compute[187152]: 2025-11-29 07:23:14.971 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:23:15 np0005539504 nova_compute[187152]: 2025-11-29 07:23:15.214 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:15 np0005539504 nova_compute[187152]: 2025-11-29 07:23:15.492 187156 DEBUG nova.compute.manager [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-changed-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:15 np0005539504 nova_compute[187152]: 2025-11-29 07:23:15.492 187156 DEBUG nova.compute.manager [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Refreshing instance network info cache due to event network-changed-34be3fe8-368a-49e5-b6b6-2f650c642037. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:23:15 np0005539504 nova_compute[187152]: 2025-11-29 07:23:15.493 187156 DEBUG oslo_concurrency.lockutils [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:15 np0005539504 nova_compute[187152]: 2025-11-29 07:23:15.493 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:15 np0005539504 nova_compute[187152]: 2025-11-29 07:23:15.730 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:16 np0005539504 nova_compute[187152]: 2025-11-29 07:23:16.021 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:17.320 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.320 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:17.323 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.435 187156 DEBUG nova.network.neutron [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Updating instance_info_cache with network_info: [{"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.473 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Releasing lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.474 187156 DEBUG nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance network_info: |[{"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.474 187156 DEBUG oslo_concurrency.lockutils [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.474 187156 DEBUG nova.network.neutron [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Refreshing network info cache for port 34be3fe8-368a-49e5-b6b6-2f650c642037 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.477 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Start _get_guest_xml network_info=[{"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.481 187156 WARNING nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.488 187156 DEBUG nova.virt.libvirt.host [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.489 187156 DEBUG nova.virt.libvirt.host [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.493 187156 DEBUG nova.virt.libvirt.host [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.493 187156 DEBUG nova.virt.libvirt.host [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.494 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.495 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.495 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.495 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.495 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.495 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.495 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.496 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.496 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.496 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.496 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.496 187156 DEBUG nova.virt.hardware [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.500 187156 DEBUG nova.virt.libvirt.vif [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:22:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-287106654',display_name='tempest-ServerRescueTestJSON-server-287106654',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-287106654',id=119,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='980ddbfed54546c89c75e94503491a61',ramdisk_id='',reservation_id='r-ajt2ppqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1854570869',owner_user_name='tempest-ServerRescueTestJSON-1854
570869-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:02Z,user_data=None,user_id='a992c32ce5fb4cbab645023852f14adc',uuid=22632f96-1108-42eb-a410-f31138f282ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.500 187156 DEBUG nova.network.os_vif_util [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converting VIF {"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.501 187156 DEBUG nova.network.os_vif_util [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:c5:bf,bridge_name='br-int',has_traffic_filtering=True,id=34be3fe8-368a-49e5-b6b6-2f650c642037,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap34be3fe8-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.501 187156 DEBUG nova.objects.instance [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'pci_devices' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.521 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <uuid>22632f96-1108-42eb-a410-f31138f282ea</uuid>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <name>instance-00000077</name>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerRescueTestJSON-server-287106654</nova:name>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:23:17</nova:creationTime>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:        <nova:user uuid="a992c32ce5fb4cbab645023852f14adc">tempest-ServerRescueTestJSON-1854570869-project-member</nova:user>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:        <nova:project uuid="980ddbfed54546c89c75e94503491a61">tempest-ServerRescueTestJSON-1854570869</nova:project>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:        <nova:port uuid="34be3fe8-368a-49e5-b6b6-2f650c642037">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <entry name="serial">22632f96-1108-42eb-a410-f31138f282ea</entry>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <entry name="uuid">22632f96-1108-42eb-a410-f31138f282ea</entry>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.config"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:05:c5:bf"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <target dev="tap34be3fe8-36"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/console.log" append="off"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:23:17 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:23:17 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:23:17 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:23:17 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.522 187156 DEBUG nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Preparing to wait for external event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.522 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.522 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.523 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.523 187156 DEBUG nova.virt.libvirt.vif [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:22:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-287106654',display_name='tempest-ServerRescueTestJSON-server-287106654',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-287106654',id=119,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='980ddbfed54546c89c75e94503491a61',ramdisk_id='',reservation_id='r-ajt2ppqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1854570869',owner_user_name='tempest-ServerRescueTes
tJSON-1854570869-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:02Z,user_data=None,user_id='a992c32ce5fb4cbab645023852f14adc',uuid=22632f96-1108-42eb-a410-f31138f282ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.524 187156 DEBUG nova.network.os_vif_util [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converting VIF {"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.524 187156 DEBUG nova.network.os_vif_util [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:c5:bf,bridge_name='br-int',has_traffic_filtering=True,id=34be3fe8-368a-49e5-b6b6-2f650c642037,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap34be3fe8-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.525 187156 DEBUG os_vif [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:c5:bf,bridge_name='br-int',has_traffic_filtering=True,id=34be3fe8-368a-49e5-b6b6-2f650c642037,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap34be3fe8-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.526 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.526 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.526 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.529 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.529 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap34be3fe8-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.530 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap34be3fe8-36, col_values=(('external_ids', {'iface-id': '34be3fe8-368a-49e5-b6b6-2f650c642037', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:c5:bf', 'vm-uuid': '22632f96-1108-42eb-a410-f31138f282ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.531 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:17 np0005539504 NetworkManager[55210]: <info>  [1764400997.5326] manager: (tap34be3fe8-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.534 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.536 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.537 187156 INFO os_vif [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:c5:bf,bridge_name='br-int',has_traffic_filtering=True,id=34be3fe8-368a-49e5-b6b6-2f650c642037,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap34be3fe8-36')#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.626 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.626 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.626 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No VIF found with MAC fa:16:3e:05:c5:bf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:23:17 np0005539504 nova_compute[187152]: 2025-11-29 07:23:17.627 187156 INFO nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Using config drive#033[00m
Nov 29 02:23:17 np0005539504 podman[235141]: 2025-11-29 07:23:17.706250958 +0000 UTC m=+0.050447023 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:23:17 np0005539504 podman[235142]: 2025-11-29 07:23:17.744334508 +0000 UTC m=+0.084587557 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.350 187156 INFO nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Creating config drive at /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.config#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.355 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkrmm8enn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.410 187156 DEBUG nova.network.neutron [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Updating instance_info_cache with network_info: [{"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.432 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.433 187156 DEBUG nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Instance network_info: |[{"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.435 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Start _get_guest_xml network_info=[{"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.442 187156 WARNING nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.448 187156 DEBUG nova.virt.libvirt.host [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.448 187156 DEBUG nova.virt.libvirt.host [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.453 187156 DEBUG nova.virt.libvirt.host [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.453 187156 DEBUG nova.virt.libvirt.host [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.455 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.455 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.455 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.456 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.456 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.456 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.456 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.457 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.457 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.457 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.457 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.457 187156 DEBUG nova.virt.hardware [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.462 187156 DEBUG nova.virt.libvirt.vif [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:22:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-552673723',display_name='tempest-TestGettingAddress-server-552673723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-552673723',id=118,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqBtOVeWFxVzcYFJOuDJtYVuL20oDyqcRBPHq57GiuWFxaCS3KceqmhPXeIi9sFvrUoM3x5G9a+RY7U7UfyTQLwWhQmn8+j5tk7QGxgOZ6WpsSYFLeoEl1770NJZUoryw==',key_name='tempest-TestGettingAddress-2009457088',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-uo1v90l1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:01Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=747a4028-af61-495f-9c7d-c5ac869967ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.462 187156 DEBUG nova.network.os_vif_util [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.463 187156 DEBUG nova.network.os_vif_util [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:4d:5e,bridge_name='br-int',has_traffic_filtering=True,id=33409140-d169-4701-8e17-6eacddd88f23,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33409140-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.464 187156 DEBUG nova.objects.instance [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 747a4028-af61-495f-9c7d-c5ac869967ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.485 187156 DEBUG oslo_concurrency.processutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkrmm8enn" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.493 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <uuid>747a4028-af61-495f-9c7d-c5ac869967ab</uuid>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <name>instance-00000076</name>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestGettingAddress-server-552673723</nova:name>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:23:19</nova:creationTime>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:        <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:        <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:        <nova:port uuid="33409140-d169-4701-8e17-6eacddd88f23">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe3d:4d5e" ipVersion="6"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <entry name="serial">747a4028-af61-495f-9c7d-c5ac869967ab</entry>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <entry name="uuid">747a4028-af61-495f-9c7d-c5ac869967ab</entry>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk.config"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:3d:4d:5e"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <target dev="tap33409140-d1"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/console.log" append="off"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:23:19 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:23:19 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:23:19 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:23:19 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.493 187156 DEBUG nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Preparing to wait for external event network-vif-plugged-33409140-d169-4701-8e17-6eacddd88f23 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.493 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.494 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.494 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.495 187156 DEBUG nova.virt.libvirt.vif [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:22:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-552673723',display_name='tempest-TestGettingAddress-server-552673723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-552673723',id=118,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqBtOVeWFxVzcYFJOuDJtYVuL20oDyqcRBPHq57GiuWFxaCS3KceqmhPXeIi9sFvrUoM3x5G9a+RY7U7UfyTQLwWhQmn8+j5tk7QGxgOZ6WpsSYFLeoEl1770NJZUoryw==',key_name='tempest-TestGettingAddress-2009457088',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-uo1v90l1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:01Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=747a4028-af61-495f-9c7d-c5ac869967ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.495 187156 DEBUG nova.network.os_vif_util [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.496 187156 DEBUG nova.network.os_vif_util [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:4d:5e,bridge_name='br-int',has_traffic_filtering=True,id=33409140-d169-4701-8e17-6eacddd88f23,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33409140-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.496 187156 DEBUG os_vif [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:4d:5e,bridge_name='br-int',has_traffic_filtering=True,id=33409140-d169-4701-8e17-6eacddd88f23,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33409140-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.497 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.497 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.498 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.500 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.501 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33409140-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.501 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap33409140-d1, col_values=(('external_ids', {'iface-id': '33409140-d169-4701-8e17-6eacddd88f23', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:4d:5e', 'vm-uuid': '747a4028-af61-495f-9c7d-c5ac869967ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.503 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:19 np0005539504 NetworkManager[55210]: <info>  [1764400999.5041] manager: (tap33409140-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.506 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.514 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.515 187156 INFO os_vif [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:4d:5e,bridge_name='br-int',has_traffic_filtering=True,id=33409140-d169-4701-8e17-6eacddd88f23,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33409140-d1')#033[00m
Nov 29 02:23:19 np0005539504 kernel: tap34be3fe8-36: entered promiscuous mode
Nov 29 02:23:19 np0005539504 NetworkManager[55210]: <info>  [1764400999.5654] manager: (tap34be3fe8-36): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.569 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:19 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:19Z|00425|binding|INFO|Claiming lport 34be3fe8-368a-49e5-b6b6-2f650c642037 for this chassis.
Nov 29 02:23:19 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:19Z|00426|binding|INFO|34be3fe8-368a-49e5-b6b6-2f650c642037: Claiming fa:16:3e:05:c5:bf 10.100.0.2
Nov 29 02:23:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:19.587 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:c5:bf 10.100.0.2'], port_security=['fa:16:3e:05:c5:bf 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '22632f96-1108-42eb-a410-f31138f282ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '2', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=34be3fe8-368a-49e5-b6b6-2f650c642037) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:19.591 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 34be3fe8-368a-49e5-b6b6-2f650c642037 in datapath b58443a3-f575-4ff1-951d-e92781861793 bound to our chassis#033[00m
Nov 29 02:23:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:19.592 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:23:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:19.594 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a027e6-93e5-4274-a653-bdd22ba753d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.597 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.598 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.598 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:3d:4d:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.599 187156 INFO nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Using config drive#033[00m
Nov 29 02:23:19 np0005539504 systemd-udevd[235209]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:23:19 np0005539504 systemd-machined[153423]: New machine qemu-57-instance-00000077.
Nov 29 02:23:19 np0005539504 NetworkManager[55210]: <info>  [1764400999.6222] device (tap34be3fe8-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:23:19 np0005539504 NetworkManager[55210]: <info>  [1764400999.6234] device (tap34be3fe8-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.632 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:19 np0005539504 systemd[1]: Started Virtual Machine qemu-57-instance-00000077.
Nov 29 02:23:19 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:19Z|00427|binding|INFO|Setting lport 34be3fe8-368a-49e5-b6b6-2f650c642037 ovn-installed in OVS
Nov 29 02:23:19 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:19Z|00428|binding|INFO|Setting lport 34be3fe8-368a-49e5-b6b6-2f650c642037 up in Southbound
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.642 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.849 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400999.8487833, 22632f96-1108-42eb-a410-f31138f282ea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.849 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] VM Started (Lifecycle Event)#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.881 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.886 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764400999.8490014, 22632f96-1108-42eb-a410-f31138f282ea => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.886 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.909 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.913 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:19 np0005539504 nova_compute[187152]: 2025-11-29 07:23:19.934 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.872 187156 INFO nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Creating config drive at /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk.config#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.883 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp250kw82k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.973 187156 DEBUG nova.compute.manager [req-8bb6c1e9-0480-4995-81b8-73f768102ebf req-58bcff62-8792-4a5a-890d-9b6ca9176250 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.973 187156 DEBUG oslo_concurrency.lockutils [req-8bb6c1e9-0480-4995-81b8-73f768102ebf req-58bcff62-8792-4a5a-890d-9b6ca9176250 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.974 187156 DEBUG oslo_concurrency.lockutils [req-8bb6c1e9-0480-4995-81b8-73f768102ebf req-58bcff62-8792-4a5a-890d-9b6ca9176250 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.974 187156 DEBUG oslo_concurrency.lockutils [req-8bb6c1e9-0480-4995-81b8-73f768102ebf req-58bcff62-8792-4a5a-890d-9b6ca9176250 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.974 187156 DEBUG nova.compute.manager [req-8bb6c1e9-0480-4995-81b8-73f768102ebf req-58bcff62-8792-4a5a-890d-9b6ca9176250 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Processing event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.975 187156 DEBUG nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.980 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401000.9802449, 22632f96-1108-42eb-a410-f31138f282ea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.981 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.984 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.988 187156 INFO nova.virt.libvirt.driver [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance spawned successfully.#033[00m
Nov 29 02:23:20 np0005539504 nova_compute[187152]: 2025-11-29 07:23:20.989 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.020 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.028 187156 DEBUG oslo_concurrency.processutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp250kw82k" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.038 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.043 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.043 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.044 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.044 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.044 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.045 187156 DEBUG nova.virt.libvirt.driver [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.083 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:23:21 np0005539504 kernel: tap33409140-d1: entered promiscuous mode
Nov 29 02:23:21 np0005539504 NetworkManager[55210]: <info>  [1764401001.1054] manager: (tap33409140-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Nov 29 02:23:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:21Z|00429|binding|INFO|Claiming lport 33409140-d169-4701-8e17-6eacddd88f23 for this chassis.
Nov 29 02:23:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:21Z|00430|binding|INFO|33409140-d169-4701-8e17-6eacddd88f23: Claiming fa:16:3e:3d:4d:5e 10.100.0.11 2001:db8::f816:3eff:fe3d:4d5e
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.109 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.116 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 NetworkManager[55210]: <info>  [1764401001.1199] device (tap33409140-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:23:21 np0005539504 NetworkManager[55210]: <info>  [1764401001.1206] device (tap33409140-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:23:21 np0005539504 NetworkManager[55210]: <info>  [1764401001.1239] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Nov 29 02:23:21 np0005539504 NetworkManager[55210]: <info>  [1764401001.1243] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.125 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.133 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:4d:5e 10.100.0.11 2001:db8::f816:3eff:fe3d:4d5e'], port_security=['fa:16:3e:3d:4d:5e 10.100.0.11 2001:db8::f816:3eff:fe3d:4d5e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8::f816:3eff:fe3d:4d5e/64', 'neutron:device_id': '747a4028-af61-495f-9c7d-c5ac869967ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '17fd93d9-fafe-4a7d-9c01-ce54fbe8f760', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=944fc855-be48-4f5c-ba58-0898fe543a04, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=33409140-d169-4701-8e17-6eacddd88f23) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.135 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 33409140-d169-4701-8e17-6eacddd88f23 in datapath f75dc671-4e0c-40f1-8afd-c16b5e416d95 bound to our chassis#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.131 187156 INFO nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Took 15.20 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.136 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f75dc671-4e0c-40f1-8afd-c16b5e416d95#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.137 187156 DEBUG nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.149 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3f631ee2-4b06-4635-9b74-adad9dd6430b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.151 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf75dc671-41 in ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:23:21 np0005539504 systemd-machined[153423]: New machine qemu-58-instance-00000076.
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.153 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf75dc671-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.153 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[21748474-0014-46e3-8650-c2fcf8e00949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.154 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dd98f659-7fea-4b99-bc60-a5baefc2de57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.167 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[adba2ac8-b164-4330-a261-6d64ccb2e65d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 systemd[1]: Started Virtual Machine qemu-58-instance-00000076.
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.199 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b36728db-380c-41b2-8353-3839e6942588]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.231 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[9c8f54c2-b30b-41fb-ae56-72018d6cc66c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 NetworkManager[55210]: <info>  [1764401001.2518] manager: (tapf75dc671-40): new Veth device (/org/freedesktop/NetworkManager/Devices/202)
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.250 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5acd5fb1-55b0-4995-8841-feac473b06e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.270 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.287 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c507c05f-680b-4deb-b753-f0371efa9051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.290 187156 INFO nova.compute.manager [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Took 20.27 seconds to build instance.#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.291 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[23753e08-5574-4310-95fe-ef534f0767be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.295 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:21Z|00431|binding|INFO|Setting lport 33409140-d169-4701-8e17-6eacddd88f23 ovn-installed in OVS
Nov 29 02:23:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:21Z|00432|binding|INFO|Setting lport 33409140-d169-4701-8e17-6eacddd88f23 up in Southbound
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.304 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 NetworkManager[55210]: <info>  [1764401001.3181] device (tapf75dc671-40): carrier: link connected
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.323 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[44f7a56f-2f1d-4a99-b9d5-c505825fb531]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.332 187156 DEBUG oslo_concurrency.lockutils [None req-8d7141d2-5ebd-4000-ad69-f4566882e2e5 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.344 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ce84cbf5-0587-4e0d-bd6e-1f34f38bc410]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf75dc671-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:9b:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646419, 'reachable_time': 39400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235280, 'error': None, 'target': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.358 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f285c5-315f-4016-bbab-da1da336620d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:9b10'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 646419, 'tstamp': 646419}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235281, 'error': None, 'target': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.374 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a3295abc-9924-48a6-9ef8-7f4c26918a17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf75dc671-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:9b:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646419, 'reachable_time': 39400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235282, 'error': None, 'target': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.405 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cc191524-d2ac-4ee2-8dc4-a18f40bf1915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.458 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4cb444-a6a4-4e10-9d35-4a98bd5a1cf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.460 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf75dc671-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.461 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.461 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf75dc671-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.463 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 NetworkManager[55210]: <info>  [1764401001.4641] manager: (tapf75dc671-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Nov 29 02:23:21 np0005539504 kernel: tapf75dc671-40: entered promiscuous mode
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.466 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.467 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf75dc671-40, col_values=(('external_ids', {'iface-id': '6897d2ce-b04d-4d85-9bb6-9da51e7d7f20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.469 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:21Z|00433|binding|INFO|Releasing lport 6897d2ce-b04d-4d85-9bb6-9da51e7d7f20 from this chassis (sb_readonly=0)
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.469 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.470 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f75dc671-4e0c-40f1-8afd-c16b5e416d95.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f75dc671-4e0c-40f1-8afd-c16b5e416d95.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.472 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6c9566f7-de90-4c3b-a81d-fd521f753249]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.473 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-f75dc671-4e0c-40f1-8afd-c16b5e416d95
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/f75dc671-4e0c-40f1-8afd-c16b5e416d95.pid.haproxy
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID f75dc671-4e0c-40f1-8afd-c16b5e416d95
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:23:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:21.475 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'env', 'PROCESS_TAG=haproxy-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f75dc671-4e0c-40f1-8afd-c16b5e416d95.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.481 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.730 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401001.7302718, 747a4028-af61-495f-9c7d-c5ac869967ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.731 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] VM Started (Lifecycle Event)#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.751 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.757 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401001.731589, 747a4028-af61-495f-9c7d-c5ac869967ab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.757 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.779 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.784 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:21 np0005539504 nova_compute[187152]: 2025-11-29 07:23:21.804 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:23:21 np0005539504 podman[235321]: 2025-11-29 07:23:21.883173905 +0000 UTC m=+0.046475106 container create 08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:23:21 np0005539504 systemd[1]: Started libpod-conmon-08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4.scope.
Nov 29 02:23:21 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:23:21 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02b12b5b4809a207a01a68a74b938f84c7ee54c7b9137ca8374495c83e1d62f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:23:21 np0005539504 podman[235321]: 2025-11-29 07:23:21.858655338 +0000 UTC m=+0.021956569 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:23:21 np0005539504 podman[235321]: 2025-11-29 07:23:21.971272386 +0000 UTC m=+0.134573617 container init 08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:23:21 np0005539504 podman[235321]: 2025-11-29 07:23:21.978925 +0000 UTC m=+0.142226201 container start 08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:23:22 np0005539504 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[235337]: [NOTICE]   (235341) : New worker (235343) forked
Nov 29 02:23:22 np0005539504 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[235337]: [NOTICE]   (235341) : Loading success.
Nov 29 02:23:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:22.962 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:22.965 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:22.966 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.103 187156 DEBUG nova.compute.manager [req-3cd36c0b-b478-4dbd-a670-a6eb8058a548 req-ff889021-21e8-4928-bdc2-bce0adaca30e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.103 187156 DEBUG oslo_concurrency.lockutils [req-3cd36c0b-b478-4dbd-a670-a6eb8058a548 req-ff889021-21e8-4928-bdc2-bce0adaca30e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.104 187156 DEBUG oslo_concurrency.lockutils [req-3cd36c0b-b478-4dbd-a670-a6eb8058a548 req-ff889021-21e8-4928-bdc2-bce0adaca30e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.104 187156 DEBUG oslo_concurrency.lockutils [req-3cd36c0b-b478-4dbd-a670-a6eb8058a548 req-ff889021-21e8-4928-bdc2-bce0adaca30e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.105 187156 DEBUG nova.compute.manager [req-3cd36c0b-b478-4dbd-a670-a6eb8058a548 req-ff889021-21e8-4928-bdc2-bce0adaca30e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.105 187156 WARNING nova.compute.manager [req-3cd36c0b-b478-4dbd-a670-a6eb8058a548 req-ff889021-21e8-4928-bdc2-bce0adaca30e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.600 187156 DEBUG nova.network.neutron [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Updated VIF entry in instance network info cache for port 34be3fe8-368a-49e5-b6b6-2f650c642037. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.601 187156 DEBUG nova.network.neutron [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Updating instance_info_cache with network_info: [{"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.605 187156 DEBUG nova.compute.manager [req-cda91c47-26b3-47f1-8f49-21ca61a362d0 req-9a01a813-141a-45ca-8c11-921045c90cca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received event network-vif-plugged-33409140-d169-4701-8e17-6eacddd88f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.606 187156 DEBUG oslo_concurrency.lockutils [req-cda91c47-26b3-47f1-8f49-21ca61a362d0 req-9a01a813-141a-45ca-8c11-921045c90cca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.606 187156 DEBUG oslo_concurrency.lockutils [req-cda91c47-26b3-47f1-8f49-21ca61a362d0 req-9a01a813-141a-45ca-8c11-921045c90cca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.606 187156 DEBUG oslo_concurrency.lockutils [req-cda91c47-26b3-47f1-8f49-21ca61a362d0 req-9a01a813-141a-45ca-8c11-921045c90cca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.607 187156 DEBUG nova.compute.manager [req-cda91c47-26b3-47f1-8f49-21ca61a362d0 req-9a01a813-141a-45ca-8c11-921045c90cca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Processing event network-vif-plugged-33409140-d169-4701-8e17-6eacddd88f23 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.608 187156 DEBUG nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.614 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401003.613976, 747a4028-af61-495f-9c7d-c5ac869967ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.615 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.617 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.620 187156 DEBUG oslo_concurrency.lockutils [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.621 187156 DEBUG nova.compute.manager [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received event network-changed-33409140-d169-4701-8e17-6eacddd88f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.621 187156 DEBUG nova.compute.manager [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Refreshing instance network info cache due to event network-changed-33409140-d169-4701-8e17-6eacddd88f23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.621 187156 DEBUG oslo_concurrency.lockutils [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.622 187156 DEBUG oslo_concurrency.lockutils [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.622 187156 DEBUG nova.network.neutron [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Refreshing network info cache for port 33409140-d169-4701-8e17-6eacddd88f23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.628 187156 INFO nova.virt.libvirt.driver [-] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Instance spawned successfully.#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.629 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.643 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.651 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.662 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.662 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.663 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.663 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.664 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.664 187156 DEBUG nova.virt.libvirt.driver [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.671 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.804 187156 INFO nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Took 22.58 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.804 187156 DEBUG nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.915 187156 INFO nova.compute.manager [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Took 23.56 seconds to build instance.#033[00m
Nov 29 02:23:23 np0005539504 nova_compute[187152]: 2025-11-29 07:23:23.939 187156 DEBUG oslo_concurrency.lockutils [None req-e8bbdec8-0be2-4bf2-8a4b-d912c1c76636 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:24 np0005539504 nova_compute[187152]: 2025-11-29 07:23:24.505 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:24 np0005539504 podman[235352]: 2025-11-29 07:23:24.738236976 +0000 UTC m=+0.075234737 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:23:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:25.326 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.690 187156 INFO nova.compute.manager [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Rescuing#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.691 187156 DEBUG oslo_concurrency.lockutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.691 187156 DEBUG oslo_concurrency.lockutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquired lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.691 187156 DEBUG nova.network.neutron [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.703 187156 DEBUG nova.compute.manager [req-6f117722-d883-4e64-91e4-1df48d2a80c4 req-017d74a4-68f7-447a-84a6-01b5f2a097f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received event network-vif-plugged-33409140-d169-4701-8e17-6eacddd88f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.703 187156 DEBUG oslo_concurrency.lockutils [req-6f117722-d883-4e64-91e4-1df48d2a80c4 req-017d74a4-68f7-447a-84a6-01b5f2a097f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.704 187156 DEBUG oslo_concurrency.lockutils [req-6f117722-d883-4e64-91e4-1df48d2a80c4 req-017d74a4-68f7-447a-84a6-01b5f2a097f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.704 187156 DEBUG oslo_concurrency.lockutils [req-6f117722-d883-4e64-91e4-1df48d2a80c4 req-017d74a4-68f7-447a-84a6-01b5f2a097f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.704 187156 DEBUG nova.compute.manager [req-6f117722-d883-4e64-91e4-1df48d2a80c4 req-017d74a4-68f7-447a-84a6-01b5f2a097f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] No waiting events found dispatching network-vif-plugged-33409140-d169-4701-8e17-6eacddd88f23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:25 np0005539504 nova_compute[187152]: 2025-11-29 07:23:25.705 187156 WARNING nova.compute.manager [req-6f117722-d883-4e64-91e4-1df48d2a80c4 req-017d74a4-68f7-447a-84a6-01b5f2a097f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received unexpected event network-vif-plugged-33409140-d169-4701-8e17-6eacddd88f23 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:23:26 np0005539504 nova_compute[187152]: 2025-11-29 07:23:26.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:29 np0005539504 nova_compute[187152]: 2025-11-29 07:23:29.510 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:31 np0005539504 nova_compute[187152]: 2025-11-29 07:23:31.029 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:32 np0005539504 podman[235387]: 2025-11-29 07:23:32.737522929 +0000 UTC m=+0.074509106 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:23:33 np0005539504 nova_compute[187152]: 2025-11-29 07:23:33.475 187156 DEBUG nova.network.neutron [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Updated VIF entry in instance network info cache for port 33409140-d169-4701-8e17-6eacddd88f23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:23:33 np0005539504 nova_compute[187152]: 2025-11-29 07:23:33.476 187156 DEBUG nova.network.neutron [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Updating instance_info_cache with network_info: [{"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:33 np0005539504 nova_compute[187152]: 2025-11-29 07:23:33.514 187156 DEBUG oslo_concurrency.lockutils [req-7d5aa2a0-2272-4355-b670-1c9b7b4d82ae req-adcbbc98-dff8-457c-8af1-b8d880ce1418 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:34 np0005539504 nova_compute[187152]: 2025-11-29 07:23:34.351 187156 DEBUG nova.network.neutron [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Updating instance_info_cache with network_info: [{"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:34 np0005539504 nova_compute[187152]: 2025-11-29 07:23:34.393 187156 DEBUG oslo_concurrency.lockutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Releasing lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:34 np0005539504 nova_compute[187152]: 2025-11-29 07:23:34.523 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:34 np0005539504 nova_compute[187152]: 2025-11-29 07:23:34.911 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:23:35 np0005539504 nova_compute[187152]: 2025-11-29 07:23:35.953 187156 DEBUG nova.compute.manager [req-11baedce-9a5b-45eb-ade6-e52d7e5a600b req-991ce3cd-f4eb-4add-bde0-6ad0ba6f02bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received event network-changed-33409140-d169-4701-8e17-6eacddd88f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:35 np0005539504 nova_compute[187152]: 2025-11-29 07:23:35.954 187156 DEBUG nova.compute.manager [req-11baedce-9a5b-45eb-ade6-e52d7e5a600b req-991ce3cd-f4eb-4add-bde0-6ad0ba6f02bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Refreshing instance network info cache due to event network-changed-33409140-d169-4701-8e17-6eacddd88f23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:23:35 np0005539504 nova_compute[187152]: 2025-11-29 07:23:35.954 187156 DEBUG oslo_concurrency.lockutils [req-11baedce-9a5b-45eb-ade6-e52d7e5a600b req-991ce3cd-f4eb-4add-bde0-6ad0ba6f02bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:35 np0005539504 nova_compute[187152]: 2025-11-29 07:23:35.955 187156 DEBUG oslo_concurrency.lockutils [req-11baedce-9a5b-45eb-ade6-e52d7e5a600b req-991ce3cd-f4eb-4add-bde0-6ad0ba6f02bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:35 np0005539504 nova_compute[187152]: 2025-11-29 07:23:35.955 187156 DEBUG nova.network.neutron [req-11baedce-9a5b-45eb-ade6-e52d7e5a600b req-991ce3cd-f4eb-4add-bde0-6ad0ba6f02bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Refreshing network info cache for port 33409140-d169-4701-8e17-6eacddd88f23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.052 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.081 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.084 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Triggering sync for uuid 747a4028-af61-495f-9c7d-c5ac869967ab _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.084 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Triggering sync for uuid 22632f96-1108-42eb-a410-f31138f282ea _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.084 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "747a4028-af61-495f-9c7d-c5ac869967ab" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.084 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "747a4028-af61-495f-9c7d-c5ac869967ab" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.085 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.085 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "22632f96-1108-42eb-a410-f31138f282ea" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.085 187156 INFO nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.085 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "22632f96-1108-42eb-a410-f31138f282ea" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:36 np0005539504 nova_compute[187152]: 2025-11-29 07:23:36.113 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "747a4028-af61-495f-9c7d-c5ac869967ab" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:36Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3d:4d:5e 10.100.0.11
Nov 29 02:23:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:36Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3d:4d:5e 10.100.0.11
Nov 29 02:23:37 np0005539504 kernel: tap34be3fe8-36 (unregistering): left promiscuous mode
Nov 29 02:23:37 np0005539504 NetworkManager[55210]: <info>  [1764401017.0863] device (tap34be3fe8-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:23:37 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:37Z|00434|binding|INFO|Releasing lport 34be3fe8-368a-49e5-b6b6-2f650c642037 from this chassis (sb_readonly=0)
Nov 29 02:23:37 np0005539504 nova_compute[187152]: 2025-11-29 07:23:37.095 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:37 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:37Z|00435|binding|INFO|Setting lport 34be3fe8-368a-49e5-b6b6-2f650c642037 down in Southbound
Nov 29 02:23:37 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:37Z|00436|binding|INFO|Removing iface tap34be3fe8-36 ovn-installed in OVS
Nov 29 02:23:37 np0005539504 nova_compute[187152]: 2025-11-29 07:23:37.098 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:37.106 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:c5:bf 10.100.0.2'], port_security=['fa:16:3e:05:c5:bf 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '22632f96-1108-42eb-a410-f31138f282ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '4', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=34be3fe8-368a-49e5-b6b6-2f650c642037) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:37.108 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 34be3fe8-368a-49e5-b6b6-2f650c642037 in datapath b58443a3-f575-4ff1-951d-e92781861793 unbound from our chassis#033[00m
Nov 29 02:23:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:37.110 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:23:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:37.112 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b9f79a-c735-49fb-b029-1ab3783352cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:37 np0005539504 nova_compute[187152]: 2025-11-29 07:23:37.113 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:37 np0005539504 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000077.scope: Deactivated successfully.
Nov 29 02:23:37 np0005539504 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000077.scope: Consumed 12.969s CPU time.
Nov 29 02:23:37 np0005539504 systemd-machined[153423]: Machine qemu-57-instance-00000077 terminated.
Nov 29 02:23:37 np0005539504 nova_compute[187152]: 2025-11-29 07:23:37.390 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:37 np0005539504 nova_compute[187152]: 2025-11-29 07:23:37.932 187156 INFO nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:23:37 np0005539504 nova_compute[187152]: 2025-11-29 07:23:37.940 187156 INFO nova.virt.libvirt.driver [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance destroyed successfully.#033[00m
Nov 29 02:23:37 np0005539504 nova_compute[187152]: 2025-11-29 07:23:37.941 187156 DEBUG nova.objects.instance [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'numa_topology' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.225 187156 DEBUG nova.compute.manager [req-399ecb64-6ba6-45fa-a17c-c7e43ec5bcb9 req-24c498ef-13ff-4e5a-be17-f46bbdec7091 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-unplugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.225 187156 DEBUG oslo_concurrency.lockutils [req-399ecb64-6ba6-45fa-a17c-c7e43ec5bcb9 req-24c498ef-13ff-4e5a-be17-f46bbdec7091 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.226 187156 DEBUG oslo_concurrency.lockutils [req-399ecb64-6ba6-45fa-a17c-c7e43ec5bcb9 req-24c498ef-13ff-4e5a-be17-f46bbdec7091 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.226 187156 DEBUG oslo_concurrency.lockutils [req-399ecb64-6ba6-45fa-a17c-c7e43ec5bcb9 req-24c498ef-13ff-4e5a-be17-f46bbdec7091 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.226 187156 DEBUG nova.compute.manager [req-399ecb64-6ba6-45fa-a17c-c7e43ec5bcb9 req-24c498ef-13ff-4e5a-be17-f46bbdec7091 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-unplugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.226 187156 WARNING nova.compute.manager [req-399ecb64-6ba6-45fa-a17c-c7e43ec5bcb9 req-24c498ef-13ff-4e5a-be17-f46bbdec7091 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-unplugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.296 187156 INFO nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Attempting rescue#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.297 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.301 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.302 187156 INFO nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Creating image(s)#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.303 187156 DEBUG oslo_concurrency.lockutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.303 187156 DEBUG oslo_concurrency.lockutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.304 187156 DEBUG oslo_concurrency.lockutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.304 187156 DEBUG nova.objects.instance [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.380 187156 DEBUG oslo_concurrency.lockutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.381 187156 DEBUG oslo_concurrency.lockutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.401 187156 DEBUG oslo_concurrency.processutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.476 187156 DEBUG oslo_concurrency.processutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.477 187156 DEBUG oslo_concurrency.processutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.631 187156 DEBUG oslo_concurrency.processutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.rescue" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.632 187156 DEBUG oslo_concurrency.lockutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.633 187156 DEBUG nova.objects.instance [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'migration_context' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.658 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.659 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Start _get_guest_xml network_info=[{"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1930314854-network", "vif_mac": "fa:16:3e:05:c5:bf"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.659 187156 DEBUG nova.objects.instance [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'resources' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.683 187156 WARNING nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.693 187156 DEBUG nova.virt.libvirt.host [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.694 187156 DEBUG nova.virt.libvirt.host [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.697 187156 DEBUG nova.virt.libvirt.host [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.698 187156 DEBUG nova.virt.libvirt.host [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.699 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.700 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.700 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.701 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.701 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.701 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.702 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.702 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.702 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.703 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.703 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.703 187156 DEBUG nova.virt.hardware [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.703 187156 DEBUG nova.objects.instance [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.721 187156 DEBUG nova.virt.libvirt.vif [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:22:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-287106654',display_name='tempest-ServerRescueTestJSON-server-287106654',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-287106654',id=119,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:23:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='980ddbfed54546c89c75e94503491a61',ramdisk_id='',reservation_id='r-ajt2ppqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1854570869',owner_user_name='tempest-ServerRescueTestJSON-1854570869-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:21Z,user_data=None,user_id='a992c32ce5fb4cbab645023852f14adc',uuid=22632f96-1108-42eb-a410-f31138f282ea,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1930314854-network", "vif_mac": "fa:16:3e:05:c5:bf"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.722 187156 DEBUG nova.network.os_vif_util [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converting VIF {"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1930314854-network", "vif_mac": "fa:16:3e:05:c5:bf"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.723 187156 DEBUG nova.network.os_vif_util [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:c5:bf,bridge_name='br-int',has_traffic_filtering=True,id=34be3fe8-368a-49e5-b6b6-2f650c642037,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap34be3fe8-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.724 187156 DEBUG nova.objects.instance [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'pci_devices' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.743 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <uuid>22632f96-1108-42eb-a410-f31138f282ea</uuid>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <name>instance-00000077</name>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerRescueTestJSON-server-287106654</nova:name>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:23:38</nova:creationTime>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:        <nova:user uuid="a992c32ce5fb4cbab645023852f14adc">tempest-ServerRescueTestJSON-1854570869-project-member</nova:user>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:        <nova:project uuid="980ddbfed54546c89c75e94503491a61">tempest-ServerRescueTestJSON-1854570869</nova:project>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:        <nova:port uuid="34be3fe8-368a-49e5-b6b6-2f650c642037">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <entry name="serial">22632f96-1108-42eb-a410-f31138f282ea</entry>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <entry name="uuid">22632f96-1108-42eb-a410-f31138f282ea</entry>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.rescue"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <target dev="vdb" bus="virtio"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.config.rescue"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:05:c5:bf"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <target dev="tap34be3fe8-36"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/console.log" append="off"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:23:38 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:23:38 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:23:38 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:23:38 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.751 187156 INFO nova.virt.libvirt.driver [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance destroyed successfully.#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.819 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.819 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.819 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.820 187156 DEBUG nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No VIF found with MAC fa:16:3e:05:c5:bf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.820 187156 INFO nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Using config drive#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.843 187156 DEBUG nova.objects.instance [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:38 np0005539504 nova_compute[187152]: 2025-11-29 07:23:38.880 187156 DEBUG nova.objects.instance [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'keypairs' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.279 187156 INFO nova.virt.libvirt.driver [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Creating config drive at /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.config.rescue#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.285 187156 DEBUG oslo_concurrency.processutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnz9gbwep execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.414 187156 DEBUG oslo_concurrency.processutils [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnz9gbwep" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:39 np0005539504 kernel: tap34be3fe8-36: entered promiscuous mode
Nov 29 02:23:39 np0005539504 systemd-udevd[235434]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:23:39 np0005539504 NetworkManager[55210]: <info>  [1764401019.4729] manager: (tap34be3fe8-36): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Nov 29 02:23:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:39Z|00437|binding|INFO|Claiming lport 34be3fe8-368a-49e5-b6b6-2f650c642037 for this chassis.
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.474 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:39Z|00438|binding|INFO|34be3fe8-368a-49e5-b6b6-2f650c642037: Claiming fa:16:3e:05:c5:bf 10.100.0.2
Nov 29 02:23:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:39.486 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:c5:bf 10.100.0.2'], port_security=['fa:16:3e:05:c5:bf 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '22632f96-1108-42eb-a410-f31138f282ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '5', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=34be3fe8-368a-49e5-b6b6-2f650c642037) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:39Z|00439|binding|INFO|Setting lport 34be3fe8-368a-49e5-b6b6-2f650c642037 up in Southbound
Nov 29 02:23:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:39.487 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 34be3fe8-368a-49e5-b6b6-2f650c642037 in datapath b58443a3-f575-4ff1-951d-e92781861793 bound to our chassis#033[00m
Nov 29 02:23:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:39.488 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.489 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:39.488 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7b412118-08b6-4739-b28b-d4110911035a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:39Z|00440|binding|INFO|Setting lport 34be3fe8-368a-49e5-b6b6-2f650c642037 ovn-installed in OVS
Nov 29 02:23:39 np0005539504 NetworkManager[55210]: <info>  [1764401019.4912] device (tap34be3fe8-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.492 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:39 np0005539504 NetworkManager[55210]: <info>  [1764401019.4927] device (tap34be3fe8-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:23:39 np0005539504 systemd-machined[153423]: New machine qemu-59-instance-00000077.
Nov 29 02:23:39 np0005539504 systemd[1]: Started Virtual Machine qemu-59-instance-00000077.
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.525 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.964 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 22632f96-1108-42eb-a410-f31138f282ea due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.965 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401019.964535, 22632f96-1108-42eb-a410-f31138f282ea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.965 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.982 187156 DEBUG nova.compute.manager [None req-672a16c6-d5d5-420a-9647-43477810ffcd a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.989 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:39 np0005539504 nova_compute[187152]: 2025-11-29 07:23:39.992 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.054 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.055 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401019.9677541, 22632f96-1108-42eb-a410-f31138f282ea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.055 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] VM Started (Lifecycle Event)#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.097 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.101 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.245 187156 DEBUG nova.network.neutron [req-11baedce-9a5b-45eb-ade6-e52d7e5a600b req-991ce3cd-f4eb-4add-bde0-6ad0ba6f02bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Updated VIF entry in instance network info cache for port 33409140-d169-4701-8e17-6eacddd88f23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.246 187156 DEBUG nova.network.neutron [req-11baedce-9a5b-45eb-ade6-e52d7e5a600b req-991ce3cd-f4eb-4add-bde0-6ad0ba6f02bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Updating instance_info_cache with network_info: [{"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.268 187156 DEBUG oslo_concurrency.lockutils [req-11baedce-9a5b-45eb-ade6-e52d7e5a600b req-991ce3cd-f4eb-4add-bde0-6ad0ba6f02bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.352 187156 DEBUG nova.compute.manager [req-6941bf0d-37de-47da-b3bb-ddf4daa3f0e6 req-0e1e5964-5853-465c-b91d-18a1c46759ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.353 187156 DEBUG oslo_concurrency.lockutils [req-6941bf0d-37de-47da-b3bb-ddf4daa3f0e6 req-0e1e5964-5853-465c-b91d-18a1c46759ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.353 187156 DEBUG oslo_concurrency.lockutils [req-6941bf0d-37de-47da-b3bb-ddf4daa3f0e6 req-0e1e5964-5853-465c-b91d-18a1c46759ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.353 187156 DEBUG oslo_concurrency.lockutils [req-6941bf0d-37de-47da-b3bb-ddf4daa3f0e6 req-0e1e5964-5853-465c-b91d-18a1c46759ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.353 187156 DEBUG nova.compute.manager [req-6941bf0d-37de-47da-b3bb-ddf4daa3f0e6 req-0e1e5964-5853-465c-b91d-18a1c46759ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.354 187156 WARNING nova.compute.manager [req-6941bf0d-37de-47da-b3bb-ddf4daa3f0e6 req-0e1e5964-5853-465c-b91d-18a1c46759ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state rescued and task_state None.#033[00m
Nov 29 02:23:40 np0005539504 nova_compute[187152]: 2025-11-29 07:23:40.436 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:41 np0005539504 nova_compute[187152]: 2025-11-29 07:23:41.087 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.148 187156 INFO nova.compute.manager [None req-e9157d49-055b-42f6-9116-032a99e18680 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Unrescuing#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.151 187156 DEBUG oslo_concurrency.lockutils [None req-e9157d49-055b-42f6-9116-032a99e18680 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.151 187156 DEBUG oslo_concurrency.lockutils [None req-e9157d49-055b-42f6-9116-032a99e18680 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquired lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.151 187156 DEBUG nova.network.neutron [None req-e9157d49-055b-42f6-9116-032a99e18680 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.464 187156 DEBUG nova.compute.manager [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.465 187156 DEBUG oslo_concurrency.lockutils [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.465 187156 DEBUG oslo_concurrency.lockutils [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.465 187156 DEBUG oslo_concurrency.lockutils [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.466 187156 DEBUG nova.compute.manager [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.466 187156 WARNING nova.compute.manager [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.466 187156 DEBUG nova.compute.manager [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.466 187156 DEBUG oslo_concurrency.lockutils [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.466 187156 DEBUG oslo_concurrency.lockutils [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.467 187156 DEBUG oslo_concurrency.lockutils [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.467 187156 DEBUG nova.compute.manager [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:42 np0005539504 nova_compute[187152]: 2025-11-29 07:23:42.467 187156 WARNING nova.compute.manager [req-799fa89e-af94-4719-bd28-b1bd2f414c58 req-5367d47e-fecf-4836-b526-a44b8e682ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 29 02:23:43 np0005539504 nova_compute[187152]: 2025-11-29 07:23:43.570 187156 DEBUG nova.network.neutron [None req-e9157d49-055b-42f6-9116-032a99e18680 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Updating instance_info_cache with network_info: [{"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:43 np0005539504 nova_compute[187152]: 2025-11-29 07:23:43.654 187156 DEBUG oslo_concurrency.lockutils [None req-e9157d49-055b-42f6-9116-032a99e18680 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Releasing lock "refresh_cache-22632f96-1108-42eb-a410-f31138f282ea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:43 np0005539504 nova_compute[187152]: 2025-11-29 07:23:43.655 187156 DEBUG nova.objects.instance [None req-e9157d49-055b-42f6-9116-032a99e18680 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'flavor' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:43 np0005539504 kernel: tap34be3fe8-36 (unregistering): left promiscuous mode
Nov 29 02:23:43 np0005539504 NetworkManager[55210]: <info>  [1764401023.7043] device (tap34be3fe8-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:23:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:43Z|00441|binding|INFO|Releasing lport 34be3fe8-368a-49e5-b6b6-2f650c642037 from this chassis (sb_readonly=0)
Nov 29 02:23:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:43Z|00442|binding|INFO|Setting lport 34be3fe8-368a-49e5-b6b6-2f650c642037 down in Southbound
Nov 29 02:23:43 np0005539504 nova_compute[187152]: 2025-11-29 07:23:43.713 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:43Z|00443|binding|INFO|Removing iface tap34be3fe8-36 ovn-installed in OVS
Nov 29 02:23:43 np0005539504 nova_compute[187152]: 2025-11-29 07:23:43.715 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:43 np0005539504 nova_compute[187152]: 2025-11-29 07:23:43.726 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:43 np0005539504 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000077.scope: Deactivated successfully.
Nov 29 02:23:43 np0005539504 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000077.scope: Consumed 4.218s CPU time.
Nov 29 02:23:43 np0005539504 systemd-machined[153423]: Machine qemu-59-instance-00000077 terminated.
Nov 29 02:23:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:43.855 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:c5:bf 10.100.0.2'], port_security=['fa:16:3e:05:c5:bf 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '22632f96-1108-42eb-a410-f31138f282ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '6', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=34be3fe8-368a-49e5-b6b6-2f650c642037) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:43.857 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 34be3fe8-368a-49e5-b6b6-2f650c642037 in datapath b58443a3-f575-4ff1-951d-e92781861793 unbound from our chassis#033[00m
Nov 29 02:23:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:43.858 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:23:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:43.859 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d393d191-665b-4f95-b310-62bb17ff58d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:43 np0005539504 podman[235502]: 2025-11-29 07:23:43.86298813 +0000 UTC m=+0.057867331 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:23:43 np0005539504 podman[235505]: 2025-11-29 07:23:43.871717044 +0000 UTC m=+0.066220985 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:23:43 np0005539504 podman[235504]: 2025-11-29 07:23:43.899871949 +0000 UTC m=+0.093317531 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:23:43 np0005539504 nova_compute[187152]: 2025-11-29 07:23:43.984 187156 INFO nova.virt.libvirt.driver [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance destroyed successfully.#033[00m
Nov 29 02:23:43 np0005539504 nova_compute[187152]: 2025-11-29 07:23:43.985 187156 DEBUG nova.objects.instance [None req-e9157d49-055b-42f6-9116-032a99e18680 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'numa_topology' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:44 np0005539504 kernel: tap34be3fe8-36: entered promiscuous mode
Nov 29 02:23:44 np0005539504 systemd-udevd[235497]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:23:44 np0005539504 NetworkManager[55210]: <info>  [1764401024.2017] manager: (tap34be3fe8-36): new Tun device (/org/freedesktop/NetworkManager/Devices/205)
Nov 29 02:23:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:44Z|00444|binding|INFO|Claiming lport 34be3fe8-368a-49e5-b6b6-2f650c642037 for this chassis.
Nov 29 02:23:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:44Z|00445|binding|INFO|34be3fe8-368a-49e5-b6b6-2f650c642037: Claiming fa:16:3e:05:c5:bf 10.100.0.2
Nov 29 02:23:44 np0005539504 nova_compute[187152]: 2025-11-29 07:23:44.203 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:44 np0005539504 NetworkManager[55210]: <info>  [1764401024.2162] device (tap34be3fe8-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:23:44 np0005539504 NetworkManager[55210]: <info>  [1764401024.2174] device (tap34be3fe8-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:23:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:44Z|00446|binding|INFO|Setting lport 34be3fe8-368a-49e5-b6b6-2f650c642037 ovn-installed in OVS
Nov 29 02:23:44 np0005539504 nova_compute[187152]: 2025-11-29 07:23:44.232 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:44Z|00447|binding|INFO|Setting lport 34be3fe8-368a-49e5-b6b6-2f650c642037 up in Southbound
Nov 29 02:23:44 np0005539504 nova_compute[187152]: 2025-11-29 07:23:44.238 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:44.238 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:c5:bf 10.100.0.2'], port_security=['fa:16:3e:05:c5:bf 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '22632f96-1108-42eb-a410-f31138f282ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '6', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=34be3fe8-368a-49e5-b6b6-2f650c642037) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:44.241 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 34be3fe8-368a-49e5-b6b6-2f650c642037 in datapath b58443a3-f575-4ff1-951d-e92781861793 bound to our chassis#033[00m
Nov 29 02:23:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:44.242 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:23:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:44.244 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bf0abb08-2cf5-48b8-84fc-bb43688e4506]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:44 np0005539504 systemd-machined[153423]: New machine qemu-60-instance-00000077.
Nov 29 02:23:44 np0005539504 systemd[1]: Started Virtual Machine qemu-60-instance-00000077.
Nov 29 02:23:44 np0005539504 nova_compute[187152]: 2025-11-29 07:23:44.563 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.305 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 22632f96-1108-42eb-a410-f31138f282ea due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.306 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401025.305293, 22632f96-1108-42eb-a410-f31138f282ea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.306 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.310 187156 DEBUG nova.compute.manager [None req-e9157d49-055b-42f6-9116-032a99e18680 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.750 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.754 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.782 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.782 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401025.3090708, 22632f96-1108-42eb-a410-f31138f282ea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.783 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] VM Started (Lifecycle Event)#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.806 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:45 np0005539504 nova_compute[187152]: 2025-11-29 07:23:45.809 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:46 np0005539504 nova_compute[187152]: 2025-11-29 07:23:46.087 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:47 np0005539504 nova_compute[187152]: 2025-11-29 07:23:47.749 187156 DEBUG nova.compute.manager [req-b394ae31-dd64-44d9-af58-cc37caec8296 req-dab540c6-0b1c-45a3-b8b5-88422a069219 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-unplugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:47 np0005539504 nova_compute[187152]: 2025-11-29 07:23:47.750 187156 DEBUG oslo_concurrency.lockutils [req-b394ae31-dd64-44d9-af58-cc37caec8296 req-dab540c6-0b1c-45a3-b8b5-88422a069219 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:47 np0005539504 nova_compute[187152]: 2025-11-29 07:23:47.750 187156 DEBUG oslo_concurrency.lockutils [req-b394ae31-dd64-44d9-af58-cc37caec8296 req-dab540c6-0b1c-45a3-b8b5-88422a069219 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:47 np0005539504 nova_compute[187152]: 2025-11-29 07:23:47.751 187156 DEBUG oslo_concurrency.lockutils [req-b394ae31-dd64-44d9-af58-cc37caec8296 req-dab540c6-0b1c-45a3-b8b5-88422a069219 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:47 np0005539504 nova_compute[187152]: 2025-11-29 07:23:47.751 187156 DEBUG nova.compute.manager [req-b394ae31-dd64-44d9-af58-cc37caec8296 req-dab540c6-0b1c-45a3-b8b5-88422a069219 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-unplugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:47 np0005539504 nova_compute[187152]: 2025-11-29 07:23:47.751 187156 WARNING nova.compute.manager [req-b394ae31-dd64-44d9-af58-cc37caec8296 req-dab540c6-0b1c-45a3-b8b5-88422a069219 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-unplugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:23:48 np0005539504 podman[235613]: 2025-11-29 07:23:48.717203073 +0000 UTC m=+0.060063700 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:23:48 np0005539504 podman[235614]: 2025-11-29 07:23:48.803807414 +0000 UTC m=+0.137347721 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:23:49 np0005539504 nova_compute[187152]: 2025-11-29 07:23:49.599 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:49 np0005539504 nova_compute[187152]: 2025-11-29 07:23:49.749 187156 DEBUG nova.compute.manager [req-2a3543d2-250b-4172-831f-4ce334ff8a4c req-c2702721-f802-4df6-81bf-7c332f026a43 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received event network-changed-33409140-d169-4701-8e17-6eacddd88f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:49 np0005539504 nova_compute[187152]: 2025-11-29 07:23:49.750 187156 DEBUG nova.compute.manager [req-2a3543d2-250b-4172-831f-4ce334ff8a4c req-c2702721-f802-4df6-81bf-7c332f026a43 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Refreshing instance network info cache due to event network-changed-33409140-d169-4701-8e17-6eacddd88f23. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:23:49 np0005539504 nova_compute[187152]: 2025-11-29 07:23:49.750 187156 DEBUG oslo_concurrency.lockutils [req-2a3543d2-250b-4172-831f-4ce334ff8a4c req-c2702721-f802-4df6-81bf-7c332f026a43 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:49 np0005539504 nova_compute[187152]: 2025-11-29 07:23:49.751 187156 DEBUG oslo_concurrency.lockutils [req-2a3543d2-250b-4172-831f-4ce334ff8a4c req-c2702721-f802-4df6-81bf-7c332f026a43 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:49 np0005539504 nova_compute[187152]: 2025-11-29 07:23:49.752 187156 DEBUG nova.network.neutron [req-2a3543d2-250b-4172-831f-4ce334ff8a4c req-c2702721-f802-4df6-81bf-7c332f026a43 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Refreshing network info cache for port 33409140-d169-4701-8e17-6eacddd88f23 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.221 187156 DEBUG oslo_concurrency.lockutils [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "747a4028-af61-495f-9c7d-c5ac869967ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.221 187156 DEBUG oslo_concurrency.lockutils [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.222 187156 DEBUG oslo_concurrency.lockutils [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.222 187156 DEBUG oslo_concurrency.lockutils [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.222 187156 DEBUG oslo_concurrency.lockutils [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.257 187156 INFO nova.compute.manager [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Terminating instance#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.280 187156 DEBUG nova.compute.manager [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:23:50 np0005539504 kernel: tap33409140-d1 (unregistering): left promiscuous mode
Nov 29 02:23:50 np0005539504 NetworkManager[55210]: <info>  [1764401030.3141] device (tap33409140-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:23:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:50Z|00448|binding|INFO|Releasing lport 33409140-d169-4701-8e17-6eacddd88f23 from this chassis (sb_readonly=0)
Nov 29 02:23:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:50Z|00449|binding|INFO|Setting lport 33409140-d169-4701-8e17-6eacddd88f23 down in Southbound
Nov 29 02:23:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:50Z|00450|binding|INFO|Removing iface tap33409140-d1 ovn-installed in OVS
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.323 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.325 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.345 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:50 np0005539504 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000076.scope: Deactivated successfully.
Nov 29 02:23:50 np0005539504 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000076.scope: Consumed 13.886s CPU time.
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.366 187156 DEBUG nova.compute.manager [req-b8711f68-809c-48bc-93e0-280eeaa3a00a req-6baf9c93-ae8a-4912-9554-2b7f688bc1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:50.363 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:4d:5e 10.100.0.11 2001:db8::f816:3eff:fe3d:4d5e'], port_security=['fa:16:3e:3d:4d:5e 10.100.0.11 2001:db8::f816:3eff:fe3d:4d5e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8::f816:3eff:fe3d:4d5e/64', 'neutron:device_id': '747a4028-af61-495f-9c7d-c5ac869967ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '17fd93d9-fafe-4a7d-9c01-ce54fbe8f760', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=944fc855-be48-4f5c-ba58-0898fe543a04, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=33409140-d169-4701-8e17-6eacddd88f23) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:50.364 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 33409140-d169-4701-8e17-6eacddd88f23 in datapath f75dc671-4e0c-40f1-8afd-c16b5e416d95 unbound from our chassis#033[00m
Nov 29 02:23:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:50.366 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f75dc671-4e0c-40f1-8afd-c16b5e416d95, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:23:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:50.366 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fddf2a76-c71c-4241-bc28-b0c12e22dbd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:50.367 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 namespace which is not needed anymore#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.369 187156 DEBUG oslo_concurrency.lockutils [req-b8711f68-809c-48bc-93e0-280eeaa3a00a req-6baf9c93-ae8a-4912-9554-2b7f688bc1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.370 187156 DEBUG oslo_concurrency.lockutils [req-b8711f68-809c-48bc-93e0-280eeaa3a00a req-6baf9c93-ae8a-4912-9554-2b7f688bc1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.370 187156 DEBUG oslo_concurrency.lockutils [req-b8711f68-809c-48bc-93e0-280eeaa3a00a req-6baf9c93-ae8a-4912-9554-2b7f688bc1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.370 187156 DEBUG nova.compute.manager [req-b8711f68-809c-48bc-93e0-280eeaa3a00a req-6baf9c93-ae8a-4912-9554-2b7f688bc1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.371 187156 WARNING nova.compute.manager [req-b8711f68-809c-48bc-93e0-280eeaa3a00a req-6baf9c93-ae8a-4912-9554-2b7f688bc1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:23:50 np0005539504 systemd-machined[153423]: Machine qemu-58-instance-00000076 terminated.
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.507 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.515 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.550 187156 INFO nova.virt.libvirt.driver [-] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Instance destroyed successfully.#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.550 187156 DEBUG nova.objects.instance [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 747a4028-af61-495f-9c7d-c5ac869967ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.631 187156 DEBUG nova.virt.libvirt.vif [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:22:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-552673723',display_name='tempest-TestGettingAddress-server-552673723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-552673723',id=118,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqBtOVeWFxVzcYFJOuDJtYVuL20oDyqcRBPHq57GiuWFxaCS3KceqmhPXeIi9sFvrUoM3x5G9a+RY7U7UfyTQLwWhQmn8+j5tk7QGxgOZ6WpsSYFLeoEl1770NJZUoryw==',key_name='tempest-TestGettingAddress-2009457088',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:23:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-uo1v90l1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:23:23Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=747a4028-af61-495f-9c7d-c5ac869967ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.632 187156 DEBUG nova.network.os_vif_util [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.634 187156 DEBUG nova.network.os_vif_util [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3d:4d:5e,bridge_name='br-int',has_traffic_filtering=True,id=33409140-d169-4701-8e17-6eacddd88f23,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33409140-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.635 187156 DEBUG os_vif [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:4d:5e,bridge_name='br-int',has_traffic_filtering=True,id=33409140-d169-4701-8e17-6eacddd88f23,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33409140-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:23:50 np0005539504 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[235337]: [NOTICE]   (235341) : haproxy version is 2.8.14-c23fe91
Nov 29 02:23:50 np0005539504 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[235337]: [NOTICE]   (235341) : path to executable is /usr/sbin/haproxy
Nov 29 02:23:50 np0005539504 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[235337]: [WARNING]  (235341) : Exiting Master process...
Nov 29 02:23:50 np0005539504 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[235337]: [ALERT]    (235341) : Current worker (235343) exited with code 143 (Terminated)
Nov 29 02:23:50 np0005539504 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[235337]: [WARNING]  (235341) : All workers exited. Exiting... (0)
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.640 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.641 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33409140-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:50 np0005539504 systemd[1]: libpod-08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4.scope: Deactivated successfully.
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.698 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:50 np0005539504 podman[235681]: 2025-11-29 07:23:50.700262454 +0000 UTC m=+0.218854085 container died 08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.701 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.705 187156 INFO os_vif [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:4d:5e,bridge_name='br-int',has_traffic_filtering=True,id=33409140-d169-4701-8e17-6eacddd88f23,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33409140-d1')#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.706 187156 INFO nova.virt.libvirt.driver [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Deleting instance files /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab_del#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.707 187156 INFO nova.virt.libvirt.driver [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Deletion of /var/lib/nova/instances/747a4028-af61-495f-9c7d-c5ac869967ab_del complete#033[00m
Nov 29 02:23:50 np0005539504 systemd[1]: var-lib-containers-storage-overlay-a02b12b5b4809a207a01a68a74b938f84c7ee54c7b9137ca8374495c83e1d62f-merged.mount: Deactivated successfully.
Nov 29 02:23:50 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4-userdata-shm.mount: Deactivated successfully.
Nov 29 02:23:50 np0005539504 podman[235681]: 2025-11-29 07:23:50.917250126 +0000 UTC m=+0.435841767 container cleanup 08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:23:50 np0005539504 systemd[1]: libpod-conmon-08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4.scope: Deactivated successfully.
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.955 187156 DEBUG oslo_concurrency.lockutils [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.956 187156 DEBUG oslo_concurrency.lockutils [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.956 187156 DEBUG oslo_concurrency.lockutils [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.957 187156 DEBUG oslo_concurrency.lockutils [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.957 187156 DEBUG oslo_concurrency.lockutils [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.968 187156 INFO nova.compute.manager [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Terminating instance#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.975 187156 INFO nova.compute.manager [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Took 0.69 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.976 187156 DEBUG oslo.service.loopingcall [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.976 187156 DEBUG nova.compute.manager [-] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:23:50 np0005539504 nova_compute[187152]: 2025-11-29 07:23:50.976 187156 DEBUG nova.network.neutron [-] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.089 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.288 187156 DEBUG nova.compute.manager [req-0b52d505-7451-4c24-a10b-7c9a6ae5c8e3 req-be0e2263-de3a-4334-b66f-522fa328476b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received event network-vif-unplugged-33409140-d169-4701-8e17-6eacddd88f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.289 187156 DEBUG oslo_concurrency.lockutils [req-0b52d505-7451-4c24-a10b-7c9a6ae5c8e3 req-be0e2263-de3a-4334-b66f-522fa328476b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.290 187156 DEBUG oslo_concurrency.lockutils [req-0b52d505-7451-4c24-a10b-7c9a6ae5c8e3 req-be0e2263-de3a-4334-b66f-522fa328476b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.291 187156 DEBUG oslo_concurrency.lockutils [req-0b52d505-7451-4c24-a10b-7c9a6ae5c8e3 req-be0e2263-de3a-4334-b66f-522fa328476b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.291 187156 DEBUG nova.compute.manager [req-0b52d505-7451-4c24-a10b-7c9a6ae5c8e3 req-be0e2263-de3a-4334-b66f-522fa328476b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] No waiting events found dispatching network-vif-unplugged-33409140-d169-4701-8e17-6eacddd88f23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.292 187156 DEBUG nova.compute.manager [req-0b52d505-7451-4c24-a10b-7c9a6ae5c8e3 req-be0e2263-de3a-4334-b66f-522fa328476b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received event network-vif-unplugged-33409140-d169-4701-8e17-6eacddd88f23 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:23:51 np0005539504 podman[235724]: 2025-11-29 07:23:51.322416602 +0000 UTC m=+0.383575058 container remove 08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.321 187156 DEBUG nova.compute.manager [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.332 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e912db-17b0-4ad1-9caf-d7b1dd19990d]: (4, ('Sat Nov 29 07:23:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 (08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4)\n08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4\nSat Nov 29 07:23:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 (08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4)\n08eb34bc443ba9ff3ce7ab8c49974ec40639b0d07e332fd1f988a09b17e8c0b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.335 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[954d5348-beab-4cce-9f1a-2be6abd7cd21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.337 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf75dc671-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.340 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:51 np0005539504 kernel: tapf75dc671-40: left promiscuous mode
Nov 29 02:23:51 np0005539504 kernel: tap34be3fe8-36 (unregistering): left promiscuous mode
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.363 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:51 np0005539504 NetworkManager[55210]: <info>  [1764401031.3835] device (tap34be3fe8-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.382 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[95435e79-4551-45c3-84c4-398ffdca0114]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.395 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:51Z|00451|binding|INFO|Releasing lport 34be3fe8-368a-49e5-b6b6-2f650c642037 from this chassis (sb_readonly=0)
Nov 29 02:23:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:51Z|00452|binding|INFO|Setting lport 34be3fe8-368a-49e5-b6b6-2f650c642037 down in Southbound
Nov 29 02:23:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:23:51Z|00453|binding|INFO|Removing iface tap34be3fe8-36 ovn-installed in OVS
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.398 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.409 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:c5:bf 10.100.0.2'], port_security=['fa:16:3e:05:c5:bf 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '22632f96-1108-42eb-a410-f31138f282ea', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '8', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=34be3fe8-368a-49e5-b6b6-2f650c642037) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.420 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.421 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[09a83728-658f-476d-8b9b-4c315db14d66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.423 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3eef485a-7489-4a80-9c20-97feba774a57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.441 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[965c0fc5-a81f-453f-81fa-34f06ebba658]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646410, 'reachable_time': 24145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235742, 'error': None, 'target': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:51 np0005539504 systemd[1]: run-netns-ovnmeta\x2df75dc671\x2d4e0c\x2d40f1\x2d8afd\x2dc16b5e416d95.mount: Deactivated successfully.
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.447 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.447 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[4b384550-f839-4b21-9fe7-0c7c101736d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.448 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 34be3fe8-368a-49e5-b6b6-2f650c642037 in datapath b58443a3-f575-4ff1-951d-e92781861793 unbound from our chassis#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.449 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:23:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:23:51.450 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd3ed3a-fe1e-4aac-bb44-40f4b2f82571]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:51 np0005539504 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000077.scope: Deactivated successfully.
Nov 29 02:23:51 np0005539504 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000077.scope: Consumed 7.185s CPU time.
Nov 29 02:23:51 np0005539504 systemd-machined[153423]: Machine qemu-60-instance-00000077 terminated.
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.601 187156 INFO nova.virt.libvirt.driver [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Instance destroyed successfully.#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.602 187156 DEBUG nova.objects.instance [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'resources' on Instance uuid 22632f96-1108-42eb-a410-f31138f282ea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.646 187156 DEBUG nova.virt.libvirt.vif [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:22:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-287106654',display_name='tempest-ServerRescueTestJSON-server-287106654',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-287106654',id=119,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:23:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='980ddbfed54546c89c75e94503491a61',ramdisk_id='',reservation_id='r-ajt2ppqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1854570869',owner_user_name='tempest-ServerRescueTestJSON-1854570869-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:23:45Z,user_data=None,user_id='a992c32ce5fb4cbab645023852f14adc',uuid=22632f96-1108-42eb-a410-f31138f282ea,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.647 187156 DEBUG nova.network.os_vif_util [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converting VIF {"id": "34be3fe8-368a-49e5-b6b6-2f650c642037", "address": "fa:16:3e:05:c5:bf", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap34be3fe8-36", "ovs_interfaceid": "34be3fe8-368a-49e5-b6b6-2f650c642037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.648 187156 DEBUG nova.network.os_vif_util [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:c5:bf,bridge_name='br-int',has_traffic_filtering=True,id=34be3fe8-368a-49e5-b6b6-2f650c642037,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap34be3fe8-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.649 187156 DEBUG os_vif [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:c5:bf,bridge_name='br-int',has_traffic_filtering=True,id=34be3fe8-368a-49e5-b6b6-2f650c642037,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap34be3fe8-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.651 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.652 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34be3fe8-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.655 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.663 187156 INFO os_vif [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:c5:bf,bridge_name='br-int',has_traffic_filtering=True,id=34be3fe8-368a-49e5-b6b6-2f650c642037,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap34be3fe8-36')#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.664 187156 INFO nova.virt.libvirt.driver [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Deleting instance files /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea_del#033[00m
Nov 29 02:23:51 np0005539504 nova_compute[187152]: 2025-11-29 07:23:51.665 187156 INFO nova.virt.libvirt.driver [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Deletion of /var/lib/nova/instances/22632f96-1108-42eb-a410-f31138f282ea_del complete#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.678 187156 INFO nova.compute.manager [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Took 1.35 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.679 187156 DEBUG oslo.service.loopingcall [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.679 187156 DEBUG nova.compute.manager [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.680 187156 DEBUG nova.network.neutron [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.884 187156 DEBUG nova.compute.manager [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.885 187156 DEBUG oslo_concurrency.lockutils [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.886 187156 DEBUG oslo_concurrency.lockutils [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.886 187156 DEBUG oslo_concurrency.lockutils [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.887 187156 DEBUG nova.compute.manager [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.887 187156 WARNING nova.compute.manager [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.888 187156 DEBUG nova.compute.manager [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.888 187156 DEBUG oslo_concurrency.lockutils [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.889 187156 DEBUG oslo_concurrency.lockutils [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.889 187156 DEBUG oslo_concurrency.lockutils [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.890 187156 DEBUG nova.compute.manager [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.891 187156 WARNING nova.compute.manager [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.891 187156 DEBUG nova.compute.manager [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-unplugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.892 187156 DEBUG oslo_concurrency.lockutils [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.892 187156 DEBUG oslo_concurrency.lockutils [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.893 187156 DEBUG oslo_concurrency.lockutils [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.894 187156 DEBUG nova.compute.manager [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-unplugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:52 np0005539504 nova_compute[187152]: 2025-11-29 07:23:52.894 187156 DEBUG nova.compute.manager [req-4a913e01-6dff-46d3-8155-82d80c721bd4 req-21a8affd-a1dd-4369-a591-05111f209871 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-unplugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:23:53 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.029 187156 DEBUG nova.network.neutron [req-2a3543d2-250b-4172-831f-4ce334ff8a4c req-c2702721-f802-4df6-81bf-7c332f026a43 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Updated VIF entry in instance network info cache for port 33409140-d169-4701-8e17-6eacddd88f23. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:23:53 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.030 187156 DEBUG nova.network.neutron [req-2a3543d2-250b-4172-831f-4ce334ff8a4c req-c2702721-f802-4df6-81bf-7c332f026a43 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Updating instance_info_cache with network_info: [{"id": "33409140-d169-4701-8e17-6eacddd88f23", "address": "fa:16:3e:3d:4d:5e", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3d:4d5e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33409140-d1", "ovs_interfaceid": "33409140-d169-4701-8e17-6eacddd88f23", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:53 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.105 187156 DEBUG oslo_concurrency.lockutils [req-2a3543d2-250b-4172-831f-4ce334ff8a4c req-c2702721-f802-4df6-81bf-7c332f026a43 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-747a4028-af61-495f-9c7d-c5ac869967ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:53 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.506 187156 DEBUG nova.network.neutron [-] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:53 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.539 187156 INFO nova.compute.manager [-] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Took 2.56 seconds to deallocate network for instance.#033[00m
Nov 29 02:23:53 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.998 187156 DEBUG nova.compute.manager [req-77249a0e-dd8f-4b87-98e2-88a35567a7da req-8e4d87f6-070e-48f6-999b-670efb52d6bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received event network-vif-plugged-33409140-d169-4701-8e17-6eacddd88f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:53 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.998 187156 DEBUG oslo_concurrency.lockutils [req-77249a0e-dd8f-4b87-98e2-88a35567a7da req-8e4d87f6-070e-48f6-999b-670efb52d6bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:53 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.999 187156 DEBUG oslo_concurrency.lockutils [req-77249a0e-dd8f-4b87-98e2-88a35567a7da req-8e4d87f6-070e-48f6-999b-670efb52d6bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.999 187156 DEBUG oslo_concurrency.lockutils [req-77249a0e-dd8f-4b87-98e2-88a35567a7da req-8e4d87f6-070e-48f6-999b-670efb52d6bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.999 187156 DEBUG nova.compute.manager [req-77249a0e-dd8f-4b87-98e2-88a35567a7da req-8e4d87f6-070e-48f6-999b-670efb52d6bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] No waiting events found dispatching network-vif-plugged-33409140-d169-4701-8e17-6eacddd88f23 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:53.999 187156 WARNING nova.compute.manager [req-77249a0e-dd8f-4b87-98e2-88a35567a7da req-8e4d87f6-070e-48f6-999b-670efb52d6bb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received unexpected event network-vif-plugged-33409140-d169-4701-8e17-6eacddd88f23 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:54.001 187156 DEBUG nova.compute.manager [req-2b1fd811-7c79-4788-90e4-3ccc1c65686a req-6642099a-c857-4a7d-b953-1d233d6791d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Received event network-vif-deleted-33409140-d169-4701-8e17-6eacddd88f23 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:54.070 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:54.189 187156 DEBUG oslo_concurrency.lockutils [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:54.189 187156 DEBUG oslo_concurrency.lockutils [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:54.270 187156 DEBUG nova.compute.provider_tree [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:54.444 187156 DEBUG nova.scheduler.client.report [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:54.543 187156 DEBUG oslo_concurrency.lockutils [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:54 np0005539504 nova_compute[187152]: 2025-11-29 07:23:54.549 187156 DEBUG nova.network.neutron [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:55 np0005539504 nova_compute[187152]: 2025-11-29 07:23:55.039 187156 INFO nova.scheduler.client.report [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 747a4028-af61-495f-9c7d-c5ac869967ab#033[00m
Nov 29 02:23:55 np0005539504 nova_compute[187152]: 2025-11-29 07:23:55.042 187156 INFO nova.compute.manager [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Took 2.36 seconds to deallocate network for instance.#033[00m
Nov 29 02:23:55 np0005539504 podman[235760]: 2025-11-29 07:23:55.723933616 +0000 UTC m=+0.068050604 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 29 02:23:55 np0005539504 nova_compute[187152]: 2025-11-29 07:23:55.891 187156 DEBUG oslo_concurrency.lockutils [None req-32d46e10-b886-4a4d-822c-76b4681a542f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "747a4028-af61-495f-9c7d-c5ac869967ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:55 np0005539504 nova_compute[187152]: 2025-11-29 07:23:55.910 187156 DEBUG oslo_concurrency.lockutils [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:55 np0005539504 nova_compute[187152]: 2025-11-29 07:23:55.911 187156 DEBUG oslo_concurrency.lockutils [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:55 np0005539504 nova_compute[187152]: 2025-11-29 07:23:55.969 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:55 np0005539504 nova_compute[187152]: 2025-11-29 07:23:55.981 187156 DEBUG nova.compute.provider_tree [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.012 187156 DEBUG nova.scheduler.client.report [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.051 187156 DEBUG nova.compute.manager [req-c7772b7b-bba4-4c95-b6d4-69f0bfdfd634 req-1aadd953-9a2b-4f58-bc8a-c4614cb3a12a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.052 187156 DEBUG oslo_concurrency.lockutils [req-c7772b7b-bba4-4c95-b6d4-69f0bfdfd634 req-1aadd953-9a2b-4f58-bc8a-c4614cb3a12a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "22632f96-1108-42eb-a410-f31138f282ea-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.052 187156 DEBUG oslo_concurrency.lockutils [req-c7772b7b-bba4-4c95-b6d4-69f0bfdfd634 req-1aadd953-9a2b-4f58-bc8a-c4614cb3a12a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.052 187156 DEBUG oslo_concurrency.lockutils [req-c7772b7b-bba4-4c95-b6d4-69f0bfdfd634 req-1aadd953-9a2b-4f58-bc8a-c4614cb3a12a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.052 187156 DEBUG nova.compute.manager [req-c7772b7b-bba4-4c95-b6d4-69f0bfdfd634 req-1aadd953-9a2b-4f58-bc8a-c4614cb3a12a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] No waiting events found dispatching network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.053 187156 WARNING nova.compute.manager [req-c7772b7b-bba4-4c95-b6d4-69f0bfdfd634 req-1aadd953-9a2b-4f58-bc8a-c4614cb3a12a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received unexpected event network-vif-plugged-34be3fe8-368a-49e5-b6b6-2f650c642037 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.053 187156 DEBUG nova.compute.manager [req-c7772b7b-bba4-4c95-b6d4-69f0bfdfd634 req-1aadd953-9a2b-4f58-bc8a-c4614cb3a12a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Received event network-vif-deleted-34be3fe8-368a-49e5-b6b6-2f650c642037 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.053 187156 DEBUG oslo_concurrency.lockutils [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.082 187156 INFO nova.scheduler.client.report [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Deleted allocations for instance 22632f96-1108-42eb-a410-f31138f282ea#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.090 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.234 187156 DEBUG oslo_concurrency.lockutils [None req-a944b032-c2ea-4041-ab2b-dbac336ae659 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "22632f96-1108-42eb-a410-f31138f282ea" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:56 np0005539504 nova_compute[187152]: 2025-11-29 07:23:56.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:00 np0005539504 nova_compute[187152]: 2025-11-29 07:24:00.409 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:00 np0005539504 nova_compute[187152]: 2025-11-29 07:24:00.410 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:00 np0005539504 nova_compute[187152]: 2025-11-29 07:24:00.754 187156 DEBUG nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:24:00 np0005539504 nova_compute[187152]: 2025-11-29 07:24:00.955 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:00 np0005539504 nova_compute[187152]: 2025-11-29 07:24:00.956 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:00 np0005539504 nova_compute[187152]: 2025-11-29 07:24:00.964 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:24:00 np0005539504 nova_compute[187152]: 2025-11-29 07:24:00.965 187156 INFO nova.compute.claims [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.093 187156 DEBUG nova.compute.provider_tree [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.096 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.114 187156 DEBUG nova.scheduler.client.report [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.136 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.137 187156 DEBUG nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.198 187156 DEBUG nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.199 187156 DEBUG nova.network.neutron [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.218 187156 INFO nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.246 187156 DEBUG nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.369 187156 DEBUG nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.370 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.371 187156 INFO nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Creating image(s)#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.371 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.372 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.372 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.384 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.453 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.454 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.455 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.472 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.504 187156 DEBUG nova.policy [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.568 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.569 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.607 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.608 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.609 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.656 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.688 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.689 187156 DEBUG nova.virt.disk.api [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.689 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.756 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.758 187156 DEBUG nova.virt.disk.api [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.758 187156 DEBUG nova.objects.instance [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.784 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.784 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Ensure instance console log exists: /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.785 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.786 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:01 np0005539504 nova_compute[187152]: 2025-11-29 07:24:01.786 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:03 np0005539504 nova_compute[187152]: 2025-11-29 07:24:03.142 187156 DEBUG nova.network.neutron [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Successfully created port: e89dd8de-f981-46cf-aa04-cfad6a9b2326 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:24:03 np0005539504 podman[235797]: 2025-11-29 07:24:03.756738241 +0000 UTC m=+0.090991629 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:24:04 np0005539504 nova_compute[187152]: 2025-11-29 07:24:04.189 187156 DEBUG nova.network.neutron [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Successfully updated port: e89dd8de-f981-46cf-aa04-cfad6a9b2326 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:24:04 np0005539504 nova_compute[187152]: 2025-11-29 07:24:04.204 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:04 np0005539504 nova_compute[187152]: 2025-11-29 07:24:04.204 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:04 np0005539504 nova_compute[187152]: 2025-11-29 07:24:04.205 187156 DEBUG nova.network.neutron [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:24:04 np0005539504 nova_compute[187152]: 2025-11-29 07:24:04.433 187156 DEBUG nova.network.neutron [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:24:05 np0005539504 nova_compute[187152]: 2025-11-29 07:24:05.228 187156 DEBUG nova.compute.manager [req-a8cdd81d-f9f2-4ecb-9ffd-4acb716ae65a req-b39359a6-713b-4140-81b9-109623cab8f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:05 np0005539504 nova_compute[187152]: 2025-11-29 07:24:05.228 187156 DEBUG nova.compute.manager [req-a8cdd81d-f9f2-4ecb-9ffd-4acb716ae65a req-b39359a6-713b-4140-81b9-109623cab8f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing instance network info cache due to event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:24:05 np0005539504 nova_compute[187152]: 2025-11-29 07:24:05.229 187156 DEBUG oslo_concurrency.lockutils [req-a8cdd81d-f9f2-4ecb-9ffd-4acb716ae65a req-b39359a6-713b-4140-81b9-109623cab8f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:05 np0005539504 nova_compute[187152]: 2025-11-29 07:24:05.549 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401030.5476723, 747a4028-af61-495f-9c7d-c5ac869967ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:05 np0005539504 nova_compute[187152]: 2025-11-29 07:24:05.550 187156 INFO nova.compute.manager [-] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:24:05 np0005539504 nova_compute[187152]: 2025-11-29 07:24:05.576 187156 DEBUG nova.compute.manager [None req-a7a04d71-e591-4a71-81a5-2bde33a4986c - - - - - -] [instance: 747a4028-af61-495f-9c7d-c5ac869967ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:05 np0005539504 nova_compute[187152]: 2025-11-29 07:24:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.094 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.599 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401031.5989747, 22632f96-1108-42eb-a410-f31138f282ea => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.600 187156 INFO nova.compute.manager [-] [instance: 22632f96-1108-42eb-a410-f31138f282ea] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.608 187156 DEBUG nova.network.neutron [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.622 187156 DEBUG nova.compute.manager [None req-1bcf27f9-7123-4616-994e-7f4a8c1412b9 - - - - - -] [instance: 22632f96-1108-42eb-a410-f31138f282ea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.631 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.631 187156 DEBUG nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance network_info: |[{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.631 187156 DEBUG oslo_concurrency.lockutils [req-a8cdd81d-f9f2-4ecb-9ffd-4acb716ae65a req-b39359a6-713b-4140-81b9-109623cab8f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.632 187156 DEBUG nova.network.neutron [req-a8cdd81d-f9f2-4ecb-9ffd-4acb716ae65a req-b39359a6-713b-4140-81b9-109623cab8f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.634 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Start _get_guest_xml network_info=[{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.639 187156 WARNING nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.650 187156 DEBUG nova.virt.libvirt.host [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.651 187156 DEBUG nova.virt.libvirt.host [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.654 187156 DEBUG nova.virt.libvirt.host [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.655 187156 DEBUG nova.virt.libvirt.host [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.656 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.657 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.657 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.657 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.658 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.658 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.658 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.658 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.658 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.658 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.659 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.659 187156 DEBUG nova.virt.hardware [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
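Annotation: the topology walk above (preferred 0:0:0, limits 65536:65536:65536, 1 vCPU, one resulting topology 1:1:1) enumerates every sockets/cores/threads triple whose product equals the vCPU count. The following is a simplified re-implementation for illustration only, not Nova's actual `_get_possible_cpu_topologies` code:

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals the
    vCPU count, within the given limits -- a simplified sketch of the
    enumeration that nova.virt.hardware logs above."""
    topologies = []
    for s in range(1, min(max_sockets, vcpus) + 1):
        for c in range(1, min(max_cores, vcpus) + 1):
            for t in range(1, min(max_threads, vcpus) + 1):
                if s * c * t == vcpus:
                    topologies.append((s, c, t))
    return topologies

# For the 1-vCPU m1.nano guest in this log, only one topology is possible:
print(possible_topologies(1))  # [(1, 1, 1)]
```

This matches the log output: 1 possible topology, `VirtCPUTopology(cores=1,sockets=1,threads=1)`.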
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.664 187156 DEBUG nova.virt.libvirt.vif [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1276368768',display_name='tempest-TestNetworkAdvancedServerOps-server-1276368768',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1276368768',id=121,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBUznZR2iOKaJbWAB1nxy/Np7mGSzlwsDQ7Ycl3wci2nJ60qWbosUg5gundiked4HoZaTmuE/0+OTOCJFQ4CjxMZqyT1FcUBwmvtOPuSl/eONA9sj7Vj+75xN046AU/KWg==',key_name='tempest-TestNetworkAdvancedServerOps-143614444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-it4r0l7q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:01Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=7c10cb24-586c-4507-8169-8258d7136397,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.664 187156 DEBUG nova.network.os_vif_util [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.667 187156 DEBUG nova.network.os_vif_util [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.669 187156 DEBUG nova.objects.instance [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.670 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.695 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <uuid>7c10cb24-586c-4507-8169-8258d7136397</uuid>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <name>instance-00000079</name>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1276368768</nova:name>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:24:06</nova:creationTime>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:        <nova:port uuid="e89dd8de-f981-46cf-aa04-cfad6a9b2326">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <entry name="serial">7c10cb24-586c-4507-8169-8258d7136397</entry>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <entry name="uuid">7c10cb24-586c-4507-8169-8258d7136397</entry>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:b1:6e:42"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <target dev="tape89dd8de-f9"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/console.log" append="off"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:24:06 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:24:06 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:24:06 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:24:06 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
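Annotation: the domain XML logged by `_get_guest_xml` above embeds Nova metadata under the namespace `http://openstack.org/xmlns/libvirt/nova/1.1`, which makes it easy to pull instance details out of a dumped domain definition. A minimal sketch (the embedded snippet is a trimmed copy of the XML above, not live libvirt output):

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the guest XML logged above.
DOMAIN_XML = """<domain type="kvm">
  <uuid>7c10cb24-586c-4507-8169-8258d7136397</uuid>
  <name>instance-00000079</name>
  <metadata>
    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1276368768</nova:name>
      <nova:flavor name="m1.nano">
        <nova:memory>128</nova:memory>
        <nova:vcpus>1</nova:vcpus>
      </nova:flavor>
    </nova:instance>
  </metadata>
</domain>"""

NOVA_NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

def instance_summary(xml_text):
    """Extract the Nova display name and flavor details from domain XML."""
    root = ET.fromstring(xml_text)
    flavor = root.find("metadata/nova:instance/nova:flavor", NOVA_NS)
    return {
        "uuid": root.findtext("uuid"),
        "display_name": root.findtext(
            "metadata/nova:instance/nova:name", namespaces=NOVA_NS),
        "flavor": flavor.get("name"),
        "memory_mb": int(flavor.findtext("nova:memory", namespaces=NOVA_NS)),
        "vcpus": int(flavor.findtext("nova:vcpus", namespaces=NOVA_NS)),
    }

print(instance_summary(DOMAIN_XML))
```

The same approach works on the full XML (e.g. `virsh dumpxml instance-00000079`), since the `nova:` metadata block is identical there.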
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.697 187156 DEBUG nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Preparing to wait for external event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.698 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.698 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.698 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.699 187156 DEBUG nova.virt.libvirt.vif [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1276368768',display_name='tempest-TestNetworkAdvancedServerOps-server-1276368768',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1276368768',id=121,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBUznZR2iOKaJbWAB1nxy/Np7mGSzlwsDQ7Ycl3wci2nJ60qWbosUg5gundiked4HoZaTmuE/0+OTOCJFQ4CjxMZqyT1FcUBwmvtOPuSl/eONA9sj7Vj+75xN046AU/KWg==',key_name='tempest-TestNetworkAdvancedServerOps-143614444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-it4r0l7q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:01Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=7c10cb24-586c-4507-8169-8258d7136397,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.700 187156 DEBUG nova.network.os_vif_util [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.701 187156 DEBUG nova.network.os_vif_util [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.701 187156 DEBUG os_vif [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.702 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.703 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.703 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.707 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.707 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape89dd8de-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.708 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape89dd8de-f9, col_values=(('external_ids', {'iface-id': 'e89dd8de-f981-46cf-aa04-cfad6a9b2326', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:6e:42', 'vm-uuid': '7c10cb24-586c-4507-8169-8258d7136397'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:06 np0005539504 NetworkManager[55210]: <info>  [1764401046.7127] manager: (tape89dd8de-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.714 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.717 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.719 187156 INFO os_vif [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9')#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.775 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.776 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.776 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:b1:6e:42, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.777 187156 INFO nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Using config drive#033[00m
Nov 29 02:24:06 np0005539504 nova_compute[187152]: 2025-11-29 07:24:06.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:07 np0005539504 nova_compute[187152]: 2025-11-29 07:24:07.573 187156 INFO nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Creating config drive at /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config#033[00m
Nov 29 02:24:07 np0005539504 nova_compute[187152]: 2025-11-29 07:24:07.580 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ckt2bt0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:07 np0005539504 nova_compute[187152]: 2025-11-29 07:24:07.711 187156 DEBUG oslo_concurrency.processutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ckt2bt0" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:07 np0005539504 kernel: tape89dd8de-f9: entered promiscuous mode
Nov 29 02:24:07 np0005539504 NetworkManager[55210]: <info>  [1764401047.7934] manager: (tape89dd8de-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/207)
Nov 29 02:24:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:07Z|00454|binding|INFO|Claiming lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 for this chassis.
Nov 29 02:24:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:07Z|00455|binding|INFO|e89dd8de-f981-46cf-aa04-cfad6a9b2326: Claiming fa:16:3e:b1:6e:42 10.100.0.7
Nov 29 02:24:07 np0005539504 nova_compute[187152]: 2025-11-29 07:24:07.799 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.817 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:6e:42 10.100.0.7'], port_security=['fa:16:3e:b1:6e:42 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7c10cb24-586c-4507-8169-8258d7136397', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '51b81e59-c129-44d0-83ab-ea09f800f560', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40c46f88-56a4-469c-8869-7f0629f57469, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=e89dd8de-f981-46cf-aa04-cfad6a9b2326) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.819 104164 INFO neutron.agent.ovn.metadata.agent [-] Port e89dd8de-f981-46cf-aa04-cfad6a9b2326 in datapath be5e5e17-de26-4f07-84cb-bd99be23cd24 bound to our chassis#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.820 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network be5e5e17-de26-4f07-84cb-bd99be23cd24#033[00m
Nov 29 02:24:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:07Z|00456|binding|INFO|Setting lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 ovn-installed in OVS
Nov 29 02:24:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:07Z|00457|binding|INFO|Setting lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 up in Southbound
Nov 29 02:24:07 np0005539504 nova_compute[187152]: 2025-11-29 07:24:07.833 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:07 np0005539504 nova_compute[187152]: 2025-11-29 07:24:07.836 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.837 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d073776f-428f-4412-bed1-350847788342]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.837 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbe5e5e17-d1 in ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.841 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbe5e5e17-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.842 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e79bd98b-bdb5-4faa-8d63-3445d285803a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.844 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[faa2063b-0b0e-43fc-a10b-309d032b67d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:07 np0005539504 systemd-machined[153423]: New machine qemu-61-instance-00000079.
Nov 29 02:24:07 np0005539504 systemd-udevd[235840]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.862 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[8b9da857-ef5b-4be0-91b1-1e13b0689ed8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:07 np0005539504 NetworkManager[55210]: <info>  [1764401047.8695] device (tape89dd8de-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:24:07 np0005539504 systemd[1]: Started Virtual Machine qemu-61-instance-00000079.
Nov 29 02:24:07 np0005539504 NetworkManager[55210]: <info>  [1764401047.8711] device (tape89dd8de-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.876 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3e8311-0576-4b5d-8fee-8a3d789b3fa9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.913 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed55000-f8c8-411e-b79a-92fb1fc5d057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.920 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d3389837-30c3-4139-ac02-e39ebbe38194]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:07 np0005539504 NetworkManager[55210]: <info>  [1764401047.9218] manager: (tapbe5e5e17-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/208)
Nov 29 02:24:07 np0005539504 systemd-udevd[235844]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.960 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6e6051-1d59-40ae-bb82-70ad484cd29a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:07.965 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[d69b49ab-6b73-4876-b616-e79389387c01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:07 np0005539504 NetworkManager[55210]: <info>  [1764401047.9895] device (tapbe5e5e17-d0): carrier: link connected
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.000 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[338e5306-8106-44bb-bae7-f377c4ca140d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.019 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a4142cc9-f50c-4fec-b91d-81ce665e94ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbe5e5e17-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:a9:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651086, 'reachable_time': 35107, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235872, 'error': None, 'target': 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.036 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[65d05acd-2298-4b36-bf7a-7fbe7b86fad8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:a90d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651086, 'tstamp': 651086}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235873, 'error': None, 'target': 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.053 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[27b103aa-eac5-45e6-9ad2-b149f5406fd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbe5e5e17-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:a9:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651086, 'reachable_time': 35107, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235874, 'error': None, 'target': 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.085 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[06ca8509-b5a2-4e48-b944-d4f0427590c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.152 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3558596e-447a-4ef5-bfa7-4fb212e667b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.154 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe5e5e17-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.154 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.155 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe5e5e17-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:08 np0005539504 NetworkManager[55210]: <info>  [1764401048.1575] manager: (tapbe5e5e17-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.156 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:08 np0005539504 kernel: tapbe5e5e17-d0: entered promiscuous mode
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.162 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbe5e5e17-d0, col_values=(('external_ids', {'iface-id': '2da41e48-a12e-440c-815f-4c44c48f8762'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.162 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:08 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:08Z|00458|binding|INFO|Releasing lport 2da41e48-a12e-440c-815f-4c44c48f8762 from this chassis (sb_readonly=0)
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.164 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/be5e5e17-de26-4f07-84cb-bd99be23cd24.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/be5e5e17-de26-4f07-84cb-bd99be23cd24.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.165 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0a25241a-eb51-456a-9b03-6852413ca776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.166 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-be5e5e17-de26-4f07-84cb-bd99be23cd24
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/be5e5e17-de26-4f07-84cb-bd99be23cd24.pid.haproxy
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID be5e5e17-de26-4f07-84cb-bd99be23cd24
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:24:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:08.166 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'env', 'PROCESS_TAG=haproxy-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/be5e5e17-de26-4f07-84cb-bd99be23cd24.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.176 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.221 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401048.2211592, 7c10cb24-586c-4507-8169-8258d7136397 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.222 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] VM Started (Lifecycle Event)#033[00m
Nov 29 02:24:08 np0005539504 podman[235913]: 2025-11-29 07:24:08.603038531 +0000 UTC m=+0.047708809 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.868 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.876 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401048.2213647, 7c10cb24-586c-4507-8169-8258d7136397 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.876 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.903 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.908 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:24:08 np0005539504 podman[235913]: 2025-11-29 07:24:08.91392118 +0000 UTC m=+0.358591458 container create e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 02:24:08 np0005539504 nova_compute[187152]: 2025-11-29 07:24:08.934 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:24:08 np0005539504 systemd[1]: Started libpod-conmon-e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761.scope.
Nov 29 02:24:09 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:24:09 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72f66d2be086efed48be70315943753e1080a6d54502e578bc19ddc8be41be4f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:24:09 np0005539504 podman[235913]: 2025-11-29 07:24:09.038692202 +0000 UTC m=+0.483362510 container init e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:24:09 np0005539504 podman[235913]: 2025-11-29 07:24:09.048260849 +0000 UTC m=+0.492931057 container start e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:24:09 np0005539504 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[235929]: [NOTICE]   (235933) : New worker (235935) forked
Nov 29 02:24:09 np0005539504 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[235929]: [NOTICE]   (235933) : Loading success.
Nov 29 02:24:09 np0005539504 nova_compute[187152]: 2025-11-29 07:24:09.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:09 np0005539504 nova_compute[187152]: 2025-11-29 07:24:09.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:24:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:10Z|00459|binding|INFO|Releasing lport 2da41e48-a12e-440c-815f-4c44c48f8762 from this chassis (sb_readonly=0)
Nov 29 02:24:10 np0005539504 nova_compute[187152]: 2025-11-29 07:24:10.193 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:10Z|00460|binding|INFO|Releasing lport 2da41e48-a12e-440c-815f-4c44c48f8762 from this chassis (sb_readonly=0)
Nov 29 02:24:10 np0005539504 nova_compute[187152]: 2025-11-29 07:24:10.385 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:10 np0005539504 nova_compute[187152]: 2025-11-29 07:24:10.549 187156 DEBUG nova.network.neutron [req-a8cdd81d-f9f2-4ecb-9ffd-4acb716ae65a req-b39359a6-713b-4140-81b9-109623cab8f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updated VIF entry in instance network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:24:10 np0005539504 nova_compute[187152]: 2025-11-29 07:24:10.550 187156 DEBUG nova.network.neutron [req-a8cdd81d-f9f2-4ecb-9ffd-4acb716ae65a req-b39359a6-713b-4140-81b9-109623cab8f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:10 np0005539504 nova_compute[187152]: 2025-11-29 07:24:10.575 187156 DEBUG oslo_concurrency.lockutils [req-a8cdd81d-f9f2-4ecb-9ffd-4acb716ae65a req-b39359a6-713b-4140-81b9-109623cab8f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.077 187156 DEBUG nova.compute.manager [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.077 187156 DEBUG oslo_concurrency.lockutils [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.078 187156 DEBUG oslo_concurrency.lockutils [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.078 187156 DEBUG oslo_concurrency.lockutils [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.079 187156 DEBUG nova.compute.manager [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Processing event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.079 187156 DEBUG nova.compute.manager [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.079 187156 DEBUG oslo_concurrency.lockutils [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.079 187156 DEBUG oslo_concurrency.lockutils [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.080 187156 DEBUG oslo_concurrency.lockutils [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.080 187156 DEBUG nova.compute.manager [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.080 187156 WARNING nova.compute.manager [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.081 187156 DEBUG nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.085 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401051.085496, 7c10cb24-586c-4507-8169-8258d7136397 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.086 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.087 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.090 187156 INFO nova.virt.libvirt.driver [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance spawned successfully.#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.091 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.096 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.107 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.110 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.120 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.120 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.121 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.121 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.122 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.122 187156 DEBUG nova.virt.libvirt.driver [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.133 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.213 187156 INFO nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Took 9.84 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.213 187156 DEBUG nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.317 187156 INFO nova.compute.manager [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Took 10.39 seconds to build instance.#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.340 187156 DEBUG oslo_concurrency.lockutils [None req-334f978c-6bba-4c1f-82f1-4f712cfba34b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.713 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:11 np0005539504 nova_compute[187152]: 2025-11-29 07:24:11.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:12 np0005539504 nova_compute[187152]: 2025-11-29 07:24:12.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:12 np0005539504 nova_compute[187152]: 2025-11-29 07:24:12.958 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:12 np0005539504 nova_compute[187152]: 2025-11-29 07:24:12.958 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:12 np0005539504 nova_compute[187152]: 2025-11-29 07:24:12.958 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:12 np0005539504 nova_compute[187152]: 2025-11-29 07:24:12.959 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.038 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.113 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.114 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.167 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.422 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.424 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5586MB free_disk=73.19153213500977GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.424 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.425 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.578 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 7c10cb24-586c-4507-8169-8258d7136397 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.578 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.579 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.642 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.660 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.694 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:24:13 np0005539504 nova_compute[187152]: 2025-11-29 07:24:13.695 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:14 np0005539504 nova_compute[187152]: 2025-11-29 07:24:14.696 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:14 np0005539504 podman[235953]: 2025-11-29 07:24:14.721821834 +0000 UTC m=+0.055127728 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 29 02:24:14 np0005539504 podman[235954]: 2025-11-29 07:24:14.724204498 +0000 UTC m=+0.051937923 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:24:14 np0005539504 podman[235952]: 2025-11-29 07:24:14.737817972 +0000 UTC m=+0.075239976 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:24:14 np0005539504 nova_compute[187152]: 2025-11-29 07:24:14.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:15 np0005539504 nova_compute[187152]: 2025-11-29 07:24:15.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:15 np0005539504 nova_compute[187152]: 2025-11-29 07:24:15.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:24:15 np0005539504 nova_compute[187152]: 2025-11-29 07:24:15.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:24:16 np0005539504 nova_compute[187152]: 2025-11-29 07:24:16.098 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:16 np0005539504 nova_compute[187152]: 2025-11-29 07:24:16.260 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:16 np0005539504 nova_compute[187152]: 2025-11-29 07:24:16.261 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:16 np0005539504 nova_compute[187152]: 2025-11-29 07:24:16.261 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:24:16 np0005539504 nova_compute[187152]: 2025-11-29 07:24:16.261 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:16 np0005539504 nova_compute[187152]: 2025-11-29 07:24:16.715 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:17 np0005539504 nova_compute[187152]: 2025-11-29 07:24:17.092 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:17 np0005539504 NetworkManager[55210]: <info>  [1764401057.1099] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Nov 29 02:24:17 np0005539504 NetworkManager[55210]: <info>  [1764401057.1107] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Nov 29 02:24:17 np0005539504 nova_compute[187152]: 2025-11-29 07:24:17.193 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:17Z|00461|binding|INFO|Releasing lport 2da41e48-a12e-440c-815f-4c44c48f8762 from this chassis (sb_readonly=0)
Nov 29 02:24:17 np0005539504 nova_compute[187152]: 2025-11-29 07:24:17.213 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:19 np0005539504 nova_compute[187152]: 2025-11-29 07:24:19.270 187156 DEBUG nova.compute.manager [req-7d82f472-23f9-478b-8fa0-2262013eeff8 req-1456c6a6-1ec6-4831-b45a-1d8092a5ddb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:19 np0005539504 nova_compute[187152]: 2025-11-29 07:24:19.271 187156 DEBUG nova.compute.manager [req-7d82f472-23f9-478b-8fa0-2262013eeff8 req-1456c6a6-1ec6-4831-b45a-1d8092a5ddb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing instance network info cache due to event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:24:19 np0005539504 nova_compute[187152]: 2025-11-29 07:24:19.272 187156 DEBUG oslo_concurrency.lockutils [req-7d82f472-23f9-478b-8fa0-2262013eeff8 req-1456c6a6-1ec6-4831-b45a-1d8092a5ddb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:19 np0005539504 nova_compute[187152]: 2025-11-29 07:24:19.449 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:19 np0005539504 podman[236013]: 2025-11-29 07:24:19.725928433 +0000 UTC m=+0.062358992 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:24:19 np0005539504 nova_compute[187152]: 2025-11-29 07:24:19.749 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:19 np0005539504 nova_compute[187152]: 2025-11-29 07:24:19.750 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:24:19 np0005539504 nova_compute[187152]: 2025-11-29 07:24:19.750 187156 DEBUG oslo_concurrency.lockutils [req-7d82f472-23f9-478b-8fa0-2262013eeff8 req-1456c6a6-1ec6-4831-b45a-1d8092a5ddb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:19 np0005539504 nova_compute[187152]: 2025-11-29 07:24:19.751 187156 DEBUG nova.network.neutron [req-7d82f472-23f9-478b-8fa0-2262013eeff8 req-1456c6a6-1ec6-4831-b45a-1d8092a5ddb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:24:19 np0005539504 podman[236014]: 2025-11-29 07:24:19.787451311 +0000 UTC m=+0.116218885 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:24:21 np0005539504 nova_compute[187152]: 2025-11-29 07:24:21.100 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:21 np0005539504 nova_compute[187152]: 2025-11-29 07:24:21.717 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:22.964 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:22.965 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:22.965 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:23 np0005539504 nova_compute[187152]: 2025-11-29 07:24:23.418 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:23 np0005539504 nova_compute[187152]: 2025-11-29 07:24:23.745 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:23 np0005539504 nova_compute[187152]: 2025-11-29 07:24:23.756 187156 DEBUG nova.network.neutron [req-7d82f472-23f9-478b-8fa0-2262013eeff8 req-1456c6a6-1ec6-4831-b45a-1d8092a5ddb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updated VIF entry in instance network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:24:23 np0005539504 nova_compute[187152]: 2025-11-29 07:24:23.757 187156 DEBUG nova.network.neutron [req-7d82f472-23f9-478b-8fa0-2262013eeff8 req-1456c6a6-1ec6-4831-b45a-1d8092a5ddb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:23 np0005539504 nova_compute[187152]: 2025-11-29 07:24:23.786 187156 DEBUG oslo_concurrency.lockutils [req-7d82f472-23f9-478b-8fa0-2262013eeff8 req-1456c6a6-1ec6-4831-b45a-1d8092a5ddb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:23 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:23Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:6e:42 10.100.0.7
Nov 29 02:24:23 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:23Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:6e:42 10.100.0.7
Nov 29 02:24:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:23.943 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:24:23 np0005539504 nova_compute[187152]: 2025-11-29 07:24:23.944 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:23.945 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:24:26 np0005539504 nova_compute[187152]: 2025-11-29 07:24:26.102 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:26 np0005539504 nova_compute[187152]: 2025-11-29 07:24:26.720 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:26 np0005539504 podman[236079]: 2025-11-29 07:24:26.748601293 +0000 UTC m=+0.075751111 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Nov 29 02:24:30 np0005539504 nova_compute[187152]: 2025-11-29 07:24:30.291 187156 INFO nova.compute.manager [None req-1ae9cd7c-5567-4eb9-85de-3c43a61330cf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Get console output#033[00m
Nov 29 02:24:30 np0005539504 nova_compute[187152]: 2025-11-29 07:24:30.297 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:24:30 np0005539504 nova_compute[187152]: 2025-11-29 07:24:30.430 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:30 np0005539504 nova_compute[187152]: 2025-11-29 07:24:30.922 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:30.949 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:31 np0005539504 nova_compute[187152]: 2025-11-29 07:24:31.107 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:31 np0005539504 nova_compute[187152]: 2025-11-29 07:24:31.774 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:33 np0005539504 nova_compute[187152]: 2025-11-29 07:24:33.799 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:34 np0005539504 podman[236099]: 2025-11-29 07:24:34.745433102 +0000 UTC m=+0.079428329 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:24:36 np0005539504 nova_compute[187152]: 2025-11-29 07:24:36.171 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:36 np0005539504 nova_compute[187152]: 2025-11-29 07:24:36.519 187156 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:36 np0005539504 nova_compute[187152]: 2025-11-29 07:24:36.520 187156 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:36 np0005539504 nova_compute[187152]: 2025-11-29 07:24:36.520 187156 DEBUG nova.network.neutron [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:24:36 np0005539504 nova_compute[187152]: 2025-11-29 07:24:36.777 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.060 187156 DEBUG nova.network.neutron [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.088 187156 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.235 187156 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.235 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Creating file /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/7f30a22fee2b4fe49a81d880fe6ef025.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.236 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/7f30a22fee2b4fe49a81d880fe6ef025.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.450 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/7f30a22fee2b4fe49a81d880fe6ef025.tmp" returned: 1 in 0.215s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.452 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/7f30a22fee2b4fe49a81d880fe6ef025.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.452 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Creating directory /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.452 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.660 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397" returned: 0 in 0.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:38 np0005539504 nova_compute[187152]: 2025-11-29 07:24:38.667 187156 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:24:39 np0005539504 nova_compute[187152]: 2025-11-29 07:24:39.932 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:40 np0005539504 kernel: tape89dd8de-f9 (unregistering): left promiscuous mode
Nov 29 02:24:40 np0005539504 NetworkManager[55210]: <info>  [1764401080.9963] device (tape89dd8de-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.028 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:41Z|00462|binding|INFO|Releasing lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 from this chassis (sb_readonly=0)
Nov 29 02:24:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:41Z|00463|binding|INFO|Setting lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 down in Southbound
Nov 29 02:24:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:24:41Z|00464|binding|INFO|Removing iface tape89dd8de-f9 ovn-installed in OVS
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.039 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:6e:42 10.100.0.7'], port_security=['fa:16:3e:b1:6e:42 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7c10cb24-586c-4507-8169-8258d7136397', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '4', 'neutron:security_group_ids': '51b81e59-c129-44d0-83ab-ea09f800f560', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40c46f88-56a4-469c-8869-7f0629f57469, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=e89dd8de-f981-46cf-aa04-cfad6a9b2326) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.040 104164 INFO neutron.agent.ovn.metadata.agent [-] Port e89dd8de-f981-46cf-aa04-cfad6a9b2326 in datapath be5e5e17-de26-4f07-84cb-bd99be23cd24 unbound from our chassis#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.042 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network be5e5e17-de26-4f07-84cb-bd99be23cd24, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.044 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.045 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[908320bd-116f-4887-b6ef-51420f885b89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.046 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 namespace which is not needed anymore#033[00m
Nov 29 02:24:41 np0005539504 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000079.scope: Deactivated successfully.
Nov 29 02:24:41 np0005539504 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000079.scope: Consumed 14.067s CPU time.
Nov 29 02:24:41 np0005539504 systemd-machined[153423]: Machine qemu-61-instance-00000079 terminated.
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.173 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:41 np0005539504 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[235929]: [NOTICE]   (235933) : haproxy version is 2.8.14-c23fe91
Nov 29 02:24:41 np0005539504 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[235929]: [NOTICE]   (235933) : path to executable is /usr/sbin/haproxy
Nov 29 02:24:41 np0005539504 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[235929]: [WARNING]  (235933) : Exiting Master process...
Nov 29 02:24:41 np0005539504 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[235929]: [ALERT]    (235933) : Current worker (235935) exited with code 143 (Terminated)
Nov 29 02:24:41 np0005539504 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[235929]: [WARNING]  (235933) : All workers exited. Exiting... (0)
Nov 29 02:24:41 np0005539504 systemd[1]: libpod-e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761.scope: Deactivated successfully.
Nov 29 02:24:41 np0005539504 podman[236147]: 2025-11-29 07:24:41.193186159 +0000 UTC m=+0.050392541 container died e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:24:41 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761-userdata-shm.mount: Deactivated successfully.
Nov 29 02:24:41 np0005539504 systemd[1]: var-lib-containers-storage-overlay-72f66d2be086efed48be70315943753e1080a6d54502e578bc19ddc8be41be4f-merged.mount: Deactivated successfully.
Nov 29 02:24:41 np0005539504 podman[236147]: 2025-11-29 07:24:41.237753714 +0000 UTC m=+0.094960076 container cleanup e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:24:41 np0005539504 systemd[1]: libpod-conmon-e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761.scope: Deactivated successfully.
Nov 29 02:24:41 np0005539504 podman[236178]: 2025-11-29 07:24:41.472835111 +0000 UTC m=+0.211639290 container remove e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.482 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[131f3526-aa43-4d24-a58f-fcb76e0174d5]: (4, ('Sat Nov 29 07:24:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 (e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761)\ne9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761\nSat Nov 29 07:24:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 (e9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761)\ne9dc979654a14e749bb28209872fa7c9806f897abe663e0937318f0a0140d761\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.485 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cffd16bd-aa0e-4b81-ab2a-f8883c069be9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.487 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe5e5e17-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.490 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:41 np0005539504 kernel: tapbe5e5e17-d0: left promiscuous mode
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.508 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.512 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1824b040-81ff-4ce5-9144-cbfe81abfeb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.533 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9af3e5bd-9637-4282-a494-01bdbf28bff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.534 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[efb35d06-b09d-425b-8a0a-fd22bfdf7c91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.556 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[30cf4522-07df-46e3-b477-c74e2f666e81]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651078, 'reachable_time': 26385, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236213, 'error': None, 'target': 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:41 np0005539504 systemd[1]: run-netns-ovnmeta\x2dbe5e5e17\x2dde26\x2d4f07\x2d84cb\x2dbd99be23cd24.mount: Deactivated successfully.
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.561 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:24:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:24:41.562 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[47ba6683-e60a-4c85-a8b2-d18da4794a19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.691 187156 INFO nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.697 187156 INFO nova.virt.libvirt.driver [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance destroyed successfully.#033[00m
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.698 187156 DEBUG nova.virt.libvirt.vif [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1276368768',display_name='tempest-TestNetworkAdvancedServerOps-server-1276368768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1276368768',id=121,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBUznZR2iOKaJbWAB1nxy/Np7mGSzlwsDQ7Ycl3wci2nJ60qWbosUg5gundiked4HoZaTmuE/0+OTOCJFQ4CjxMZqyT1FcUBwmvtOPuSl/eONA9sj7Vj+75xN046AU/KWg==',key_name='tempest-TestNetworkAdvancedServerOps-143614444',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-it4r0l7q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:24:35Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=7c10cb24-586c-4507-8169-8258d7136397,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--530096136", "vif_mac": "fa:16:3e:b1:6e:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.699 187156 DEBUG nova.network.os_vif_util [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--530096136", "vif_mac": "fa:16:3e:b1:6e:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.699 187156 DEBUG nova.network.os_vif_util [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.700 187156 DEBUG os_vif [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.702 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.702 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape89dd8de-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.704 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.705 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.711 187156 INFO os_vif [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9')
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.716 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.780 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.781 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.882 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.885 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Copying file /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_resize/disk to 192.168.122.100:/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 02:24:41 np0005539504 nova_compute[187152]: 2025-11-29 07:24:41.886 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_resize/disk 192.168.122.100:/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:42 np0005539504 nova_compute[187152]: 2025-11-29 07:24:42.559 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "scp -r /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_resize/disk 192.168.122.100:/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk" returned: 0 in 0.674s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:42 np0005539504 nova_compute[187152]: 2025-11-29 07:24:42.560 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Copying file /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 02:24:42 np0005539504 nova_compute[187152]: 2025-11-29 07:24:42.561 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_resize/disk.config 192.168.122.100:/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.060 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "scp -C -r /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_resize/disk.config 192.168.122.100:/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.061 187156 DEBUG nova.virt.libvirt.volume.remotefs [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Copying file /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.062 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_resize/disk.info 192.168.122.100:/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.314 187156 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "scp -C -r /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_resize/disk.info 192.168.122.100:/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.info" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.568 187156 DEBUG neutronclient.v2_0.client [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port e89dd8de-f981-46cf-aa04-cfad6a9b2326 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.720 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.720 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.725 187156 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.725 187156 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.726 187156 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.740 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.830 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.831 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.840 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.841 187156 INFO nova.compute.claims [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Claim successful on node compute-1.ctlplane.example.com
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.933 187156 DEBUG nova.scheduler.client.report [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.967 187156 DEBUG nova.scheduler.client.report [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 02:24:43 np0005539504 nova_compute[187152]: 2025-11-29 07:24:43.968 187156 DEBUG nova.compute.provider_tree [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.002 187156 DEBUG nova.scheduler.client.report [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.043 187156 DEBUG nova.scheduler.client.report [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.117 187156 DEBUG nova.compute.provider_tree [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.142 187156 DEBUG nova.scheduler.client.report [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.162 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.163 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.218 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.219 187156 DEBUG nova.network.neutron [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.260 187156 INFO nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.289 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.403 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.405 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.405 187156 INFO nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Creating image(s)
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.406 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.406 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.407 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.424 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.498 187156 DEBUG nova.compute.manager [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-unplugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.501 187156 DEBUG oslo_concurrency.lockutils [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.501 187156 DEBUG oslo_concurrency.lockutils [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.501 187156 DEBUG oslo_concurrency.lockutils [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.502 187156 DEBUG nova.compute.manager [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-unplugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.502 187156 WARNING nova.compute.manager [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-unplugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state active and task_state resize_migrated.
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.504 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.504 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.505 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.518 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.574 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.575 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.613 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.614 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.614 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.683 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.684 187156 DEBUG nova.virt.disk.api [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.684 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.733 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.734 187156 DEBUG nova.virt.disk.api [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:24:44 np0005539504 nova_compute[187152]: 2025-11-29 07:24:44.735 187156 DEBUG nova.objects.instance [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid a047dabb-8e55-4bea-92aa-20b191da7b54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:45 np0005539504 nova_compute[187152]: 2025-11-29 07:24:45.607 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:24:45 np0005539504 nova_compute[187152]: 2025-11-29 07:24:45.607 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Ensure instance console log exists: /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:24:45 np0005539504 nova_compute[187152]: 2025-11-29 07:24:45.608 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:45 np0005539504 nova_compute[187152]: 2025-11-29 07:24:45.608 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:45 np0005539504 nova_compute[187152]: 2025-11-29 07:24:45.609 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:45 np0005539504 podman[236243]: 2025-11-29 07:24:45.742450771 +0000 UTC m=+0.070276994 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 02:24:45 np0005539504 podman[236242]: 2025-11-29 07:24:45.742452501 +0000 UTC m=+0.074689892 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:24:45 np0005539504 podman[236241]: 2025-11-29 07:24:45.764509692 +0000 UTC m=+0.091406490 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:24:46 np0005539504 nova_compute[187152]: 2025-11-29 07:24:46.226 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:46 np0005539504 nova_compute[187152]: 2025-11-29 07:24:46.704 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.978 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7c10cb24-586c-4507-8169-8258d7136397', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1276368768', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000079', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'c231e63624d44fc19e0989abfb1afb22', 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'hostId': '37598f96b097077786e51c7aa3b978a8021f173a67f4e81e03a7454d', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.980 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.981 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.982 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.983 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.984 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.984 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.984 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.985 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.986 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.986 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.987 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.988 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.989 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.989 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1276368768>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1276368768>]
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.990 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.991 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.991 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.992 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.993 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.994 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.995 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.995 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.996 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.997 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.997 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.998 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.999 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:47.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.000 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.000 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.000 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1276368768>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1276368768>]
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.001 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.002 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.002 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1276368768>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1276368768>]
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.003 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.003 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.003 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.004 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1276368768>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1276368768>]
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.004 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:24:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:24:48.005 12 DEBUG ceilometer.compute.pollsters [-] Instance 7c10cb24-586c-4507-8169-8258d7136397 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000079, id=7c10cb24-586c-4507-8169-8258d7136397>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:24:49 np0005539504 nova_compute[187152]: 2025-11-29 07:24:49.273 187156 DEBUG nova.policy [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:24:50 np0005539504 nova_compute[187152]: 2025-11-29 07:24:50.314 187156 DEBUG nova.compute.manager [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:50 np0005539504 nova_compute[187152]: 2025-11-29 07:24:50.314 187156 DEBUG nova.compute.manager [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing instance network info cache due to event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:24:50 np0005539504 nova_compute[187152]: 2025-11-29 07:24:50.315 187156 DEBUG oslo_concurrency.lockutils [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:50 np0005539504 nova_compute[187152]: 2025-11-29 07:24:50.315 187156 DEBUG oslo_concurrency.lockutils [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:50 np0005539504 nova_compute[187152]: 2025-11-29 07:24:50.315 187156 DEBUG nova.network.neutron [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:24:50 np0005539504 podman[236304]: 2025-11-29 07:24:50.738824073 +0000 UTC m=+0.071723743 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:24:50 np0005539504 podman[236305]: 2025-11-29 07:24:50.794655269 +0000 UTC m=+0.121144957 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller)
Nov 29 02:24:51 np0005539504 nova_compute[187152]: 2025-11-29 07:24:51.230 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:51 np0005539504 nova_compute[187152]: 2025-11-29 07:24:51.707 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:52 np0005539504 nova_compute[187152]: 2025-11-29 07:24:52.631 187156 DEBUG nova.compute.manager [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:52 np0005539504 nova_compute[187152]: 2025-11-29 07:24:52.632 187156 DEBUG oslo_concurrency.lockutils [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:52 np0005539504 nova_compute[187152]: 2025-11-29 07:24:52.632 187156 DEBUG oslo_concurrency.lockutils [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:52 np0005539504 nova_compute[187152]: 2025-11-29 07:24:52.633 187156 DEBUG oslo_concurrency.lockutils [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:52 np0005539504 nova_compute[187152]: 2025-11-29 07:24:52.633 187156 DEBUG nova.compute.manager [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:52 np0005539504 nova_compute[187152]: 2025-11-29 07:24:52.633 187156 WARNING nova.compute.manager [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:24:53 np0005539504 nova_compute[187152]: 2025-11-29 07:24:53.802 187156 DEBUG nova.network.neutron [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Successfully created port: 117c88c8-f8df-49f6-aa22-1c554973f1ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:24:54 np0005539504 nova_compute[187152]: 2025-11-29 07:24:54.208 187156 DEBUG nova.network.neutron [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updated VIF entry in instance network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:24:54 np0005539504 nova_compute[187152]: 2025-11-29 07:24:54.208 187156 DEBUG nova.network.neutron [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:54 np0005539504 nova_compute[187152]: 2025-11-29 07:24:54.555 187156 DEBUG oslo_concurrency.lockutils [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:55 np0005539504 nova_compute[187152]: 2025-11-29 07:24:55.528 187156 DEBUG nova.network.neutron [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Successfully created port: ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:24:56 np0005539504 nova_compute[187152]: 2025-11-29 07:24:56.232 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:56 np0005539504 nova_compute[187152]: 2025-11-29 07:24:56.298 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401081.297163, 7c10cb24-586c-4507-8169-8258d7136397 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:56 np0005539504 nova_compute[187152]: 2025-11-29 07:24:56.299 187156 INFO nova.compute.manager [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:24:56 np0005539504 nova_compute[187152]: 2025-11-29 07:24:56.547 187156 DEBUG nova.compute.manager [None req-4140a3ae-c739-4e80-962d-8ac4b932a723 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:56 np0005539504 nova_compute[187152]: 2025-11-29 07:24:56.552 187156 DEBUG nova.compute.manager [None req-4140a3ae-c739-4e80-962d-8ac4b932a723 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:24:56 np0005539504 nova_compute[187152]: 2025-11-29 07:24:56.577 187156 INFO nova.compute.manager [None req-4140a3ae-c739-4e80-962d-8ac4b932a723 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Nov 29 02:24:56 np0005539504 nova_compute[187152]: 2025-11-29 07:24:56.709 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:57 np0005539504 podman[236353]: 2025-11-29 07:24:57.744364792 +0000 UTC m=+0.083187859 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 02:24:57 np0005539504 nova_compute[187152]: 2025-11-29 07:24:57.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:00 np0005539504 nova_compute[187152]: 2025-11-29 07:25:00.002 187156 DEBUG nova.compute.manager [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:00 np0005539504 nova_compute[187152]: 2025-11-29 07:25:00.002 187156 DEBUG oslo_concurrency.lockutils [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:00 np0005539504 nova_compute[187152]: 2025-11-29 07:25:00.003 187156 DEBUG oslo_concurrency.lockutils [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:00 np0005539504 nova_compute[187152]: 2025-11-29 07:25:00.003 187156 DEBUG oslo_concurrency.lockutils [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:00 np0005539504 nova_compute[187152]: 2025-11-29 07:25:00.003 187156 DEBUG nova.compute.manager [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:00 np0005539504 nova_compute[187152]: 2025-11-29 07:25:00.003 187156 WARNING nova.compute.manager [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:25:00 np0005539504 nova_compute[187152]: 2025-11-29 07:25:00.540 187156 DEBUG nova.network.neutron [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Successfully updated port: 117c88c8-f8df-49f6-aa22-1c554973f1ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.234 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.355 187156 DEBUG nova.compute.manager [req-865437af-d5c6-40f1-8fb4-a0f6f3e21928 req-9d75f082-653b-49ef-a892-afc07b96f230 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-changed-117c88c8-f8df-49f6-aa22-1c554973f1ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.355 187156 DEBUG nova.compute.manager [req-865437af-d5c6-40f1-8fb4-a0f6f3e21928 req-9d75f082-653b-49ef-a892-afc07b96f230 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Refreshing instance network info cache due to event network-changed-117c88c8-f8df-49f6-aa22-1c554973f1ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.356 187156 DEBUG oslo_concurrency.lockutils [req-865437af-d5c6-40f1-8fb4-a0f6f3e21928 req-9d75f082-653b-49ef-a892-afc07b96f230 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.356 187156 DEBUG oslo_concurrency.lockutils [req-865437af-d5c6-40f1-8fb4-a0f6f3e21928 req-9d75f082-653b-49ef-a892-afc07b96f230 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.356 187156 DEBUG nova.network.neutron [req-865437af-d5c6-40f1-8fb4-a0f6f3e21928 req-9d75f082-653b-49ef-a892-afc07b96f230 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Refreshing network info cache for port 117c88c8-f8df-49f6-aa22-1c554973f1ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.526 187156 DEBUG oslo_concurrency.lockutils [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.527 187156 DEBUG oslo_concurrency.lockutils [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.527 187156 DEBUG nova.compute.manager [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Going to confirm migration 19 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.570 187156 DEBUG nova.objects.instance [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'info_cache' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.653 187156 DEBUG nova.network.neutron [req-865437af-d5c6-40f1-8fb4-a0f6f3e21928 req-9d75f082-653b-49ef-a892-afc07b96f230 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:25:01 np0005539504 nova_compute[187152]: 2025-11-29 07:25:01.712 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.151 187156 DEBUG nova.compute.manager [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.151 187156 DEBUG oslo_concurrency.lockutils [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.151 187156 DEBUG oslo_concurrency.lockutils [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.151 187156 DEBUG oslo_concurrency.lockutils [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.152 187156 DEBUG nova.compute.manager [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.152 187156 WARNING nova.compute.manager [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.213 187156 DEBUG neutronclient.v2_0.client [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port e89dd8de-f981-46cf-aa04-cfad6a9b2326 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.214 187156 DEBUG oslo_concurrency.lockutils [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.215 187156 DEBUG oslo_concurrency.lockutils [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.215 187156 DEBUG nova.network.neutron [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.277 187156 DEBUG nova.network.neutron [req-865437af-d5c6-40f1-8fb4-a0f6f3e21928 req-9d75f082-653b-49ef-a892-afc07b96f230 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.300 187156 DEBUG oslo_concurrency.lockutils [req-865437af-d5c6-40f1-8fb4-a0f6f3e21928 req-9d75f082-653b-49ef-a892-afc07b96f230 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.445 187156 DEBUG nova.network.neutron [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Successfully updated port: ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.473 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.474 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.474 187156 DEBUG nova.network.neutron [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:25:02 np0005539504 nova_compute[187152]: 2025-11-29 07:25:02.773 187156 DEBUG nova.network.neutron [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.492 187156 DEBUG nova.compute.manager [req-a745b33b-d6fc-4842-a82d-4a44b95afa92 req-7932c206-5332-47c8-a5f4-05f60cb551d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-changed-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.492 187156 DEBUG nova.compute.manager [req-a745b33b-d6fc-4842-a82d-4a44b95afa92 req-7932c206-5332-47c8-a5f4-05f60cb551d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Refreshing instance network info cache due to event network-changed-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.493 187156 DEBUG oslo_concurrency.lockutils [req-a745b33b-d6fc-4842-a82d-4a44b95afa92 req-7932c206-5332-47c8-a5f4-05f60cb551d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.727 187156 DEBUG nova.network.neutron [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.751 187156 DEBUG oslo_concurrency.lockutils [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.752 187156 DEBUG nova.objects.instance [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.781 187156 DEBUG nova.virt.libvirt.vif [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1276368768',display_name='tempest-TestNetworkAdvancedServerOps-server-1276368768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1276368768',id=121,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBUznZR2iOKaJbWAB1nxy/Np7mGSzlwsDQ7Ycl3wci2nJ60qWbosUg5gundiked4HoZaTmuE/0+OTOCJFQ4CjxMZqyT1FcUBwmvtOPuSl/eONA9sj7Vj+75xN046AU/KWg==',key_name='tempest-TestNetworkAdvancedServerOps-143614444',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-it4r0l7q',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:24:58Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=7c10cb24-586c-4507-8169-8258d7136397,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.782 187156 DEBUG nova.network.os_vif_util [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.783 187156 DEBUG nova.network.os_vif_util [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.784 187156 DEBUG os_vif [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.787 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.791 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape89dd8de-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.792 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.796 187156 INFO os_vif [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9')#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.797 187156 DEBUG oslo_concurrency.lockutils [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.797 187156 DEBUG oslo_concurrency.lockutils [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.901 187156 DEBUG nova.compute.provider_tree [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.924 187156 DEBUG nova.scheduler.client.report [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:25:03 np0005539504 nova_compute[187152]: 2025-11-29 07:25:03.975 187156 DEBUG oslo_concurrency.lockutils [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.152 187156 INFO nova.scheduler.client.report [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocation for migration 1dc05e65-96f1-44d7-bfe8-4b2c41239656#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.236 187156 DEBUG oslo_concurrency.lockutils [None req-a79a8e47-fd5e-410f-a06e-00728b86ca08 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.855 187156 DEBUG nova.network.neutron [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updating instance_info_cache with network_info: [{"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.875 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.875 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Instance network_info: |[{"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.876 187156 DEBUG oslo_concurrency.lockutils [req-a745b33b-d6fc-4842-a82d-4a44b95afa92 req-7932c206-5332-47c8-a5f4-05f60cb551d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.876 187156 DEBUG nova.network.neutron [req-a745b33b-d6fc-4842-a82d-4a44b95afa92 req-7932c206-5332-47c8-a5f4-05f60cb551d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Refreshing network info cache for port ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.880 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Start _get_guest_xml network_info=[{"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.885 187156 WARNING nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.891 187156 DEBUG nova.virt.libvirt.host [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.892 187156 DEBUG nova.virt.libvirt.host [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.896 187156 DEBUG nova.virt.libvirt.host [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.897 187156 DEBUG nova.virt.libvirt.host [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.899 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.899 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.900 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.900 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.900 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.901 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.901 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.901 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.902 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.902 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.902 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.902 187156 DEBUG nova.virt.hardware [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.909 187156 DEBUG nova.virt.libvirt.vif [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:24:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-107191372',display_name='tempest-TestGettingAddress-server-107191372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-107191372',id=124,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-umysq0th',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:44Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=a047dabb-8e55-4bea-92aa-20b191da7b54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.909 187156 DEBUG nova.network.os_vif_util [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.910 187156 DEBUG nova.network.os_vif_util [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:ae:fa,bridge_name='br-int',has_traffic_filtering=True,id=117c88c8-f8df-49f6-aa22-1c554973f1ad,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap117c88c8-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.911 187156 DEBUG nova.virt.libvirt.vif [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:24:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-107191372',display_name='tempest-TestGettingAddress-server-107191372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-107191372',id=124,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-umysq0th',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:44Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=a047dabb-8e55-4bea-92aa-20b191da7b54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.912 187156 DEBUG nova.network.os_vif_util [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.912 187156 DEBUG nova.network.os_vif_util [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0e:3d,bridge_name='br-int',has_traffic_filtering=True,id=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab95b3bf-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.914 187156 DEBUG nova.objects.instance [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid a047dabb-8e55-4bea-92aa-20b191da7b54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.931 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <uuid>a047dabb-8e55-4bea-92aa-20b191da7b54</uuid>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <name>instance-0000007c</name>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestGettingAddress-server-107191372</nova:name>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:25:04</nova:creationTime>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        <nova:port uuid="117c88c8-f8df-49f6-aa22-1c554973f1ad">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        <nova:port uuid="ab95b3bf-94ed-4d6d-bf40-ce3672f08a71">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe0d:e3d" ipVersion="6"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <entry name="serial">a047dabb-8e55-4bea-92aa-20b191da7b54</entry>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <entry name="uuid">a047dabb-8e55-4bea-92aa-20b191da7b54</entry>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk.config"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:6f:ae:fa"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <target dev="tap117c88c8-f8"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:0d:0e:3d"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <target dev="tapab95b3bf-94"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/console.log" append="off"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:25:04 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:25:04 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:25:04 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:25:04 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.933 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Preparing to wait for external event network-vif-plugged-117c88c8-f8df-49f6-aa22-1c554973f1ad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.934 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.934 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.934 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.934 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Preparing to wait for external event network-vif-plugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.935 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.935 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.935 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.936 187156 DEBUG nova.virt.libvirt.vif [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:24:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-107191372',display_name='tempest-TestGettingAddress-server-107191372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-107191372',id=124,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-umysq0th',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:44Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=a047dabb-8e55-4bea-92aa-20b191da7b54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.937 187156 DEBUG nova.network.os_vif_util [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.937 187156 DEBUG nova.network.os_vif_util [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:ae:fa,bridge_name='br-int',has_traffic_filtering=True,id=117c88c8-f8df-49f6-aa22-1c554973f1ad,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap117c88c8-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.938 187156 DEBUG os_vif [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:ae:fa,bridge_name='br-int',has_traffic_filtering=True,id=117c88c8-f8df-49f6-aa22-1c554973f1ad,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap117c88c8-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.939 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.939 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.940 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.944 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.944 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap117c88c8-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.945 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap117c88c8-f8, col_values=(('external_ids', {'iface-id': '117c88c8-f8df-49f6-aa22-1c554973f1ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:ae:fa', 'vm-uuid': 'a047dabb-8e55-4bea-92aa-20b191da7b54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.947 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:04 np0005539504 NetworkManager[55210]: <info>  [1764401104.9488] manager: (tap117c88c8-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.950 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.956 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.959 187156 INFO os_vif [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:ae:fa,bridge_name='br-int',has_traffic_filtering=True,id=117c88c8-f8df-49f6-aa22-1c554973f1ad,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap117c88c8-f8')#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.960 187156 DEBUG nova.virt.libvirt.vif [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:24:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-107191372',display_name='tempest-TestGettingAddress-server-107191372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-107191372',id=124,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-umysq0th',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:44Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=a047dabb-8e55-4bea-92aa-20b191da7b54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.960 187156 DEBUG nova.network.os_vif_util [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.962 187156 DEBUG nova.network.os_vif_util [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0e:3d,bridge_name='br-int',has_traffic_filtering=True,id=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab95b3bf-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.962 187156 DEBUG os_vif [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0e:3d,bridge_name='br-int',has_traffic_filtering=True,id=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab95b3bf-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.963 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.964 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.964 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.967 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.967 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab95b3bf-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.968 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab95b3bf-94, col_values=(('external_ids', {'iface-id': 'ab95b3bf-94ed-4d6d-bf40-ce3672f08a71', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:0e:3d', 'vm-uuid': 'a047dabb-8e55-4bea-92aa-20b191da7b54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.969 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:04 np0005539504 NetworkManager[55210]: <info>  [1764401104.9704] manager: (tapab95b3bf-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.972 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.978 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:04 np0005539504 nova_compute[187152]: 2025-11-29 07:25:04.981 187156 INFO os_vif [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0e:3d,bridge_name='br-int',has_traffic_filtering=True,id=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab95b3bf-94')#033[00m
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.066 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.067 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.067 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:6f:ae:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.067 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:0d:0e:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.068 187156 INFO nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Using config drive#033[00m
Nov 29 02:25:05 np0005539504 podman[236378]: 2025-11-29 07:25:05.09020593 +0000 UTC m=+0.064747577 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.569 187156 INFO nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Creating config drive at /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk.config#033[00m
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.573 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzs9gwjl2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.700 187156 DEBUG oslo_concurrency.processutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzs9gwjl2" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:05 np0005539504 kernel: tap117c88c8-f8: entered promiscuous mode
Nov 29 02:25:05 np0005539504 NetworkManager[55210]: <info>  [1764401105.7747] manager: (tap117c88c8-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/214)
Nov 29 02:25:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:05Z|00465|binding|INFO|Claiming lport 117c88c8-f8df-49f6-aa22-1c554973f1ad for this chassis.
Nov 29 02:25:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:05Z|00466|binding|INFO|117c88c8-f8df-49f6-aa22-1c554973f1ad: Claiming fa:16:3e:6f:ae:fa 10.100.0.5
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.778 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.785 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:ae:fa 10.100.0.5'], port_security=['fa:16:3e:6f:ae:fa 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a047dabb-8e55-4bea-92aa-20b191da7b54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dcb73a4e-e43c-4221-95d7-295071c2bea0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5c6bb94-c536-451b-a4cb-db984bf0cbdf, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=117c88c8-f8df-49f6-aa22-1c554973f1ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.786 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 117c88c8-f8df-49f6-aa22-1c554973f1ad in datapath ae86c83f-be5a-4cd0-9064-11898ee2fcef bound to our chassis#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.788 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ae86c83f-be5a-4cd0-9064-11898ee2fcef#033[00m
Nov 29 02:25:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:05Z|00467|binding|INFO|Setting lport 117c88c8-f8df-49f6-aa22-1c554973f1ad ovn-installed in OVS
Nov 29 02:25:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:05Z|00468|binding|INFO|Setting lport 117c88c8-f8df-49f6-aa22-1c554973f1ad up in Southbound
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.799 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:05 np0005539504 kernel: tapab95b3bf-94: entered promiscuous mode
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.804 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.803 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9e364844-fbd6-4d7d-9c29-2b2194fdf558]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.807 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapae86c83f-b1 in ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:25:05 np0005539504 NetworkManager[55210]: <info>  [1764401105.8087] manager: (tapab95b3bf-94): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.807 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:05Z|00469|binding|INFO|Claiming lport ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 for this chassis.
Nov 29 02:25:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:05Z|00470|binding|INFO|ab95b3bf-94ed-4d6d-bf40-ce3672f08a71: Claiming fa:16:3e:0d:0e:3d 2001:db8::f816:3eff:fe0d:e3d
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.815 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:0e:3d 2001:db8::f816:3eff:fe0d:e3d'], port_security=['fa:16:3e:0d:0e:3d 2001:db8::f816:3eff:fe0d:e3d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe0d:e3d/64', 'neutron:device_id': 'a047dabb-8e55-4bea-92aa-20b191da7b54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3d94aff-5439-43d3-a356-7aafae582344', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dcb73a4e-e43c-4221-95d7-295071c2bea0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=890f979e-778b-42a4-aff1-be3795cfb05f, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.819 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapae86c83f-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.819 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[80f6e8f9-f33e-4907-9b4b-3c283c584547]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 systemd-udevd[236419]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:05Z|00471|binding|INFO|Setting lport ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 ovn-installed in OVS
Nov 29 02:25:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:05Z|00472|binding|INFO|Setting lport ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 up in Southbound
Nov 29 02:25:05 np0005539504 systemd-udevd[236420]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.821 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[53bb7ee4-bc9a-4749-b48a-14c1bcc7deba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.821 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:05 np0005539504 NetworkManager[55210]: <info>  [1764401105.8325] device (tapab95b3bf-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:25:05 np0005539504 NetworkManager[55210]: <info>  [1764401105.8337] device (tap117c88c8-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:25:05 np0005539504 NetworkManager[55210]: <info>  [1764401105.8347] device (tapab95b3bf-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:25:05 np0005539504 NetworkManager[55210]: <info>  [1764401105.8352] device (tap117c88c8-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.835 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[db82da69-4b82-4e23-b25e-e7f1478a33c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 systemd-machined[153423]: New machine qemu-62-instance-0000007c.
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.850 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ff20b666-f8d3-4ddb-9415-7483423a0422]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 systemd[1]: Started Virtual Machine qemu-62-instance-0000007c.
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.886 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[51a22059-8bc7-417c-9e34-20be66ebe1f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 NetworkManager[55210]: <info>  [1764401105.8949] manager: (tapae86c83f-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/216)
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.894 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[02c6dc4f-e71f-4905-a07e-54c95aaac1ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.931 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[e7905130-7ff7-4240-a70b-8c6e1c65882a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 nova_compute[187152]: 2025-11-29 07:25:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.936 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[92e797bd-4a2c-47e9-9549-0b355a5eba58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 NetworkManager[55210]: <info>  [1764401105.9665] device (tapae86c83f-b0): carrier: link connected
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.971 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[cdfb7ebe-3a13-4252-ab29-5521c63ec41e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:05.989 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d73fddea-f1bc-4619-bc9c-7ed230a55eae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae86c83f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:2f:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656884, 'reachable_time': 18752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236455, 'error': None, 'target': 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.007 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[674dc88e-bf3c-468a-aa2d-4f92f5b1b35b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:2fe5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 656884, 'tstamp': 656884}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236456, 'error': None, 'target': 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.027 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[97a00450-5375-4cd3-9551-ff1b1bf70c5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae86c83f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:2f:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656884, 'reachable_time': 18752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236457, 'error': None, 'target': 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.059 187156 DEBUG nova.compute.manager [req-b6a73392-11e0-4fb9-8cd6-64f2aece0d4e req-3da273bb-8cbf-45dd-a455-4d557b0a280c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-plugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.061 187156 DEBUG oslo_concurrency.lockutils [req-b6a73392-11e0-4fb9-8cd6-64f2aece0d4e req-3da273bb-8cbf-45dd-a455-4d557b0a280c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.061 187156 DEBUG oslo_concurrency.lockutils [req-b6a73392-11e0-4fb9-8cd6-64f2aece0d4e req-3da273bb-8cbf-45dd-a455-4d557b0a280c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.062 187156 DEBUG oslo_concurrency.lockutils [req-b6a73392-11e0-4fb9-8cd6-64f2aece0d4e req-3da273bb-8cbf-45dd-a455-4d557b0a280c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.063 187156 DEBUG nova.compute.manager [req-b6a73392-11e0-4fb9-8cd6-64f2aece0d4e req-3da273bb-8cbf-45dd-a455-4d557b0a280c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Processing event network-vif-plugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.068 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d6629f0c-77eb-47c8-b704-0df8c021c8c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.079 187156 DEBUG nova.compute.manager [req-39377212-18eb-4c1c-bb7b-1574511f4771 req-69005051-16bb-4ab1-817c-899077e5fd34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-plugged-117c88c8-f8df-49f6-aa22-1c554973f1ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.080 187156 DEBUG oslo_concurrency.lockutils [req-39377212-18eb-4c1c-bb7b-1574511f4771 req-69005051-16bb-4ab1-817c-899077e5fd34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.080 187156 DEBUG oslo_concurrency.lockutils [req-39377212-18eb-4c1c-bb7b-1574511f4771 req-69005051-16bb-4ab1-817c-899077e5fd34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.081 187156 DEBUG oslo_concurrency.lockutils [req-39377212-18eb-4c1c-bb7b-1574511f4771 req-69005051-16bb-4ab1-817c-899077e5fd34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.081 187156 DEBUG nova.compute.manager [req-39377212-18eb-4c1c-bb7b-1574511f4771 req-69005051-16bb-4ab1-817c-899077e5fd34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Processing event network-vif-plugged-117c88c8-f8df-49f6-aa22-1c554973f1ad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.147 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3df96e4e-647c-4137-88dc-5097b16d5701]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.149 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae86c83f-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.149 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.150 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae86c83f-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.151 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:06 np0005539504 NetworkManager[55210]: <info>  [1764401106.1527] manager: (tapae86c83f-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Nov 29 02:25:06 np0005539504 kernel: tapae86c83f-b0: entered promiscuous mode
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.154 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.155 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapae86c83f-b0, col_values=(('external_ids', {'iface-id': 'e3e5d9ef-c03b-4d54-8f92-11c237a85862'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.156 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:06 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:06Z|00473|binding|INFO|Releasing lport e3e5d9ef-c03b-4d54-8f92-11c237a85862 from this chassis (sb_readonly=0)
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.158 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.158 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ae86c83f-be5a-4cd0-9064-11898ee2fcef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ae86c83f-be5a-4cd0-9064-11898ee2fcef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.159 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50755bfb-4a9a-40bf-b287-dce38b6a214b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.160 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-ae86c83f-be5a-4cd0-9064-11898ee2fcef
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/ae86c83f-be5a-4cd0-9064-11898ee2fcef.pid.haproxy
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID ae86c83f-be5a-4cd0-9064-11898ee2fcef
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.161 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'env', 'PROCESS_TAG=haproxy-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ae86c83f-be5a-4cd0-9064-11898ee2fcef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.170 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.234 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.466 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401106.4659371, a047dabb-8e55-4bea-92aa-20b191da7b54 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.467 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] VM Started (Lifecycle Event)#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.470 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.474 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.477 187156 INFO nova.virt.libvirt.driver [-] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Instance spawned successfully.#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.477 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.516 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.520 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.535 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.536 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.537 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.538 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:06 np0005539504 podman[236495]: 2025-11-29 07:25:06.539687734 +0000 UTC m=+0.055992751 container create 0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.539 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.541 187156 DEBUG nova.virt.libvirt.driver [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.548 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.549 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401106.467468, a047dabb-8e55-4bea-92aa-20b191da7b54 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.550 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:25:06 np0005539504 systemd[1]: Started libpod-conmon-0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23.scope.
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.586 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.593 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401106.473921, a047dabb-8e55-4bea-92aa-20b191da7b54 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.594 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:25:06 np0005539504 podman[236495]: 2025-11-29 07:25:06.507270206 +0000 UTC m=+0.023575253 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.606 187156 DEBUG nova.network.neutron [req-a745b33b-d6fc-4842-a82d-4a44b95afa92 req-7932c206-5332-47c8-a5f4-05f60cb551d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updated VIF entry in instance network info cache for port ab95b3bf-94ed-4d6d-bf40-ce3672f08a71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.607 187156 DEBUG nova.network.neutron [req-a745b33b-d6fc-4842-a82d-4a44b95afa92 req-7932c206-5332-47c8-a5f4-05f60cb551d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updating instance_info_cache with network_info: [{"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:06 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:25:06 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d27864e328470048392ae6b197bb2117a006b53ebef2ee204cd193f0ca4074a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.626 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.627 187156 DEBUG oslo_concurrency.lockutils [req-a745b33b-d6fc-4842-a82d-4a44b95afa92 req-7932c206-5332-47c8-a5f4-05f60cb551d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.630 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:06 np0005539504 podman[236495]: 2025-11-29 07:25:06.635389238 +0000 UTC m=+0.151694255 container init 0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:25:06 np0005539504 podman[236495]: 2025-11-29 07:25:06.645208761 +0000 UTC m=+0.161513778 container start 0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.655 187156 INFO nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Took 22.25 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.655 187156 DEBUG nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.657 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:25:06 np0005539504 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[236510]: [NOTICE]   (236514) : New worker (236516) forked
Nov 29 02:25:06 np0005539504 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[236510]: [NOTICE]   (236514) : Loading success.
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.722 104164 INFO neutron.agent.ovn.metadata.agent [-] Port ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 in datapath a3d94aff-5439-43d3-a356-7aafae582344 unbound from our chassis#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.725 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3d94aff-5439-43d3-a356-7aafae582344#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.735 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[48991119-48c9-46e5-9883-c7f9a6aad218]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.736 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3d94aff-51 in ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.738 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3d94aff-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.738 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[96fba85b-44a4-49b5-9c3a-8320d09f8046]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.739 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3b12704c-64de-4f3a-972f-e242da0b8b95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.750 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[dc269a43-ee89-4faa-bb20-c92bcac7160a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.759 187156 INFO nova.compute.manager [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Took 22.97 seconds to build instance.#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.774 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c19191b4-000a-41fb-9d2a-4575afb24a26]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 nova_compute[187152]: 2025-11-29 07:25:06.779 187156 DEBUG oslo_concurrency.lockutils [None req-c0f73859-de3c-43ef-8144-f5c83db67e4f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.806 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[8d9de6bf-ef8a-4691-a52d-41342068ba8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.810 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[22505ec3-7fe3-4bae-b166-1a11936fd421]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 NetworkManager[55210]: <info>  [1764401106.8116] manager: (tapa3d94aff-50): new Veth device (/org/freedesktop/NetworkManager/Devices/218)
Nov 29 02:25:06 np0005539504 systemd-udevd[236446]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.843 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4f982ee3-bd33-4ee0-aed0-02e420b39e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.846 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[41585e8e-3985-4c0b-a3cb-5ddcdd3d8636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 NetworkManager[55210]: <info>  [1764401106.8715] device (tapa3d94aff-50): carrier: link connected
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.877 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[967ab2bf-0a35-4471-b555-eb334cb7ff4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.894 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c31842-79af-44a9-bdd5-aa9f0df01dea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3d94aff-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:4f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656975, 'reachable_time': 35273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236535, 'error': None, 'target': 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.919 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e7776b99-63c2-4c88-a9e4-c52c82a4c115]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:4fda'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 656975, 'tstamp': 656975}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236536, 'error': None, 'target': 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.935 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[656c63ef-b2ae-4755-9e1a-4a379f54735b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3d94aff-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:4f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656975, 'reachable_time': 35273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236537, 'error': None, 'target': 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:06.970 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba716b6-db4a-4731-b95b-7d4932130c91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:07.010 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[243c84b7-e7dd-41ac-816c-02982f1290e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:07.012 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3d94aff-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:07.012 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:07.012 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3d94aff-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:07 np0005539504 NetworkManager[55210]: <info>  [1764401107.0153] manager: (tapa3d94aff-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Nov 29 02:25:07 np0005539504 nova_compute[187152]: 2025-11-29 07:25:07.014 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:07 np0005539504 kernel: tapa3d94aff-50: entered promiscuous mode
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:07.020 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3d94aff-50, col_values=(('external_ids', {'iface-id': '06f4ec62-8b16-4a76-9398-b2117639cd20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:07 np0005539504 nova_compute[187152]: 2025-11-29 07:25:07.019 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:07Z|00474|binding|INFO|Releasing lport 06f4ec62-8b16-4a76-9398-b2117639cd20 from this chassis (sb_readonly=0)
Nov 29 02:25:07 np0005539504 nova_compute[187152]: 2025-11-29 07:25:07.022 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:07 np0005539504 nova_compute[187152]: 2025-11-29 07:25:07.039 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:07 np0005539504 nova_compute[187152]: 2025-11-29 07:25:07.041 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:07.041 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3d94aff-5439-43d3-a356-7aafae582344.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3d94aff-5439-43d3-a356-7aafae582344.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:07.042 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[929c159b-a266-4b79-a5e6-ff24a6f3aa65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:07.043 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-a3d94aff-5439-43d3-a356-7aafae582344
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/a3d94aff-5439-43d3-a356-7aafae582344.pid.haproxy
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID a3d94aff-5439-43d3-a356-7aafae582344
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:25:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:07.043 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'env', 'PROCESS_TAG=haproxy-a3d94aff-5439-43d3-a356-7aafae582344', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3d94aff-5439-43d3-a356-7aafae582344.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:25:07 np0005539504 podman[236565]: 2025-11-29 07:25:07.404652588 +0000 UTC m=+0.056362422 container create a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:25:07 np0005539504 systemd[1]: Started libpod-conmon-a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da.scope.
Nov 29 02:25:07 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:25:07 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38cfa709225e42a9452ba85f8dd7070023ccfa6d1e86e17544f3e6cf9f15ed91/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:25:07 np0005539504 podman[236565]: 2025-11-29 07:25:07.473056001 +0000 UTC m=+0.124765845 container init a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 02:25:07 np0005539504 podman[236565]: 2025-11-29 07:25:07.380405678 +0000 UTC m=+0.032115562 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:25:07 np0005539504 podman[236565]: 2025-11-29 07:25:07.480591702 +0000 UTC m=+0.132301536 container start a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 02:25:07 np0005539504 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[236581]: [NOTICE]   (236585) : New worker (236587) forked
Nov 29 02:25:07 np0005539504 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[236581]: [NOTICE]   (236585) : Loading success.
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.165 187156 DEBUG nova.compute.manager [req-94310a06-78c5-4acd-a76b-ceae288d83b1 req-1d61ddf0-a66f-4db5-9de9-d6fa2fc2860e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-plugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.166 187156 DEBUG oslo_concurrency.lockutils [req-94310a06-78c5-4acd-a76b-ceae288d83b1 req-1d61ddf0-a66f-4db5-9de9-d6fa2fc2860e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.167 187156 DEBUG oslo_concurrency.lockutils [req-94310a06-78c5-4acd-a76b-ceae288d83b1 req-1d61ddf0-a66f-4db5-9de9-d6fa2fc2860e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.167 187156 DEBUG oslo_concurrency.lockutils [req-94310a06-78c5-4acd-a76b-ceae288d83b1 req-1d61ddf0-a66f-4db5-9de9-d6fa2fc2860e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.167 187156 DEBUG nova.compute.manager [req-94310a06-78c5-4acd-a76b-ceae288d83b1 req-1d61ddf0-a66f-4db5-9de9-d6fa2fc2860e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] No waiting events found dispatching network-vif-plugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.168 187156 WARNING nova.compute.manager [req-94310a06-78c5-4acd-a76b-ceae288d83b1 req-1d61ddf0-a66f-4db5-9de9-d6fa2fc2860e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received unexpected event network-vif-plugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.254 187156 DEBUG nova.compute.manager [req-5b39fea3-50ad-43ee-a50e-614c679fc510 req-e77b4757-ac25-4f1f-84ea-6f8727431cb9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-plugged-117c88c8-f8df-49f6-aa22-1c554973f1ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.254 187156 DEBUG oslo_concurrency.lockutils [req-5b39fea3-50ad-43ee-a50e-614c679fc510 req-e77b4757-ac25-4f1f-84ea-6f8727431cb9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.254 187156 DEBUG oslo_concurrency.lockutils [req-5b39fea3-50ad-43ee-a50e-614c679fc510 req-e77b4757-ac25-4f1f-84ea-6f8727431cb9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.254 187156 DEBUG oslo_concurrency.lockutils [req-5b39fea3-50ad-43ee-a50e-614c679fc510 req-e77b4757-ac25-4f1f-84ea-6f8727431cb9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.255 187156 DEBUG nova.compute.manager [req-5b39fea3-50ad-43ee-a50e-614c679fc510 req-e77b4757-ac25-4f1f-84ea-6f8727431cb9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] No waiting events found dispatching network-vif-plugged-117c88c8-f8df-49f6-aa22-1c554973f1ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.255 187156 WARNING nova.compute.manager [req-5b39fea3-50ad-43ee-a50e-614c679fc510 req-e77b4757-ac25-4f1f-84ea-6f8727431cb9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received unexpected event network-vif-plugged-117c88c8-f8df-49f6-aa22-1c554973f1ad for instance with vm_state active and task_state None.#033[00m
Nov 29 02:25:08 np0005539504 nova_compute[187152]: 2025-11-29 07:25:08.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:09 np0005539504 nova_compute[187152]: 2025-11-29 07:25:09.973 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:10 np0005539504 nova_compute[187152]: 2025-11-29 07:25:10.608 187156 DEBUG nova.compute.manager [req-83e7c9be-6d37-4133-8b00-814c21061020 req-6b26cb7c-31eb-4bfb-a05e-cb761a66ffea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-changed-117c88c8-f8df-49f6-aa22-1c554973f1ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:10 np0005539504 nova_compute[187152]: 2025-11-29 07:25:10.609 187156 DEBUG nova.compute.manager [req-83e7c9be-6d37-4133-8b00-814c21061020 req-6b26cb7c-31eb-4bfb-a05e-cb761a66ffea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Refreshing instance network info cache due to event network-changed-117c88c8-f8df-49f6-aa22-1c554973f1ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:25:10 np0005539504 nova_compute[187152]: 2025-11-29 07:25:10.609 187156 DEBUG oslo_concurrency.lockutils [req-83e7c9be-6d37-4133-8b00-814c21061020 req-6b26cb7c-31eb-4bfb-a05e-cb761a66ffea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:10 np0005539504 nova_compute[187152]: 2025-11-29 07:25:10.609 187156 DEBUG oslo_concurrency.lockutils [req-83e7c9be-6d37-4133-8b00-814c21061020 req-6b26cb7c-31eb-4bfb-a05e-cb761a66ffea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:10 np0005539504 nova_compute[187152]: 2025-11-29 07:25:10.609 187156 DEBUG nova.network.neutron [req-83e7c9be-6d37-4133-8b00-814c21061020 req-6b26cb7c-31eb-4bfb-a05e-cb761a66ffea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Refreshing network info cache for port 117c88c8-f8df-49f6-aa22-1c554973f1ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:25:10 np0005539504 nova_compute[187152]: 2025-11-29 07:25:10.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:10 np0005539504 nova_compute[187152]: 2025-11-29 07:25:10.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:25:11 np0005539504 nova_compute[187152]: 2025-11-29 07:25:11.237 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:11 np0005539504 nova_compute[187152]: 2025-11-29 07:25:11.900 187156 DEBUG nova.network.neutron [req-83e7c9be-6d37-4133-8b00-814c21061020 req-6b26cb7c-31eb-4bfb-a05e-cb761a66ffea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updated VIF entry in instance network info cache for port 117c88c8-f8df-49f6-aa22-1c554973f1ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:25:11 np0005539504 nova_compute[187152]: 2025-11-29 07:25:11.901 187156 DEBUG nova.network.neutron [req-83e7c9be-6d37-4133-8b00-814c21061020 req-6b26cb7c-31eb-4bfb-a05e-cb761a66ffea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updating instance_info_cache with network_info: [{"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:11 np0005539504 nova_compute[187152]: 2025-11-29 07:25:11.923 187156 DEBUG oslo_concurrency.lockutils [req-83e7c9be-6d37-4133-8b00-814c21061020 req-6b26cb7c-31eb-4bfb-a05e-cb761a66ffea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:12 np0005539504 nova_compute[187152]: 2025-11-29 07:25:12.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.546 187156 DEBUG nova.compute.manager [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.607 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.608 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.635 187156 DEBUG nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.669 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.670 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.701 187156 DEBUG nova.objects.instance [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_requests' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.717 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.718 187156 INFO nova.compute.claims [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.718 187156 DEBUG nova.objects.instance [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'resources' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.738 187156 DEBUG nova.objects.instance [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.746 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.839 187156 INFO nova.compute.resource_tracker [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating resource usage from migration 0a5d78f3-f034-471a-9f47-7c4fe7ef5c65#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.843 187156 DEBUG nova.compute.resource_tracker [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Starting to track incoming migration 0a5d78f3-f034-471a-9f47-7c4fe7ef5c65 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.973 187156 DEBUG nova.compute.provider_tree [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.978 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.989 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:14 np0005539504 nova_compute[187152]: 2025-11-29 07:25:14.998 187156 DEBUG nova.scheduler.client.report [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.024 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.025 187156 INFO nova.compute.manager [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Migrating#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.033 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.040 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.041 187156 INFO nova.compute.claims [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.243 187156 DEBUG nova.compute.provider_tree [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.259 187156 DEBUG nova.scheduler.client.report [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.279 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.280 187156 DEBUG nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.282 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.283 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.283 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.375 187156 DEBUG nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.377 187156 DEBUG nova.network.neutron [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.392 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.420 187156 INFO nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.444 187156 DEBUG nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.455 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.457 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.514 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.554 187156 DEBUG nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.557 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.558 187156 INFO nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Creating image(s)#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.560 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.560 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.561 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.579 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.603 187156 DEBUG nova.policy [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.637 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.638 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.638 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.649 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.707 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.708 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.750 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.751 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.752 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.807 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.809 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5493MB free_disk=73.19078063964844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.809 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.810 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.811 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.811 187156 DEBUG nova.virt.disk.api [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Checking if we can resize image /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.812 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.888 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Migration for instance 3b61bf63-8328-4d31-93e5-0a19ca27cd63 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.890 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.891 187156 DEBUG nova.virt.disk.api [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Cannot resize image /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.892 187156 DEBUG nova.objects.instance [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'migration_context' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.908 187156 INFO nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating resource usage from migration 0a5d78f3-f034-471a-9f47-7c4fe7ef5c65#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.909 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Starting to track incoming migration 0a5d78f3-f034-471a-9f47-7c4fe7ef5c65 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.913 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.914 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Ensure instance console log exists: /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.914 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.915 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.915 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.933 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance a047dabb-8e55-4bea-92aa-20b191da7b54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.934 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 220b7865-2248-43ba-865a-b2314b5a6e47 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.949 187156 WARNING nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 3b61bf63-8328-4d31-93e5-0a19ca27cd63 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.950 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:25:15 np0005539504 nova_compute[187152]: 2025-11-29 07:25:15.951 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:25:16 np0005539504 nova_compute[187152]: 2025-11-29 07:25:16.026 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:25:16 np0005539504 nova_compute[187152]: 2025-11-29 07:25:16.045 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:25:16 np0005539504 nova_compute[187152]: 2025-11-29 07:25:16.064 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:25:16 np0005539504 nova_compute[187152]: 2025-11-29 07:25:16.065 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:16 np0005539504 nova_compute[187152]: 2025-11-29 07:25:16.239 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:16 np0005539504 podman[236621]: 2025-11-29 07:25:16.745468195 +0000 UTC m=+0.063056410 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:25:16 np0005539504 podman[236623]: 2025-11-29 07:25:16.774502653 +0000 UTC m=+0.093463675 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 02:25:16 np0005539504 podman[236622]: 2025-11-29 07:25:16.785769415 +0000 UTC m=+0.103282098 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 29 02:25:17 np0005539504 nova_compute[187152]: 2025-11-29 07:25:17.063 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:17 np0005539504 nova_compute[187152]: 2025-11-29 07:25:17.065 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:25:17 np0005539504 nova_compute[187152]: 2025-11-29 07:25:17.065 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:25:17 np0005539504 nova_compute[187152]: 2025-11-29 07:25:17.085 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:25:17 np0005539504 nova_compute[187152]: 2025-11-29 07:25:17.380 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:17 np0005539504 nova_compute[187152]: 2025-11-29 07:25:17.381 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:17 np0005539504 nova_compute[187152]: 2025-11-29 07:25:17.381 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:25:17 np0005539504 nova_compute[187152]: 2025-11-29 07:25:17.382 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid a047dabb-8e55-4bea-92aa-20b191da7b54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:17 np0005539504 nova_compute[187152]: 2025-11-29 07:25:17.492 187156 DEBUG nova.network.neutron [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Successfully created port: 86264ec7-05bf-4512-ac97-016779ba241a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:25:18 np0005539504 systemd-logind[783]: New session 59 of user nova.
Nov 29 02:25:18 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 02:25:18 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 02:25:18 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 02:25:18 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 02:25:18 np0005539504 systemd[236689]: Queued start job for default target Main User Target.
Nov 29 02:25:18 np0005539504 systemd[236689]: Created slice User Application Slice.
Nov 29 02:25:18 np0005539504 systemd[236689]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:25:18 np0005539504 systemd[236689]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 02:25:18 np0005539504 systemd[236689]: Reached target Paths.
Nov 29 02:25:18 np0005539504 systemd[236689]: Reached target Timers.
Nov 29 02:25:18 np0005539504 systemd[236689]: Starting D-Bus User Message Bus Socket...
Nov 29 02:25:18 np0005539504 systemd[236689]: Starting Create User's Volatile Files and Directories...
Nov 29 02:25:18 np0005539504 systemd[236689]: Finished Create User's Volatile Files and Directories.
Nov 29 02:25:18 np0005539504 systemd[236689]: Listening on D-Bus User Message Bus Socket.
Nov 29 02:25:18 np0005539504 systemd[236689]: Reached target Sockets.
Nov 29 02:25:18 np0005539504 systemd[236689]: Reached target Basic System.
Nov 29 02:25:18 np0005539504 systemd[236689]: Reached target Main User Target.
Nov 29 02:25:18 np0005539504 systemd[236689]: Startup finished in 139ms.
Nov 29 02:25:18 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 02:25:18 np0005539504 systemd[1]: Started Session 59 of User nova.
Nov 29 02:25:18 np0005539504 systemd[1]: session-59.scope: Deactivated successfully.
Nov 29 02:25:18 np0005539504 systemd-logind[783]: Session 59 logged out. Waiting for processes to exit.
Nov 29 02:25:18 np0005539504 systemd-logind[783]: Removed session 59.
Nov 29 02:25:18 np0005539504 systemd-logind[783]: New session 61 of user nova.
Nov 29 02:25:18 np0005539504 systemd[1]: Started Session 61 of User nova.
Nov 29 02:25:18 np0005539504 systemd[1]: session-61.scope: Deactivated successfully.
Nov 29 02:25:18 np0005539504 systemd-logind[783]: Session 61 logged out. Waiting for processes to exit.
Nov 29 02:25:18 np0005539504 systemd-logind[783]: Removed session 61.
Nov 29 02:25:19 np0005539504 nova_compute[187152]: 2025-11-29 07:25:19.306 187156 DEBUG nova.network.neutron [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Successfully updated port: 86264ec7-05bf-4512-ac97-016779ba241a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:25:19 np0005539504 nova_compute[187152]: 2025-11-29 07:25:19.334 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:19 np0005539504 nova_compute[187152]: 2025-11-29 07:25:19.335 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquired lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:19 np0005539504 nova_compute[187152]: 2025-11-29 07:25:19.335 187156 DEBUG nova.network.neutron [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:25:19 np0005539504 nova_compute[187152]: 2025-11-29 07:25:19.423 187156 DEBUG nova.compute.manager [req-3b118995-9492-4d8f-b57e-0e18c69d41a9 req-74801180-882a-44ce-91bc-4c25220119ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-changed-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:19 np0005539504 nova_compute[187152]: 2025-11-29 07:25:19.423 187156 DEBUG nova.compute.manager [req-3b118995-9492-4d8f-b57e-0e18c69d41a9 req-74801180-882a-44ce-91bc-4c25220119ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Refreshing instance network info cache due to event network-changed-86264ec7-05bf-4512-ac97-016779ba241a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:25:19 np0005539504 nova_compute[187152]: 2025-11-29 07:25:19.423 187156 DEBUG oslo_concurrency.lockutils [req-3b118995-9492-4d8f-b57e-0e18c69d41a9 req-74801180-882a-44ce-91bc-4c25220119ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:19 np0005539504 nova_compute[187152]: 2025-11-29 07:25:19.550 187156 DEBUG nova.network.neutron [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:25:19 np0005539504 nova_compute[187152]: 2025-11-29 07:25:19.982 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:20Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6f:ae:fa 10.100.0.5
Nov 29 02:25:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:20Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6f:ae:fa 10.100.0.5
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.955 187156 DEBUG nova.network.neutron [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Updating instance_info_cache with network_info: [{"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.971 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Releasing lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.972 187156 DEBUG nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance network_info: |[{"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.972 187156 DEBUG oslo_concurrency.lockutils [req-3b118995-9492-4d8f-b57e-0e18c69d41a9 req-74801180-882a-44ce-91bc-4c25220119ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.972 187156 DEBUG nova.network.neutron [req-3b118995-9492-4d8f-b57e-0e18c69d41a9 req-74801180-882a-44ce-91bc-4c25220119ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Refreshing network info cache for port 86264ec7-05bf-4512-ac97-016779ba241a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.976 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Start _get_guest_xml network_info=[{"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.981 187156 WARNING nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.985 187156 DEBUG nova.virt.libvirt.host [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.986 187156 DEBUG nova.virt.libvirt.host [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.993 187156 DEBUG nova.virt.libvirt.host [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.994 187156 DEBUG nova.virt.libvirt.host [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.995 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.995 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.996 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.996 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.997 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.997 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.997 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.997 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.998 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.998 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.998 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:25:20 np0005539504 nova_compute[187152]: 2025-11-29 07:25:20.999 187156 DEBUG nova.virt.hardware [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.003 187156 DEBUG nova.virt.libvirt.vif [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1803543286',display_name='tempest-ServerStableDeviceRescueTest-server-1803543286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1803543286',id=126,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-9i8if1ob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name
='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:15Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=220b7865-2248-43ba-865a-b2314b5a6e47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.003 187156 DEBUG nova.network.os_vif_util [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.004 187156 DEBUG nova.network.os_vif_util [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:25:aa,bridge_name='br-int',has_traffic_filtering=True,id=86264ec7-05bf-4512-ac97-016779ba241a,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86264ec7-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.005 187156 DEBUG nova.objects.instance [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.021 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updating instance_info_cache with network_info: [{"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.027 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <uuid>220b7865-2248-43ba-865a-b2314b5a6e47</uuid>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <name>instance-0000007e</name>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1803543286</nova:name>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:25:20</nova:creationTime>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:        <nova:user uuid="5be41a8530314f83bbecbb74b9276f2d">tempest-ServerStableDeviceRescueTest-2012111838-project-member</nova:user>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:        <nova:project uuid="ac3bb322fa744e099b38e08abe12d0e2">tempest-ServerStableDeviceRescueTest-2012111838</nova:project>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:        <nova:port uuid="86264ec7-05bf-4512-ac97-016779ba241a">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <entry name="serial">220b7865-2248-43ba-865a-b2314b5a6e47</entry>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <entry name="uuid">220b7865-2248-43ba-865a-b2314b5a6e47</entry>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:03:25:aa"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <target dev="tap86264ec7-05"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/console.log" append="off"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:25:21 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:25:21 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:25:21 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:25:21 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.028 187156 DEBUG nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Preparing to wait for external event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.028 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.029 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.029 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.030 187156 DEBUG nova.virt.libvirt.vif [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1803543286',display_name='tempest-ServerStableDeviceRescueTest-server-1803543286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1803543286',id=126,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-9i8if1ob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:15Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=220b7865-2248-43ba-865a-b2314b5a6e47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.030 187156 DEBUG nova.network.os_vif_util [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.031 187156 DEBUG nova.network.os_vif_util [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:25:aa,bridge_name='br-int',has_traffic_filtering=True,id=86264ec7-05bf-4512-ac97-016779ba241a,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86264ec7-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.031 187156 DEBUG os_vif [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:25:aa,bridge_name='br-int',has_traffic_filtering=True,id=86264ec7-05bf-4512-ac97-016779ba241a,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86264ec7-05') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.032 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.032 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.033 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.038 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.038 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86264ec7-05, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.039 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap86264ec7-05, col_values=(('external_ids', {'iface-id': '86264ec7-05bf-4512-ac97-016779ba241a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:25:aa', 'vm-uuid': '220b7865-2248-43ba-865a-b2314b5a6e47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.040 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:21 np0005539504 NetworkManager[55210]: <info>  [1764401121.0419] manager: (tap86264ec7-05): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.045 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.050 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.052 187156 INFO os_vif [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:25:aa,bridge_name='br-int',has_traffic_filtering=True,id=86264ec7-05bf-4512-ac97-016779ba241a,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86264ec7-05')#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.242 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.601 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.602 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.604 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.605 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:21 np0005539504 podman[236727]: 2025-11-29 07:25:21.777487631 +0000 UTC m=+0.102050775 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:25:21 np0005539504 podman[236728]: 2025-11-29 07:25:21.808402719 +0000 UTC m=+0.121411694 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.969 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.970 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.970 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No VIF found with MAC fa:16:3e:03:25:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:25:21 np0005539504 nova_compute[187152]: 2025-11-29 07:25:21.971 187156 INFO nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Using config drive#033[00m
Nov 29 02:25:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:22.965 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:22.966 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:22.967 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.057 187156 INFO nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Creating config drive at /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.062 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpat49v0x7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.094 187156 DEBUG nova.network.neutron [req-3b118995-9492-4d8f-b57e-0e18c69d41a9 req-74801180-882a-44ce-91bc-4c25220119ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Updated VIF entry in instance network info cache for port 86264ec7-05bf-4512-ac97-016779ba241a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.095 187156 DEBUG nova.network.neutron [req-3b118995-9492-4d8f-b57e-0e18c69d41a9 req-74801180-882a-44ce-91bc-4c25220119ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Updating instance_info_cache with network_info: [{"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.114 187156 DEBUG oslo_concurrency.lockutils [req-3b118995-9492-4d8f-b57e-0e18c69d41a9 req-74801180-882a-44ce-91bc-4c25220119ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.187 187156 DEBUG oslo_concurrency.processutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpat49v0x7" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:23 np0005539504 kernel: tap86264ec7-05: entered promiscuous mode
Nov 29 02:25:23 np0005539504 NetworkManager[55210]: <info>  [1764401123.2630] manager: (tap86264ec7-05): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.307 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:23 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:23Z|00475|binding|INFO|Claiming lport 86264ec7-05bf-4512-ac97-016779ba241a for this chassis.
Nov 29 02:25:23 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:23Z|00476|binding|INFO|86264ec7-05bf-4512-ac97-016779ba241a: Claiming fa:16:3e:03:25:aa 10.100.0.4
Nov 29 02:25:23 np0005539504 systemd-udevd[236792]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:23 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:23Z|00477|binding|INFO|Setting lport 86264ec7-05bf-4512-ac97-016779ba241a ovn-installed in OVS
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.326 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:23 np0005539504 NetworkManager[55210]: <info>  [1764401123.3377] device (tap86264ec7-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:25:23 np0005539504 NetworkManager[55210]: <info>  [1764401123.3387] device (tap86264ec7-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:25:23 np0005539504 systemd-machined[153423]: New machine qemu-63-instance-0000007e.
Nov 29 02:25:23 np0005539504 systemd[1]: Started Virtual Machine qemu-63-instance-0000007e.
Nov 29 02:25:23 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:23Z|00478|binding|INFO|Setting lport 86264ec7-05bf-4512-ac97-016779ba241a up in Southbound
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.585 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:25:aa 10.100.0.4'], port_security=['fa:16:3e:03:25:aa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=86264ec7-05bf-4512-ac97-016779ba241a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.586 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 86264ec7-05bf-4512-ac97-016779ba241a in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 bound to our chassis#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.589 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.602 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b32a5bc1-1c69-473c-a7b2-ed064aac3a4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.603 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap240f16d8-61 in ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.605 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap240f16d8-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.606 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f98cfa0e-7def-4575-b263-de96f8e848d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.607 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[045d03f1-5a49-444a-8240-6e56663f566a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.618 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[ab00f3bd-d988-40d4-b0a1-89edaa62692b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.636 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3295f75c-8846-439b-b7ea-ec233dc42fa4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.671 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[378c89ec-a2da-4040-8bf2-fea651ac2bc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.677 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[54f37196-8e70-460e-9d72-62621aa402b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 systemd-udevd[236797]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:23 np0005539504 NetworkManager[55210]: <info>  [1764401123.6792] manager: (tap240f16d8-60): new Veth device (/org/freedesktop/NetworkManager/Devices/222)
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.708 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[85e2bdb4-5287-4d03-a6bc-6464f9b2b70d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.711 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[bed1cab1-dd69-4ee4-b62f-17ebf911f685]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 NetworkManager[55210]: <info>  [1764401123.7326] device (tap240f16d8-60): carrier: link connected
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.734 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[3429609b-18fd-4923-8b6b-34895245abe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.756 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4a3a5c-4017-437f-a018-88ed4e7be997]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 658660, 'reachable_time': 26636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236828, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.774 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8da5d209-95aa-4ac7-a5fb-2e68b8be32e9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaa:7e40'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 658660, 'tstamp': 658660}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236829, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.788 187156 DEBUG nova.compute.manager [req-ea98521e-174f-49b5-8cb7-16008e5fe700 req-b30e211e-20eb-4a11-8dd1-affea693d990 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.788 187156 DEBUG oslo_concurrency.lockutils [req-ea98521e-174f-49b5-8cb7-16008e5fe700 req-b30e211e-20eb-4a11-8dd1-affea693d990 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.788 187156 DEBUG oslo_concurrency.lockutils [req-ea98521e-174f-49b5-8cb7-16008e5fe700 req-b30e211e-20eb-4a11-8dd1-affea693d990 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.789 187156 DEBUG oslo_concurrency.lockutils [req-ea98521e-174f-49b5-8cb7-16008e5fe700 req-b30e211e-20eb-4a11-8dd1-affea693d990 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.789 187156 DEBUG nova.compute.manager [req-ea98521e-174f-49b5-8cb7-16008e5fe700 req-b30e211e-20eb-4a11-8dd1-affea693d990 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Processing event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.800 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[03e60f6e-c691-4da3-8cc5-4c43b74f647b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 658660, 'reachable_time': 26636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236830, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.834 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[29ce91f6-cb25-4ea7-a301-4f809c57e63f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.903 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6a459de9-656f-449b-a034-cd9ace5a46dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.904 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.905 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.905 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.907 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:23 np0005539504 kernel: tap240f16d8-60: entered promiscuous mode
Nov 29 02:25:23 np0005539504 NetworkManager[55210]: <info>  [1764401123.9096] manager: (tap240f16d8-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.910 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.911 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.912 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:23 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:23Z|00479|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.924 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.925 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.926 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[291857bc-bd0a-4c7b-8bbd-b8613fbde958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.927 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:25:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:23.928 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'env', 'PROCESS_TAG=haproxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/240f16d8-602b-4aa1-8edb-e3a8d3674e39.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.996 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401123.9956992, 220b7865-2248-43ba-865a-b2314b5a6e47 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:23 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.996 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] VM Started (Lifecycle Event)#033[00m
Nov 29 02:25:24 np0005539504 nova_compute[187152]: 2025-11-29 07:25:23.999 187156 DEBUG nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:25:24 np0005539504 nova_compute[187152]: 2025-11-29 07:25:24.003 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:25:24 np0005539504 nova_compute[187152]: 2025-11-29 07:25:24.006 187156 INFO nova.virt.libvirt.driver [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance spawned successfully.#033[00m
Nov 29 02:25:24 np0005539504 nova_compute[187152]: 2025-11-29 07:25:24.006 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:25:24 np0005539504 podman[236866]: 2025-11-29 07:25:24.35212568 +0000 UTC m=+0.063352758 container create 8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:25:24 np0005539504 systemd[1]: Started libpod-conmon-8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39.scope.
Nov 29 02:25:24 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:25:24 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a907aa0be360b72e7cf4f600814da2712ee6184d4bfd8e90604ff41a69dd664/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:25:24 np0005539504 podman[236866]: 2025-11-29 07:25:24.321802527 +0000 UTC m=+0.033029615 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:25:24 np0005539504 podman[236866]: 2025-11-29 07:25:24.428117645 +0000 UTC m=+0.139344733 container init 8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:25:24 np0005539504 podman[236866]: 2025-11-29 07:25:24.433065388 +0000 UTC m=+0.144292456 container start 8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 02:25:24 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[236881]: [NOTICE]   (236885) : New worker (236887) forked
Nov 29 02:25:24 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[236881]: [NOTICE]   (236885) : Loading success.
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.309 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.317 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.322 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.322 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.323 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.323 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.324 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.324 187156 DEBUG nova.virt.libvirt.driver [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.354 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.354 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401123.9959784, 220b7865-2248-43ba-865a-b2314b5a6e47 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.355 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.378 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.382 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401124.0017936, 220b7865-2248-43ba-865a-b2314b5a6e47 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.382 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.407 187156 INFO nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Took 9.85 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.408 187156 DEBUG nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.410 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.418 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.456 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.509 187156 INFO nova.compute.manager [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Took 10.80 seconds to build instance.#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.542 187156 DEBUG oslo_concurrency.lockutils [None req-960402ed-e082-447f-bb82-4929a563398e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.871 187156 DEBUG nova.compute.manager [req-e6c3f034-eb10-4439-b657-ee082cccc705 req-e89b888c-d3d6-434c-9a58-e575505c2bd2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.873 187156 DEBUG oslo_concurrency.lockutils [req-e6c3f034-eb10-4439-b657-ee082cccc705 req-e89b888c-d3d6-434c-9a58-e575505c2bd2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.873 187156 DEBUG oslo_concurrency.lockutils [req-e6c3f034-eb10-4439-b657-ee082cccc705 req-e89b888c-d3d6-434c-9a58-e575505c2bd2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.873 187156 DEBUG oslo_concurrency.lockutils [req-e6c3f034-eb10-4439-b657-ee082cccc705 req-e89b888c-d3d6-434c-9a58-e575505c2bd2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.873 187156 DEBUG nova.compute.manager [req-e6c3f034-eb10-4439-b657-ee082cccc705 req-e89b888c-d3d6-434c-9a58-e575505c2bd2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:25 np0005539504 nova_compute[187152]: 2025-11-29 07:25:25.873 187156 WARNING nova.compute.manager [req-e6c3f034-eb10-4439-b657-ee082cccc705 req-e89b888c-d3d6-434c-9a58-e575505c2bd2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state active and task_state None.#033[00m
Nov 29 02:25:26 np0005539504 nova_compute[187152]: 2025-11-29 07:25:26.041 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:26 np0005539504 nova_compute[187152]: 2025-11-29 07:25:26.249 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:28 np0005539504 systemd[236689]: Activating special unit Exit the Session...
Nov 29 02:25:28 np0005539504 systemd[236689]: Stopped target Main User Target.
Nov 29 02:25:28 np0005539504 systemd[236689]: Stopped target Basic System.
Nov 29 02:25:28 np0005539504 systemd[236689]: Stopped target Paths.
Nov 29 02:25:28 np0005539504 systemd[236689]: Stopped target Sockets.
Nov 29 02:25:28 np0005539504 systemd[236689]: Stopped target Timers.
Nov 29 02:25:28 np0005539504 systemd[236689]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:25:28 np0005539504 systemd[236689]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 02:25:28 np0005539504 systemd[236689]: Closed D-Bus User Message Bus Socket.
Nov 29 02:25:28 np0005539504 systemd[236689]: Stopped Create User's Volatile Files and Directories.
Nov 29 02:25:28 np0005539504 systemd[236689]: Removed slice User Application Slice.
Nov 29 02:25:28 np0005539504 systemd[236689]: Reached target Shutdown.
Nov 29 02:25:28 np0005539504 systemd[236689]: Finished Exit the Session.
Nov 29 02:25:28 np0005539504 systemd[236689]: Reached target Exit the Session.
Nov 29 02:25:28 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 02:25:28 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 02:25:28 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 02:25:28 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 02:25:28 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 02:25:28 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 02:25:28 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 02:25:28 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 02:25:28 np0005539504 podman[236896]: 2025-11-29 07:25:28.734220824 +0000 UTC m=+0.067147290 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:25:29 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:29Z|00480|binding|INFO|Releasing lport 06f4ec62-8b16-4a76-9398-b2117639cd20 from this chassis (sb_readonly=0)
Nov 29 02:25:29 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:29Z|00481|binding|INFO|Releasing lport e3e5d9ef-c03b-4d54-8f92-11c237a85862 from this chassis (sb_readonly=0)
Nov 29 02:25:29 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:29Z|00482|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 02:25:29 np0005539504 nova_compute[187152]: 2025-11-29 07:25:29.535 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.051 187156 DEBUG nova.compute.manager [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.144 187156 INFO nova.compute.manager [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] instance snapshotting#033[00m
Nov 29 02:25:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:30.386 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:30.388 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.390 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.402 187156 INFO nova.virt.libvirt.driver [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Beginning live snapshot process#033[00m
Nov 29 02:25:30 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.600 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.665 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.667 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.723 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json -f qcow2" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.738 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.797 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.799 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpf86na4em/dfb2c9d4d0c842af9048b6467ede7f1e.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.839 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpf86na4em/dfb2c9d4d0c842af9048b6467ede7f1e.delta 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.841 187156 INFO nova.virt.libvirt.driver [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.930 187156 DEBUG nova.virt.libvirt.guest [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.935 187156 INFO nova.virt.libvirt.driver [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.984 187156 DEBUG nova.privsep.utils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:25:30 np0005539504 nova_compute[187152]: 2025-11-29 07:25:30.984 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpf86na4em/dfb2c9d4d0c842af9048b6467ede7f1e.delta /var/lib/nova/instances/snapshots/tmpf86na4em/dfb2c9d4d0c842af9048b6467ede7f1e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:31 np0005539504 nova_compute[187152]: 2025-11-29 07:25:31.045 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:31 np0005539504 nova_compute[187152]: 2025-11-29 07:25:31.250 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:31 np0005539504 nova_compute[187152]: 2025-11-29 07:25:31.280 187156 DEBUG oslo_concurrency.processutils [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpf86na4em/dfb2c9d4d0c842af9048b6467ede7f1e.delta /var/lib/nova/instances/snapshots/tmpf86na4em/dfb2c9d4d0c842af9048b6467ede7f1e" returned: 0 in 0.296s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:31 np0005539504 nova_compute[187152]: 2025-11-29 07:25:31.282 187156 INFO nova.virt.libvirt.driver [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:25:32 np0005539504 nova_compute[187152]: 2025-11-29 07:25:32.165 187156 DEBUG nova.compute.manager [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-unplugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:32 np0005539504 nova_compute[187152]: 2025-11-29 07:25:32.166 187156 DEBUG oslo_concurrency.lockutils [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:32 np0005539504 nova_compute[187152]: 2025-11-29 07:25:32.166 187156 DEBUG oslo_concurrency.lockutils [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:32 np0005539504 nova_compute[187152]: 2025-11-29 07:25:32.167 187156 DEBUG oslo_concurrency.lockutils [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:32 np0005539504 nova_compute[187152]: 2025-11-29 07:25:32.167 187156 DEBUG nova.compute.manager [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-unplugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:32 np0005539504 nova_compute[187152]: 2025-11-29 07:25:32.168 187156 WARNING nova.compute.manager [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-unplugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:25:32 np0005539504 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 02:25:32 np0005539504 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 02:25:32 np0005539504 systemd-logind[783]: New session 62 of user nova.
Nov 29 02:25:32 np0005539504 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 02:25:32 np0005539504 systemd[1]: Starting User Manager for UID 42436...
Nov 29 02:25:33 np0005539504 systemd[236948]: Queued start job for default target Main User Target.
Nov 29 02:25:33 np0005539504 systemd[236948]: Created slice User Application Slice.
Nov 29 02:25:33 np0005539504 systemd[236948]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:25:33 np0005539504 systemd[236948]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 02:25:33 np0005539504 systemd[236948]: Reached target Paths.
Nov 29 02:25:33 np0005539504 systemd[236948]: Reached target Timers.
Nov 29 02:25:33 np0005539504 systemd[236948]: Starting D-Bus User Message Bus Socket...
Nov 29 02:25:33 np0005539504 systemd[236948]: Starting Create User's Volatile Files and Directories...
Nov 29 02:25:33 np0005539504 systemd[236948]: Listening on D-Bus User Message Bus Socket.
Nov 29 02:25:33 np0005539504 systemd[236948]: Reached target Sockets.
Nov 29 02:25:33 np0005539504 systemd[236948]: Finished Create User's Volatile Files and Directories.
Nov 29 02:25:33 np0005539504 systemd[236948]: Reached target Basic System.
Nov 29 02:25:33 np0005539504 systemd[236948]: Reached target Main User Target.
Nov 29 02:25:33 np0005539504 systemd[236948]: Startup finished in 155ms.
Nov 29 02:25:33 np0005539504 systemd[1]: Started User Manager for UID 42436.
Nov 29 02:25:33 np0005539504 systemd[1]: Started Session 62 of User nova.
Nov 29 02:25:33 np0005539504 nova_compute[187152]: 2025-11-29 07:25:33.494 187156 INFO nova.virt.libvirt.driver [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Snapshot image upload complete#033[00m
Nov 29 02:25:33 np0005539504 nova_compute[187152]: 2025-11-29 07:25:33.496 187156 INFO nova.compute.manager [None req-5303bee0-1230-4a0b-81c8-1905ce053189 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Took 3.33 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 02:25:33 np0005539504 systemd[1]: session-62.scope: Deactivated successfully.
Nov 29 02:25:33 np0005539504 systemd-logind[783]: Session 62 logged out. Waiting for processes to exit.
Nov 29 02:25:33 np0005539504 systemd-logind[783]: Removed session 62.
Nov 29 02:25:33 np0005539504 systemd-logind[783]: New session 64 of user nova.
Nov 29 02:25:33 np0005539504 systemd[1]: Started Session 64 of User nova.
Nov 29 02:25:33 np0005539504 systemd[1]: session-64.scope: Deactivated successfully.
Nov 29 02:25:33 np0005539504 systemd-logind[783]: Session 64 logged out. Waiting for processes to exit.
Nov 29 02:25:33 np0005539504 systemd-logind[783]: Removed session 64.
Nov 29 02:25:33 np0005539504 systemd-logind[783]: New session 65 of user nova.
Nov 29 02:25:33 np0005539504 systemd[1]: Started Session 65 of User nova.
Nov 29 02:25:34 np0005539504 systemd[1]: session-65.scope: Deactivated successfully.
Nov 29 02:25:34 np0005539504 systemd-logind[783]: Session 65 logged out. Waiting for processes to exit.
Nov 29 02:25:34 np0005539504 systemd-logind[783]: Removed session 65.
Nov 29 02:25:34 np0005539504 nova_compute[187152]: 2025-11-29 07:25:34.247 187156 DEBUG nova.compute.manager [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:34 np0005539504 nova_compute[187152]: 2025-11-29 07:25:34.250 187156 DEBUG oslo_concurrency.lockutils [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:34 np0005539504 nova_compute[187152]: 2025-11-29 07:25:34.250 187156 DEBUG oslo_concurrency.lockutils [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:34 np0005539504 nova_compute[187152]: 2025-11-29 07:25:34.251 187156 DEBUG oslo_concurrency.lockutils [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:34 np0005539504 nova_compute[187152]: 2025-11-29 07:25:34.251 187156 DEBUG nova.compute.manager [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:34 np0005539504 nova_compute[187152]: 2025-11-29 07:25:34.252 187156 WARNING nova.compute.manager [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:25:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:34.391 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:34 np0005539504 nova_compute[187152]: 2025-11-29 07:25:34.746 187156 INFO nova.network.neutron [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating port bae72aab-bece-4ddf-8a55-f5925e45ca90 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:25:35 np0005539504 podman[236976]: 2025-11-29 07:25:35.748318653 +0000 UTC m=+0.085776959 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 29 02:25:36 np0005539504 nova_compute[187152]: 2025-11-29 07:25:36.048 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:36 np0005539504 nova_compute[187152]: 2025-11-29 07:25:36.251 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.337 187156 INFO nova.compute.manager [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Rescuing#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.338 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.339 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquired lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.339 187156 DEBUG nova.network.neutron [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.639 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.640 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.640 187156 DEBUG nova.network.neutron [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.837 187156 DEBUG nova.compute.manager [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-changed-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.837 187156 DEBUG nova.compute.manager [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Refreshing instance network info cache due to event network-changed-bae72aab-bece-4ddf-8a55-f5925e45ca90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:25:38 np0005539504 nova_compute[187152]: 2025-11-29 07:25:38.838 187156 DEBUG oslo_concurrency.lockutils [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:40 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:40Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:25:aa 10.100.0.4
Nov 29 02:25:40 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:40Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:25:aa 10.100.0.4
Nov 29 02:25:40 np0005539504 nova_compute[187152]: 2025-11-29 07:25:40.903 187156 DEBUG nova.network.neutron [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Updating instance_info_cache with network_info: [{"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:40 np0005539504 nova_compute[187152]: 2025-11-29 07:25:40.924 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Releasing lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.052 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.090 187156 DEBUG nova.network.neutron [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating instance_info_cache with network_info: [{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.110 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.118 187156 DEBUG oslo_concurrency.lockutils [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.118 187156 DEBUG nova.network.neutron [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Refreshing network info cache for port bae72aab-bece-4ddf-8a55-f5925e45ca90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.126 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.239 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.242 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.242 187156 INFO nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Creating image(s)#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.244 187156 DEBUG nova.objects.instance [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.247 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.253 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.256 187156 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.339 187156 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.340 187156 DEBUG nova.virt.disk.api [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Checking if we can resize image /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.340 187156 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.406 187156 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.407 187156 DEBUG nova.virt.disk.api [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Cannot resize image /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.421 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.421 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Ensure instance console log exists: /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.422 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.422 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.422 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.425 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Start _get_guest_xml network_info=[{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:04:b9:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.430 187156 WARNING nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.436 187156 DEBUG nova.virt.libvirt.host [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.437 187156 DEBUG nova.virt.libvirt.host [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.443 187156 DEBUG nova.virt.libvirt.host [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.443 187156 DEBUG nova.virt.libvirt.host [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.444 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.445 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.445 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.446 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.446 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.446 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.446 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.447 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.447 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.447 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.447 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.448 187156 DEBUG nova.virt.hardware [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.448 187156 DEBUG nova.objects.instance [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.464 187156 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.523 187156 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.524 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.524 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.525 187156 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.527 187156 DEBUG nova.virt.libvirt.vif [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-284566059',display_name='tempest-ServerDiskConfigTestJSON-server-284566059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-284566059',id=125,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-9kh0vk9d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:34Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=3b61bf63-8328-4d31-93e5-0a19ca27cd63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:04:b9:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.527 187156 DEBUG nova.network.os_vif_util [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:04:b9:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.528 187156 DEBUG nova.network.os_vif_util [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.530 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <uuid>3b61bf63-8328-4d31-93e5-0a19ca27cd63</uuid>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <name>instance-0000007d</name>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <memory>196608</memory>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-284566059</nova:name>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:25:41</nova:creationTime>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.micro">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:        <nova:memory>192</nova:memory>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:        <nova:user uuid="000fb7b950024e16902cd58f2ea16ac9">tempest-ServerDiskConfigTestJSON-1282760174-project-member</nova:user>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:        <nova:project uuid="6d55e57bfd184513a304a61cc1cb3730">tempest-ServerDiskConfigTestJSON-1282760174</nova:project>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:        <nova:port uuid="bae72aab-bece-4ddf-8a55-f5925e45ca90">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <entry name="serial">3b61bf63-8328-4d31-93e5-0a19ca27cd63</entry>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <entry name="uuid">3b61bf63-8328-4d31-93e5-0a19ca27cd63</entry>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:04:b9:c5"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <target dev="tapbae72aab-be"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/console.log" append="off"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:25:41 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:25:41 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:25:41 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:25:41 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.533 187156 DEBUG nova.virt.libvirt.vif [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-284566059',display_name='tempest-ServerDiskConfigTestJSON-server-284566059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-284566059',id=125,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-9kh0vk9d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:34Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=3b61bf63-8328-4d31-93e5-0a19ca27cd63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:04:b9:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.534 187156 DEBUG nova.network.os_vif_util [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:04:b9:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.535 187156 DEBUG nova.network.os_vif_util [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.536 187156 DEBUG os_vif [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.538 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.539 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.540 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.545 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.545 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbae72aab-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.546 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbae72aab-be, col_values=(('external_ids', {'iface-id': 'bae72aab-bece-4ddf-8a55-f5925e45ca90', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:b9:c5', 'vm-uuid': '3b61bf63-8328-4d31-93e5-0a19ca27cd63'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.550 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 NetworkManager[55210]: <info>  [1764401141.5506] manager: (tapbae72aab-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.559 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.559 187156 INFO os_vif [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be')#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.609 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.610 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.610 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No VIF found with MAC fa:16:3e:04:b9:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.611 187156 INFO nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Using config drive#033[00m
Nov 29 02:25:41 np0005539504 kernel: tapbae72aab-be: entered promiscuous mode
Nov 29 02:25:41 np0005539504 NetworkManager[55210]: <info>  [1764401141.6693] manager: (tapbae72aab-be): new Tun device (/org/freedesktop/NetworkManager/Devices/225)
Nov 29 02:25:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:41Z|00483|binding|INFO|Claiming lport bae72aab-bece-4ddf-8a55-f5925e45ca90 for this chassis.
Nov 29 02:25:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:41Z|00484|binding|INFO|bae72aab-bece-4ddf-8a55-f5925e45ca90: Claiming fa:16:3e:04:b9:c5 10.100.0.9
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.672 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.680 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:b9:c5 10.100.0.9'], port_security=['fa:16:3e:04:b9:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '3b61bf63-8328-4d31-93e5-0a19ca27cd63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '6', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=bae72aab-bece-4ddf-8a55-f5925e45ca90) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.682 104164 INFO neutron.agent.ovn.metadata.agent [-] Port bae72aab-bece-4ddf-8a55-f5925e45ca90 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 bound to our chassis#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.685 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958#033[00m
Nov 29 02:25:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:41Z|00485|binding|INFO|Setting lport bae72aab-bece-4ddf-8a55-f5925e45ca90 ovn-installed in OVS
Nov 29 02:25:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:41Z|00486|binding|INFO|Setting lport bae72aab-bece-4ddf-8a55-f5925e45ca90 up in Southbound
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.691 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.695 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.699 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a920832b-1b21-4b52-9d5f-e53e725133d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.700 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b34af6b-e1 in ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.703 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b34af6b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.703 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ef4bd8-82c0-491b-a4d8-30933ee18b17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.705 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[99e5fa13-36aa-4aa7-85c3-ed31f4dbfe80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 systemd-udevd[237041]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.719 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[cfef6b8a-0db4-4f11-a15b-52114e88a50d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 systemd-machined[153423]: New machine qemu-64-instance-0000007d.
Nov 29 02:25:41 np0005539504 NetworkManager[55210]: <info>  [1764401141.7302] device (tapbae72aab-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:25:41 np0005539504 NetworkManager[55210]: <info>  [1764401141.7315] device (tapbae72aab-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.735 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0cfa9d1a-702f-4899-8039-f5af467d79e3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 systemd[1]: Started Virtual Machine qemu-64-instance-0000007d.
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.765 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b07319-0bee-407a-8d14-edea415077d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 NetworkManager[55210]: <info>  [1764401141.7721] manager: (tap9b34af6b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/226)
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.771 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[179518cd-4643-4ac7-8d78-a5f6ef8193a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.804 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[772919cd-23bf-4f92-9b29-1375cd9410e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.806 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[13c60061-d89d-4012-a832-442087aa59be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 NetworkManager[55210]: <info>  [1764401141.8297] device (tap9b34af6b-e0): carrier: link connected
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.835 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[63e2f638-5a64-4be4-867e-bc68e49664f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.851 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ac411f4a-f89f-4ade-a7c4-23236afea2db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660470, 'reachable_time': 44936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237074, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.866 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ec1b11-7b11-4e35-aa79-e449e2b87976]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:40d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 660470, 'tstamp': 660470}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237075, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.882 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[374e74d6-5fe9-41f8-9f96-81bbb164f157]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660470, 'reachable_time': 44936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237076, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.912 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca2ef3b-d312-441a-bd3c-73734a773752]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.947 187156 DEBUG nova.compute.manager [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.947 187156 DEBUG oslo_concurrency.lockutils [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.948 187156 DEBUG oslo_concurrency.lockutils [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.948 187156 DEBUG oslo_concurrency.lockutils [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.948 187156 DEBUG nova.compute.manager [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.948 187156 WARNING nova.compute.manager [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state active and task_state resize_finish.#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.974 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[113915dc-67d9-4f87-b85b-1b8e072681e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.975 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.976 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.977 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b34af6b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.978 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 kernel: tap9b34af6b-e0: entered promiscuous mode
Nov 29 02:25:41 np0005539504 NetworkManager[55210]: <info>  [1764401141.9794] manager: (tap9b34af6b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.983 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b34af6b-e0, col_values=(('external_ids', {'iface-id': '88f3bff1-58a0-4231-87c4-807c4c2657d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:41Z|00487|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.985 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.985 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.987 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.988 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[babf63a4-9cae-459f-bea3-bf3cf03c29b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.989 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:25:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:41.990 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'env', 'PROCESS_TAG=haproxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:25:41 np0005539504 nova_compute[187152]: 2025-11-29 07:25:41.995 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.142 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401142.141877, 3b61bf63-8328-4d31-93e5-0a19ca27cd63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.143 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.145 187156 DEBUG nova.compute.manager [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.148 187156 INFO nova.virt.libvirt.driver [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance running successfully.#033[00m
Nov 29 02:25:42 np0005539504 virtqemud[186569]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.150 187156 DEBUG nova.virt.libvirt.guest [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.150 187156 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.166 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.169 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.187 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.188 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401142.1428823, 3b61bf63-8328-4d31-93e5-0a19ca27cd63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.188 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] VM Started (Lifecycle Event)#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.214 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.217 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:42 np0005539504 nova_compute[187152]: 2025-11-29 07:25:42.242 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 02:25:42 np0005539504 podman[237115]: 2025-11-29 07:25:42.391693 +0000 UTC m=+0.057329297 container create c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 02:25:42 np0005539504 systemd[1]: Started libpod-conmon-c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69.scope.
Nov 29 02:25:42 np0005539504 podman[237115]: 2025-11-29 07:25:42.358060869 +0000 UTC m=+0.023697196 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:25:42 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:25:42 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a05b8a2b9e4864bcf62a39a1797ac7704fe21df0f7d9d0459a0a0b50e98ca7f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:25:42 np0005539504 podman[237115]: 2025-11-29 07:25:42.485126834 +0000 UTC m=+0.150763141 container init c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:25:42 np0005539504 podman[237115]: 2025-11-29 07:25:42.492788459 +0000 UTC m=+0.158424746 container start c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:25:42 np0005539504 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237130]: [NOTICE]   (237134) : New worker (237136) forked
Nov 29 02:25:42 np0005539504 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237130]: [NOTICE]   (237134) : Loading success.
Nov 29 02:25:43 np0005539504 kernel: tap86264ec7-05 (unregistering): left promiscuous mode
Nov 29 02:25:43 np0005539504 NetworkManager[55210]: <info>  [1764401143.4485] device (tap86264ec7-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:25:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:43Z|00488|binding|INFO|Releasing lport 86264ec7-05bf-4512-ac97-016779ba241a from this chassis (sb_readonly=0)
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.457 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:43Z|00489|binding|INFO|Setting lport 86264ec7-05bf-4512-ac97-016779ba241a down in Southbound
Nov 29 02:25:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:43Z|00490|binding|INFO|Removing iface tap86264ec7-05 ovn-installed in OVS
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.463 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.469 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:25:aa 10.100.0.4'], port_security=['fa:16:3e:03:25:aa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=86264ec7-05bf-4512-ac97-016779ba241a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.471 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 86264ec7-05bf-4512-ac97-016779ba241a in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.473 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.474 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e31937db-9760-4cb9-b7d4-376d6364f414]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.475 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace which is not needed anymore#033[00m
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.486 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.487 187156 DEBUG nova.network.neutron [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updated VIF entry in instance network info cache for port bae72aab-bece-4ddf-8a55-f5925e45ca90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.488 187156 DEBUG nova.network.neutron [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating instance_info_cache with network_info: [{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.504 187156 DEBUG oslo_concurrency.lockutils [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:43 np0005539504 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Nov 29 02:25:43 np0005539504 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000007e.scope: Consumed 14.191s CPU time.
Nov 29 02:25:43 np0005539504 systemd-machined[153423]: Machine qemu-63-instance-0000007e terminated.
Nov 29 02:25:43 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[236881]: [NOTICE]   (236885) : haproxy version is 2.8.14-c23fe91
Nov 29 02:25:43 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[236881]: [NOTICE]   (236885) : path to executable is /usr/sbin/haproxy
Nov 29 02:25:43 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[236881]: [WARNING]  (236885) : Exiting Master process...
Nov 29 02:25:43 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[236881]: [ALERT]    (236885) : Current worker (236887) exited with code 143 (Terminated)
Nov 29 02:25:43 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[236881]: [WARNING]  (236885) : All workers exited. Exiting... (0)
Nov 29 02:25:43 np0005539504 systemd[1]: libpod-8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39.scope: Deactivated successfully.
Nov 29 02:25:43 np0005539504 podman[237168]: 2025-11-29 07:25:43.607593696 +0000 UTC m=+0.042926300 container died 8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:25:43 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39-userdata-shm.mount: Deactivated successfully.
Nov 29 02:25:43 np0005539504 systemd[1]: var-lib-containers-storage-overlay-2a907aa0be360b72e7cf4f600814da2712ee6184d4bfd8e90604ff41a69dd664-merged.mount: Deactivated successfully.
Nov 29 02:25:43 np0005539504 podman[237168]: 2025-11-29 07:25:43.649574341 +0000 UTC m=+0.084906965 container cleanup 8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:25:43 np0005539504 systemd[1]: libpod-conmon-8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39.scope: Deactivated successfully.
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.689 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.697 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:43 np0005539504 podman[237199]: 2025-11-29 07:25:43.727082818 +0000 UTC m=+0.057167573 container remove 8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.732 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ac3ad676-8fa6-443f-8698-95801ce9795b]: (4, ('Sat Nov 29 07:25:43 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39)\n8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39\nSat Nov 29 07:25:43 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39)\n8dd7876650d13e75112cf6f1e1c03a95449a4d9ddde091bc5876bd4674042a39\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.733 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5cf587-2854-40c0-805d-250ab57e1ad2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.734 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.736 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:43 np0005539504 kernel: tap240f16d8-60: left promiscuous mode
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.749 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:43 np0005539504 nova_compute[187152]: 2025-11-29 07:25:43.755 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.758 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d6e6bab8-70f3-4217-aa2f-c6139bdffa41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.777 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[abdfa593-1541-466f-b62e-a65fa165a2b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.780 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[99ea9cc3-77ee-4213-bbee-0326405c42b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.796 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5df98ff0-ee8c-4d53-9c6a-da1c2cb2b787]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 658654, 'reachable_time': 28214, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237232, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:43 np0005539504 systemd[1]: run-netns-ovnmeta\x2d240f16d8\x2d602b\x2d4aa1\x2d8edb\x2de3a8d3674e39.mount: Deactivated successfully.
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.799 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:25:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:43.799 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[44b7eea7-2166-4225-b141-74fc1edcd860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.067 187156 DEBUG nova.compute.manager [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.067 187156 DEBUG oslo_concurrency.lockutils [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.067 187156 DEBUG oslo_concurrency.lockutils [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.068 187156 DEBUG oslo_concurrency.lockutils [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.068 187156 DEBUG nova.compute.manager [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.068 187156 WARNING nova.compute.manager [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:25:44 np0005539504 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 02:25:44 np0005539504 systemd[236948]: Activating special unit Exit the Session...
Nov 29 02:25:44 np0005539504 systemd[236948]: Stopped target Main User Target.
Nov 29 02:25:44 np0005539504 systemd[236948]: Stopped target Basic System.
Nov 29 02:25:44 np0005539504 systemd[236948]: Stopped target Paths.
Nov 29 02:25:44 np0005539504 systemd[236948]: Stopped target Sockets.
Nov 29 02:25:44 np0005539504 systemd[236948]: Stopped target Timers.
Nov 29 02:25:44 np0005539504 systemd[236948]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:25:44 np0005539504 systemd[236948]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 02:25:44 np0005539504 systemd[236948]: Closed D-Bus User Message Bus Socket.
Nov 29 02:25:44 np0005539504 systemd[236948]: Stopped Create User's Volatile Files and Directories.
Nov 29 02:25:44 np0005539504 systemd[236948]: Removed slice User Application Slice.
Nov 29 02:25:44 np0005539504 systemd[236948]: Reached target Shutdown.
Nov 29 02:25:44 np0005539504 systemd[236948]: Finished Exit the Session.
Nov 29 02:25:44 np0005539504 systemd[236948]: Reached target Exit the Session.
Nov 29 02:25:44 np0005539504 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 02:25:44 np0005539504 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 02:25:44 np0005539504 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 02:25:44 np0005539504 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 02:25:44 np0005539504 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 02:25:44 np0005539504 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 02:25:44 np0005539504 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.256 187156 DEBUG nova.compute.manager [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-unplugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.256 187156 DEBUG oslo_concurrency.lockutils [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.256 187156 DEBUG oslo_concurrency.lockutils [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.257 187156 DEBUG oslo_concurrency.lockutils [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.257 187156 DEBUG nova.compute.manager [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-unplugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.257 187156 WARNING nova.compute.manager [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-unplugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.257 187156 DEBUG nova.compute.manager [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.257 187156 DEBUG oslo_concurrency.lockutils [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.257 187156 DEBUG oslo_concurrency.lockutils [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.258 187156 DEBUG oslo_concurrency.lockutils [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.258 187156 DEBUG nova.compute.manager [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.258 187156 WARNING nova.compute.manager [req-37bba66e-0618-44e8-b98b-1fd1b31cc280 req-f6e48214-7a40-4bfe-908d-3f64010ec992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.298 187156 INFO nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.303 187156 INFO nova.virt.libvirt.driver [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance destroyed successfully.#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.303 187156 DEBUG nova.objects.instance [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'numa_topology' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.319 187156 INFO nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Attempting a stable device rescue#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.562 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.565 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.566 187156 INFO nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Creating image(s)#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.567 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.567 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.568 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.568 187156 DEBUG nova.objects.instance [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.584 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "1f6bbea7c12e7fd60429d8192b4eff988ab580c0" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:44 np0005539504 nova_compute[187152]: 2025-11-29 07:25:44.584 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "1f6bbea7c12e7fd60429d8192b4eff988ab580c0" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.040 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.132 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0.part --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.133 187156 DEBUG nova.virt.images [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] 927e2e04-1294-4096-a88f-b0fbbd649db0 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.134 187156 DEBUG nova.privsep.utils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.135 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0.part /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.256 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.326 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0.part /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0.converted" returned: 0 in 0.191s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.332 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.406 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0.converted --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.407 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "1f6bbea7c12e7fd60429d8192b4eff988ab580c0" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.823s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.423 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "1f6bbea7c12e7fd60429d8192b4eff988ab580c0" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.424 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "1f6bbea7c12e7fd60429d8192b4eff988ab580c0" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.434 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.500 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.502 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0,backing_fmt=raw /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.544 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0,backing_fmt=raw /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.rescue" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.545 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "1f6bbea7c12e7fd60429d8192b4eff988ab580c0" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.546 187156 DEBUG nova.objects.instance [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'migration_context' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.549 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.584 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.587 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Start _get_guest_xml network_info=[{"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "vif_mac": "fa:16:3e:03:25:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '927e2e04-1294-4096-a88f-b0fbbd649db0', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.587 187156 DEBUG nova.objects.instance [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'resources' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.619 187156 WARNING nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.624 187156 DEBUG nova.virt.libvirt.host [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.624 187156 DEBUG nova.virt.libvirt.host [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.627 187156 DEBUG nova.virt.libvirt.host [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.628 187156 DEBUG nova.virt.libvirt.host [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.629 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.629 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.630 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.630 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.630 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.630 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.631 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.631 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.631 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.631 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.631 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.632 187156 DEBUG nova.virt.hardware [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.632 187156 DEBUG nova.objects.instance [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.660 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.725 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.727 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.727 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.729 187156 DEBUG oslo_concurrency.lockutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.731 187156 DEBUG nova.virt.libvirt.vif [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1803543286',display_name='tempest-ServerStableDeviceRescueTest-server-1803543286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1803543286',id=126,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-9i8if1ob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:33Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=220b7865-2248-43ba-865a-b2314b5a6e47,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "vif_mac": "fa:16:3e:03:25:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.733 187156 DEBUG nova.network.os_vif_util [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "vif_mac": "fa:16:3e:03:25:aa"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.736 187156 DEBUG nova.network.os_vif_util [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:25:aa,bridge_name='br-int',has_traffic_filtering=True,id=86264ec7-05bf-4512-ac97-016779ba241a,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86264ec7-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.738 187156 DEBUG nova.objects.instance [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.757 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <uuid>220b7865-2248-43ba-865a-b2314b5a6e47</uuid>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <name>instance-0000007e</name>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1803543286</nova:name>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:25:46</nova:creationTime>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:        <nova:user uuid="5be41a8530314f83bbecbb74b9276f2d">tempest-ServerStableDeviceRescueTest-2012111838-project-member</nova:user>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:        <nova:project uuid="ac3bb322fa744e099b38e08abe12d0e2">tempest-ServerStableDeviceRescueTest-2012111838</nova:project>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:        <nova:port uuid="86264ec7-05bf-4512-ac97-016779ba241a">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <entry name="serial">220b7865-2248-43ba-865a-b2314b5a6e47</entry>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <entry name="uuid">220b7865-2248-43ba-865a-b2314b5a6e47</entry>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.rescue"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <target dev="sdb" bus="usb"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <boot order="1"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:03:25:aa"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <target dev="tap86264ec7-05"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/console.log" append="off"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:25:46 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:25:46 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:25:46 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:25:46 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.775 187156 INFO nova.virt.libvirt.driver [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance destroyed successfully.#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.840 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.840 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.841 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.841 187156 DEBUG nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No VIF found with MAC fa:16:3e:03:25:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.842 187156 INFO nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Using config drive#033[00m
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.860 187156 DEBUG nova.objects.instance [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:46 np0005539504 podman[237257]: 2025-11-29 07:25:46.887767488 +0000 UTC m=+0.075957596 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:25:46 np0005539504 podman[237258]: 2025-11-29 07:25:46.897726375 +0000 UTC m=+0.080242871 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:25:46 np0005539504 nova_compute[187152]: 2025-11-29 07:25:46.897 187156 DEBUG nova.objects.instance [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'keypairs' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:46 np0005539504 podman[237259]: 2025-11-29 07:25:46.921748888 +0000 UTC m=+0.090825224 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Nov 29 02:25:47 np0005539504 nova_compute[187152]: 2025-11-29 07:25:47.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:47 np0005539504 nova_compute[187152]: 2025-11-29 07:25:47.540 187156 INFO nova.virt.libvirt.driver [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Creating config drive at /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config.rescue#033[00m
Nov 29 02:25:47 np0005539504 nova_compute[187152]: 2025-11-29 07:25:47.544 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ws5y29e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:47 np0005539504 nova_compute[187152]: 2025-11-29 07:25:47.670 187156 DEBUG oslo_concurrency.processutils [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ws5y29e" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:47 np0005539504 NetworkManager[55210]: <info>  [1764401147.7592] manager: (tap86264ec7-05): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Nov 29 02:25:47 np0005539504 kernel: tap86264ec7-05: entered promiscuous mode
Nov 29 02:25:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:47Z|00491|binding|INFO|Claiming lport 86264ec7-05bf-4512-ac97-016779ba241a for this chassis.
Nov 29 02:25:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:47Z|00492|binding|INFO|86264ec7-05bf-4512-ac97-016779ba241a: Claiming fa:16:3e:03:25:aa 10.100.0.4
Nov 29 02:25:47 np0005539504 nova_compute[187152]: 2025-11-29 07:25:47.771 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:47Z|00493|binding|INFO|Setting lport 86264ec7-05bf-4512-ac97-016779ba241a ovn-installed in OVS
Nov 29 02:25:47 np0005539504 nova_compute[187152]: 2025-11-29 07:25:47.802 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:47 np0005539504 nova_compute[187152]: 2025-11-29 07:25:47.806 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:47 np0005539504 systemd-machined[153423]: New machine qemu-65-instance-0000007e.
Nov 29 02:25:47 np0005539504 systemd[1]: Started Virtual Machine qemu-65-instance-0000007e.
Nov 29 02:25:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:47Z|00494|binding|INFO|Setting lport 86264ec7-05bf-4512-ac97-016779ba241a up in Southbound
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.840 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:25:aa 10.100.0.4'], port_security=['fa:16:3e:03:25:aa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=86264ec7-05bf-4512-ac97-016779ba241a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.841 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 86264ec7-05bf-4512-ac97-016779ba241a in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 bound to our chassis#033[00m
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.843 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39#033[00m
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.861 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[db56883b-b9e7-433f-a15b-029f96e46e5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.862 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap240f16d8-61 in ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.864 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap240f16d8-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.864 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[55b392fb-b6ac-432e-bb70-1d3a306cd64c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.865 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8c3f001d-e2e6-414d-8274-b66036bff6ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:47 np0005539504 systemd-udevd[237339]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:47 np0005539504 NetworkManager[55210]: <info>  [1764401147.8880] device (tap86264ec7-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:25:47 np0005539504 NetworkManager[55210]: <info>  [1764401147.8890] device (tap86264ec7-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.894 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[4c3ab59f-4b14-46d5-8ef2-732fbfa068f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.921 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[18146ced-54fe-4a25-b83f-5d9980ba7e58]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.952 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[25b1c4ca-acbc-4227-bb3c-4a8c1798cf36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:47 np0005539504 NetworkManager[55210]: <info>  [1764401147.9584] manager: (tap240f16d8-60): new Veth device (/org/freedesktop/NetworkManager/Devices/229)
Nov 29 02:25:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:47.957 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a7bce4-b7a7-4b0f-9c55-346ca3ea583a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:47 np0005539504 systemd-udevd[237342]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.002 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[104f56bd-83c0-4a1c-93ac-0b31e83f1967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.005 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[8c6ea256-3a20-4481-9425-4e14c6bf9b96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:48 np0005539504 NetworkManager[55210]: <info>  [1764401148.0482] device (tap240f16d8-60): carrier: link connected
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.055 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[be4721e1-6d27-4569-b45a-03976dd65811]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.077 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[12b63e80-1c88-4bd9-88d9-44e937ca026a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661092, 'reachable_time': 18297, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237371, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.097 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[88d44774-fca0-4f8f-ad99-8b6402d2658c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaa:7e40'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661092, 'tstamp': 661092}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237372, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.123 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8914f405-43c9-4506-9895-c4be02e36945]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661092, 'reachable_time': 18297, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237373, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.170 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6df10f7e-d537-456a-a3ae-0fafcce31387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.239 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[782fec23-d8ea-4af2-9e56-bb450aaa5c0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.240 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.240 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.241 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:48 np0005539504 NetworkManager[55210]: <info>  [1764401148.2435] manager: (tap240f16d8-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Nov 29 02:25:48 np0005539504 nova_compute[187152]: 2025-11-29 07:25:48.242 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:48 np0005539504 kernel: tap240f16d8-60: entered promiscuous mode
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.247 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:48Z|00495|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 02:25:48 np0005539504 nova_compute[187152]: 2025-11-29 07:25:48.265 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:48 np0005539504 nova_compute[187152]: 2025-11-29 07:25:48.267 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.267 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.268 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[88169bd6-5425-4cb2-a7e7-27054bd02f03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.269 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:25:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:48.270 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'env', 'PROCESS_TAG=haproxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/240f16d8-602b-4aa1-8edb-e3a8d3674e39.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:25:48 np0005539504 nova_compute[187152]: 2025-11-29 07:25:48.598 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 220b7865-2248-43ba-865a-b2314b5a6e47 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:25:48 np0005539504 nova_compute[187152]: 2025-11-29 07:25:48.600 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401148.5933187, 220b7865-2248-43ba-865a-b2314b5a6e47 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:48 np0005539504 nova_compute[187152]: 2025-11-29 07:25:48.600 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:25:48 np0005539504 podman[237412]: 2025-11-29 07:25:48.653156456 +0000 UTC m=+0.029981615 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.281 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.290 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.766 187156 DEBUG nova.compute.manager [None req-fef638a4-dd27-4c30-a0f6-f9add8e0e534 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.769 187156 DEBUG nova.compute.manager [req-e06dea7f-3cca-4f0b-b320-e0296a75faad req-b7876230-cb83-4005-b3ba-d5d6e8382927 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.770 187156 DEBUG oslo_concurrency.lockutils [req-e06dea7f-3cca-4f0b-b320-e0296a75faad req-b7876230-cb83-4005-b3ba-d5d6e8382927 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.771 187156 DEBUG oslo_concurrency.lockutils [req-e06dea7f-3cca-4f0b-b320-e0296a75faad req-b7876230-cb83-4005-b3ba-d5d6e8382927 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.771 187156 DEBUG oslo_concurrency.lockutils [req-e06dea7f-3cca-4f0b-b320-e0296a75faad req-b7876230-cb83-4005-b3ba-d5d6e8382927 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.772 187156 DEBUG nova.compute.manager [req-e06dea7f-3cca-4f0b-b320-e0296a75faad req-b7876230-cb83-4005-b3ba-d5d6e8382927 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.772 187156 WARNING nova.compute.manager [req-e06dea7f-3cca-4f0b-b320-e0296a75faad req-b7876230-cb83-4005-b3ba-d5d6e8382927 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.782 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.783 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401148.5941079, 220b7865-2248-43ba-865a-b2314b5a6e47 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.784 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] VM Started (Lifecycle Event)#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.854 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.860 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:49 np0005539504 podman[237412]: 2025-11-29 07:25:49.869744761 +0000 UTC m=+1.246569840 container create 69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:25:49 np0005539504 nova_compute[187152]: 2025-11-29 07:25:49.906 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 29 02:25:49 np0005539504 systemd[1]: Started libpod-conmon-69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47.scope.
Nov 29 02:25:49 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:25:49 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a46c58141163399ba422089dfcd8c65d30552077e74d206bee337b8a2676796d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:25:49 np0005539504 podman[237412]: 2025-11-29 07:25:49.974342974 +0000 UTC m=+1.351168093 container init 69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:25:49 np0005539504 podman[237412]: 2025-11-29 07:25:49.980465487 +0000 UTC m=+1.357290576 container start 69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:25:50 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237427]: [NOTICE]   (237431) : New worker (237433) forked
Nov 29 02:25:50 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237427]: [NOTICE]   (237431) : Loading success.
Nov 29 02:25:50 np0005539504 nova_compute[187152]: 2025-11-29 07:25:50.741 187156 INFO nova.compute.manager [None req-7f2b4acb-0057-47d7-8288-a4b2615caddb 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Unrescuing#033[00m
Nov 29 02:25:50 np0005539504 nova_compute[187152]: 2025-11-29 07:25:50.742 187156 DEBUG oslo_concurrency.lockutils [None req-7f2b4acb-0057-47d7-8288-a4b2615caddb 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:50 np0005539504 nova_compute[187152]: 2025-11-29 07:25:50.743 187156 DEBUG oslo_concurrency.lockutils [None req-7f2b4acb-0057-47d7-8288-a4b2615caddb 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquired lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:50 np0005539504 nova_compute[187152]: 2025-11-29 07:25:50.743 187156 DEBUG nova.network.neutron [None req-7f2b4acb-0057-47d7-8288-a4b2615caddb 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:25:50 np0005539504 nova_compute[187152]: 2025-11-29 07:25:50.748 187156 DEBUG oslo_concurrency.lockutils [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:50 np0005539504 nova_compute[187152]: 2025-11-29 07:25:50.748 187156 DEBUG oslo_concurrency.lockutils [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:50 np0005539504 nova_compute[187152]: 2025-11-29 07:25:50.749 187156 DEBUG oslo_concurrency.lockutils [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:50 np0005539504 nova_compute[187152]: 2025-11-29 07:25:50.750 187156 DEBUG oslo_concurrency.lockutils [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:50 np0005539504 nova_compute[187152]: 2025-11-29 07:25:50.752 187156 DEBUG oslo_concurrency.lockutils [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.261 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.391 187156 INFO nova.compute.manager [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Terminating instance#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.551 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.682 187156 DEBUG nova.compute.manager [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:25:51 np0005539504 kernel: tapbae72aab-be (unregistering): left promiscuous mode
Nov 29 02:25:51 np0005539504 NetworkManager[55210]: <info>  [1764401151.7078] device (tapbae72aab-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:25:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:51Z|00496|binding|INFO|Releasing lport bae72aab-bece-4ddf-8a55-f5925e45ca90 from this chassis (sb_readonly=0)
Nov 29 02:25:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:51Z|00497|binding|INFO|Setting lport bae72aab-bece-4ddf-8a55-f5925e45ca90 down in Southbound
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.718 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:51Z|00498|binding|INFO|Removing iface tapbae72aab-be ovn-installed in OVS
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.719 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.741 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Nov 29 02:25:51 np0005539504 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000007d.scope: Consumed 10.018s CPU time.
Nov 29 02:25:51 np0005539504 systemd-machined[153423]: Machine qemu-64-instance-0000007d terminated.
Nov 29 02:25:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:51.842 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:b9:c5 10.100.0.9'], port_security=['fa:16:3e:04:b9:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '3b61bf63-8328-4d31-93e5-0a19ca27cd63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '8', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=bae72aab-bece-4ddf-8a55-f5925e45ca90) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:51.843 104164 INFO neutron.agent.ovn.metadata.agent [-] Port bae72aab-bece-4ddf-8a55-f5925e45ca90 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 unbound from our chassis#033[00m
Nov 29 02:25:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:51.845 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:25:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:51.847 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e2b98b98-c4a4-400d-9e5a-fee1dd51e1d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:51.849 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace which is not needed anymore#033[00m
Nov 29 02:25:51 np0005539504 podman[237449]: 2025-11-29 07:25:51.910613329 +0000 UTC m=+0.091410320 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.911 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.914 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.954 187156 INFO nova.virt.libvirt.driver [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance destroyed successfully.#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.955 187156 DEBUG nova.objects.instance [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'resources' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.975 187156 DEBUG nova.virt.libvirt.vif [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-284566059',display_name='tempest-ServerDiskConfigTestJSON-server-284566059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-284566059',id=125,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-9kh0vk9d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:47Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=3b61bf63-8328-4d31-93e5-0a19ca27cd63,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.976 187156 DEBUG nova.network.os_vif_util [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.977 187156 DEBUG nova.network.os_vif_util [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.978 187156 DEBUG os_vif [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.980 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.981 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbae72aab-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.982 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.984 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.988 187156 INFO os_vif [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be')#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.989 187156 INFO nova.virt.libvirt.driver [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Deleting instance files /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_del#033[00m
Nov 29 02:25:51 np0005539504 nova_compute[187152]: 2025-11-29 07:25:51.994 187156 INFO nova.virt.libvirt.driver [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Deletion of /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_del complete#033[00m
Nov 29 02:25:52 np0005539504 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237130]: [NOTICE]   (237134) : haproxy version is 2.8.14-c23fe91
Nov 29 02:25:52 np0005539504 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237130]: [NOTICE]   (237134) : path to executable is /usr/sbin/haproxy
Nov 29 02:25:52 np0005539504 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237130]: [WARNING]  (237134) : Exiting Master process...
Nov 29 02:25:52 np0005539504 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237130]: [ALERT]    (237134) : Current worker (237136) exited with code 143 (Terminated)
Nov 29 02:25:52 np0005539504 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237130]: [WARNING]  (237134) : All workers exited. Exiting... (0)
Nov 29 02:25:52 np0005539504 systemd[1]: libpod-c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69.scope: Deactivated successfully.
Nov 29 02:25:52 np0005539504 nova_compute[187152]: 2025-11-29 07:25:52.335 187156 DEBUG nova.compute.manager [req-7ca21e6e-dda9-452f-b876-6a3cfce8aeca req-a47bfb10-4709-4361-8b47-add4c155a8a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:52 np0005539504 nova_compute[187152]: 2025-11-29 07:25:52.336 187156 DEBUG oslo_concurrency.lockutils [req-7ca21e6e-dda9-452f-b876-6a3cfce8aeca req-a47bfb10-4709-4361-8b47-add4c155a8a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:52 np0005539504 nova_compute[187152]: 2025-11-29 07:25:52.336 187156 DEBUG oslo_concurrency.lockutils [req-7ca21e6e-dda9-452f-b876-6a3cfce8aeca req-a47bfb10-4709-4361-8b47-add4c155a8a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:52 np0005539504 nova_compute[187152]: 2025-11-29 07:25:52.337 187156 DEBUG oslo_concurrency.lockutils [req-7ca21e6e-dda9-452f-b876-6a3cfce8aeca req-a47bfb10-4709-4361-8b47-add4c155a8a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:52 np0005539504 nova_compute[187152]: 2025-11-29 07:25:52.337 187156 DEBUG nova.compute.manager [req-7ca21e6e-dda9-452f-b876-6a3cfce8aeca req-a47bfb10-4709-4361-8b47-add4c155a8a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:52 np0005539504 nova_compute[187152]: 2025-11-29 07:25:52.338 187156 WARNING nova.compute.manager [req-7ca21e6e-dda9-452f-b876-6a3cfce8aeca req-a47bfb10-4709-4361-8b47-add4c155a8a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 29 02:25:52 np0005539504 podman[237512]: 2025-11-29 07:25:52.341958715 +0000 UTC m=+0.389967308 container died c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:25:52 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69-userdata-shm.mount: Deactivated successfully.
Nov 29 02:25:52 np0005539504 systemd[1]: var-lib-containers-storage-overlay-a05b8a2b9e4864bcf62a39a1797ac7704fe21df0f7d9d0459a0a0b50e98ca7f7-merged.mount: Deactivated successfully.
Nov 29 02:25:52 np0005539504 podman[237482]: 2025-11-29 07:25:52.866823537 +0000 UTC m=+0.946141219 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:25:53 np0005539504 podman[237512]: 2025-11-29 07:25:53.218276533 +0000 UTC m=+1.266285136 container cleanup c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.223 187156 DEBUG nova.compute.manager [req-2178e441-40d4-4fd8-ba36-d8f77cd4a7cb req-73055bc1-ca97-4d99-9db3-b8fa9dfe81bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-unplugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.224 187156 DEBUG oslo_concurrency.lockutils [req-2178e441-40d4-4fd8-ba36-d8f77cd4a7cb req-73055bc1-ca97-4d99-9db3-b8fa9dfe81bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.224 187156 DEBUG oslo_concurrency.lockutils [req-2178e441-40d4-4fd8-ba36-d8f77cd4a7cb req-73055bc1-ca97-4d99-9db3-b8fa9dfe81bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.225 187156 DEBUG oslo_concurrency.lockutils [req-2178e441-40d4-4fd8-ba36-d8f77cd4a7cb req-73055bc1-ca97-4d99-9db3-b8fa9dfe81bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.226 187156 DEBUG nova.compute.manager [req-2178e441-40d4-4fd8-ba36-d8f77cd4a7cb req-73055bc1-ca97-4d99-9db3-b8fa9dfe81bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-unplugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.226 187156 DEBUG nova.compute.manager [req-2178e441-40d4-4fd8-ba36-d8f77cd4a7cb req-73055bc1-ca97-4d99-9db3-b8fa9dfe81bc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-unplugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.252 187156 INFO nova.compute.manager [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Took 1.57 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.253 187156 DEBUG oslo.service.loopingcall [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.254 187156 DEBUG nova.compute.manager [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.254 187156 DEBUG nova.network.neutron [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:25:53 np0005539504 podman[237561]: 2025-11-29 07:25:53.300763034 +0000 UTC m=+0.057101481 container remove c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:25:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:53.307 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f532bff2-517b-409f-a015-c8ef34ef7593]: (4, ('Sat Nov 29 07:25:51 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69)\nc22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69\nSat Nov 29 07:25:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69)\nc22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:53.309 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d09abd2c-d1fa-44b2-a0a5-768a0f7dd9da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:53.310 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:53 np0005539504 systemd[1]: libpod-conmon-c22b9f5415252857a09039a30f31eee2668427b2197b60cd010bba53f5c28a69.scope: Deactivated successfully.
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.312 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:53 np0005539504 kernel: tap9b34af6b-e0: left promiscuous mode
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.326 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:53.330 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d58eda13-09b3-44d8-b6ce-fd2800c354f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:53.354 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[949991f7-1805-426b-abad-4c72543a9848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:53.357 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1696a6df-f842-4459-9c48-70a04c62f169]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:53.381 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8d636176-6789-4243-80bb-4f035767ad8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660464, 'reachable_time': 29436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237574, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:53 np0005539504 systemd[1]: run-netns-ovnmeta\x2d9b34af6b\x2dedf9\x2d4b27\x2db1dc\x2d2b18c2eec958.mount: Deactivated successfully.
Nov 29 02:25:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:53.386 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:25:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:53.386 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[c69f0242-9eae-44d9-ac58-119dde369165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:53 np0005539504 nova_compute[187152]: 2025-11-29 07:25:53.517 187156 DEBUG nova.network.neutron [None req-7f2b4acb-0057-47d7-8288-a4b2615caddb 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Updating instance_info_cache with network_info: [{"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:55 np0005539504 nova_compute[187152]: 2025-11-29 07:25:55.414 187156 DEBUG oslo_concurrency.lockutils [None req-7f2b4acb-0057-47d7-8288-a4b2615caddb 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Releasing lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:55 np0005539504 nova_compute[187152]: 2025-11-29 07:25:55.416 187156 DEBUG nova.objects.instance [None req-7f2b4acb-0057-47d7-8288-a4b2615caddb 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'flavor' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:56 np0005539504 nova_compute[187152]: 2025-11-29 07:25:56.263 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:56 np0005539504 nova_compute[187152]: 2025-11-29 07:25:56.984 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:57 np0005539504 nova_compute[187152]: 2025-11-29 07:25:57.910 187156 DEBUG nova.compute.manager [req-dfe1a26f-46e6-44dd-ad4e-e52a18577aff req-53840d7a-9c1a-4954-92e0-ae8177d64ec6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:57 np0005539504 nova_compute[187152]: 2025-11-29 07:25:57.910 187156 DEBUG oslo_concurrency.lockutils [req-dfe1a26f-46e6-44dd-ad4e-e52a18577aff req-53840d7a-9c1a-4954-92e0-ae8177d64ec6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:57 np0005539504 nova_compute[187152]: 2025-11-29 07:25:57.911 187156 DEBUG oslo_concurrency.lockutils [req-dfe1a26f-46e6-44dd-ad4e-e52a18577aff req-53840d7a-9c1a-4954-92e0-ae8177d64ec6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:57 np0005539504 nova_compute[187152]: 2025-11-29 07:25:57.911 187156 DEBUG oslo_concurrency.lockutils [req-dfe1a26f-46e6-44dd-ad4e-e52a18577aff req-53840d7a-9c1a-4954-92e0-ae8177d64ec6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:57 np0005539504 nova_compute[187152]: 2025-11-29 07:25:57.912 187156 DEBUG nova.compute.manager [req-dfe1a26f-46e6-44dd-ad4e-e52a18577aff req-53840d7a-9c1a-4954-92e0-ae8177d64ec6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:57 np0005539504 nova_compute[187152]: 2025-11-29 07:25:57.912 187156 WARNING nova.compute.manager [req-dfe1a26f-46e6-44dd-ad4e-e52a18577aff req-53840d7a-9c1a-4954-92e0-ae8177d64ec6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:25:57 np0005539504 kernel: tap86264ec7-05 (unregistering): left promiscuous mode
Nov 29 02:25:58 np0005539504 NetworkManager[55210]: <info>  [1764401158.0045] device (tap86264ec7-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.016 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:58Z|00499|binding|INFO|Releasing lport 86264ec7-05bf-4512-ac97-016779ba241a from this chassis (sb_readonly=0)
Nov 29 02:25:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:58Z|00500|binding|INFO|Setting lport 86264ec7-05bf-4512-ac97-016779ba241a down in Southbound
Nov 29 02:25:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:58Z|00501|binding|INFO|Removing iface tap86264ec7-05 ovn-installed in OVS
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.019 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.029 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539504 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Nov 29 02:25:58 np0005539504 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000007e.scope: Consumed 10.131s CPU time.
Nov 29 02:25:58 np0005539504 systemd-machined[153423]: Machine qemu-65-instance-0000007e terminated.
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.099 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:25:aa 10.100.0.4'], port_security=['fa:16:3e:03:25:aa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=86264ec7-05bf-4512-ac97-016779ba241a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.100 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 86264ec7-05bf-4512-ac97-016779ba241a in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.102 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.105 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6b8a2b37-f37d-4f6a-bdb2-36d69a94c790]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.106 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace which is not needed anymore#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.200 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.205 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.261 187156 INFO nova.virt.libvirt.driver [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance destroyed successfully.#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.262 187156 DEBUG nova.objects.instance [None req-7f2b4acb-0057-47d7-8288-a4b2615caddb 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'numa_topology' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:58 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237427]: [NOTICE]   (237431) : haproxy version is 2.8.14-c23fe91
Nov 29 02:25:58 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237427]: [NOTICE]   (237431) : path to executable is /usr/sbin/haproxy
Nov 29 02:25:58 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237427]: [WARNING]  (237431) : Exiting Master process...
Nov 29 02:25:58 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237427]: [WARNING]  (237431) : Exiting Master process...
Nov 29 02:25:58 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237427]: [ALERT]    (237431) : Current worker (237433) exited with code 143 (Terminated)
Nov 29 02:25:58 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237427]: [WARNING]  (237431) : All workers exited. Exiting... (0)
Nov 29 02:25:58 np0005539504 systemd[1]: libpod-69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47.scope: Deactivated successfully.
Nov 29 02:25:58 np0005539504 podman[237599]: 2025-11-29 07:25:58.516165182 +0000 UTC m=+0.310919431 container died 69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:25:58 np0005539504 kernel: tap86264ec7-05: entered promiscuous mode
Nov 29 02:25:58 np0005539504 systemd-udevd[237577]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:58 np0005539504 NetworkManager[55210]: <info>  [1764401158.6598] manager: (tap86264ec7-05): new Tun device (/org/freedesktop/NetworkManager/Devices/231)
Nov 29 02:25:58 np0005539504 NetworkManager[55210]: <info>  [1764401158.6939] device (tap86264ec7-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:25:58 np0005539504 NetworkManager[55210]: <info>  [1764401158.6949] device (tap86264ec7-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:25:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:58Z|00502|binding|INFO|Claiming lport 86264ec7-05bf-4512-ac97-016779ba241a for this chassis.
Nov 29 02:25:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:58Z|00503|binding|INFO|86264ec7-05bf-4512-ac97-016779ba241a: Claiming fa:16:3e:03:25:aa 10.100.0.4
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.695 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47-userdata-shm.mount: Deactivated successfully.
Nov 29 02:25:58 np0005539504 systemd[1]: var-lib-containers-storage-overlay-a46c58141163399ba422089dfcd8c65d30552077e74d206bee337b8a2676796d-merged.mount: Deactivated successfully.
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.704 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:25:aa 10.100.0.4'], port_security=['fa:16:3e:03:25:aa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=86264ec7-05bf-4512-ac97-016779ba241a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:58Z|00504|binding|INFO|Setting lport 86264ec7-05bf-4512-ac97-016779ba241a ovn-installed in OVS
Nov 29 02:25:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:58Z|00505|binding|INFO|Setting lport 86264ec7-05bf-4512-ac97-016779ba241a up in Southbound
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.710 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539504 podman[237599]: 2025-11-29 07:25:58.718858692 +0000 UTC m=+0.513612931 container cleanup 69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:25:58 np0005539504 systemd-machined[153423]: New machine qemu-66-instance-0000007e.
Nov 29 02:25:58 np0005539504 systemd[1]: Started Virtual Machine qemu-66-instance-0000007e.
Nov 29 02:25:58 np0005539504 systemd[1]: libpod-conmon-69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47.scope: Deactivated successfully.
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.770 187156 DEBUG nova.network.neutron [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.803 187156 INFO nova.compute.manager [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Took 5.55 seconds to deallocate network for instance.#033[00m
Nov 29 02:25:58 np0005539504 podman[237662]: 2025-11-29 07:25:58.807896528 +0000 UTC m=+0.060769699 container remove 69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.814 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb2e41e-e86a-421c-b857-ea31da824fd2]: (4, ('Sat Nov 29 07:25:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47)\n69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47\nSat Nov 29 07:25:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47)\n69afef99452be68ae1dd494ce0a93cecaf55f551e01deed31c9fb46302adbd47\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.817 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50f6954f-ce8c-48ea-b415-606f66e43693]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.818 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.820 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539504 kernel: tap240f16d8-60: left promiscuous mode
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.834 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.841 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[871c5236-0c75-4294-818d-cb31329cb8ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.854 187156 DEBUG nova.compute.manager [req-91dc9139-e693-4455-aae8-13b34921b05c req-3ff9c39f-ee19-4537-8749-ffa2f71d0279 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-deleted-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:58 np0005539504 podman[237663]: 2025-11-29 07:25:58.855603367 +0000 UTC m=+0.089605232 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.860 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ac764b88-74c0-4d1c-bcf1-553762efa074]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.862 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a0821d01-ea50-4e4f-bfd7-8e15184ff427]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.879 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d72e8711-9e13-4783-a949-2601b5b7ed15]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661082, 'reachable_time': 22525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237701, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 systemd[1]: run-netns-ovnmeta\x2d240f16d8\x2d602b\x2d4aa1\x2d8edb\x2de3a8d3674e39.mount: Deactivated successfully.
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.885 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.886 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[d21a876d-60da-4cc1-b0a9-2edfae3626b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.887 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 86264ec7-05bf-4512-ac97-016779ba241a in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.889 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.898 187156 DEBUG oslo_concurrency.lockutils [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.898 187156 DEBUG oslo_concurrency.lockutils [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.905 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[814e7809-db27-44c0-9f71-ce58557024e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.906 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap240f16d8-61 in ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.905 187156 DEBUG oslo_concurrency.lockutils [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.909 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap240f16d8-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.909 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb2dc59-b9e2-41bc-a533-45806c0f02df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.910 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a039ee-bf71-4265-8727-e0045dc655ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.942 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.953 187156 DEBUG nova.compute.manager [req-04bea9dd-4ae8-4551-b5a3-1abd5241fe24 req-27e39fc8-e2c6-4993-95aa-87cc17fd89cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-unplugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.954 187156 DEBUG oslo_concurrency.lockutils [req-04bea9dd-4ae8-4551-b5a3-1abd5241fe24 req-27e39fc8-e2c6-4993-95aa-87cc17fd89cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.954 187156 DEBUG oslo_concurrency.lockutils [req-04bea9dd-4ae8-4551-b5a3-1abd5241fe24 req-27e39fc8-e2c6-4993-95aa-87cc17fd89cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.954 187156 DEBUG oslo_concurrency.lockutils [req-04bea9dd-4ae8-4551-b5a3-1abd5241fe24 req-27e39fc8-e2c6-4993-95aa-87cc17fd89cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.955 187156 DEBUG nova.compute.manager [req-04bea9dd-4ae8-4551-b5a3-1abd5241fe24 req-27e39fc8-e2c6-4993-95aa-87cc17fd89cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-unplugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:58 np0005539504 nova_compute[187152]: 2025-11-29 07:25:58.955 187156 WARNING nova.compute.manager [req-04bea9dd-4ae8-4551-b5a3-1abd5241fe24 req-27e39fc8-e2c6-4993-95aa-87cc17fd89cc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-unplugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.968 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca6216a-8177-4653-9099-2b19714630e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:58.988 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[97994227-0346-42fa-bf69-de3794242c29]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.000 187156 INFO nova.scheduler.client.report [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Deleted allocations for instance 3b61bf63-8328-4d31-93e5-0a19ca27cd63#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.037 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c3cd8a42-0768-4802-8715-b4195385d4ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.047 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5445a275-7908-46f1-9b38-811fd64362ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 NetworkManager[55210]: <info>  [1764401159.0488] manager: (tap240f16d8-60): new Veth device (/org/freedesktop/NetworkManager/Devices/232)
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.086 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[02586d47-cb32-4ff2-94d1-c2694ad53ce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.090 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[3225cd20-d02d-4b81-9784-b1acb88afd87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.118 187156 DEBUG oslo_concurrency.lockutils [None req-40643483-9ccc-4cf6-8e58-4233d0b4501d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:59 np0005539504 NetworkManager[55210]: <info>  [1764401159.1229] device (tap240f16d8-60): carrier: link connected
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.128 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef21be6-78d8-4134-8e7f-d6fcd23db817]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.147 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 220b7865-2248-43ba-865a-b2314b5a6e47 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.147 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401159.1471243, 220b7865-2248-43ba-865a-b2314b5a6e47 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.147 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.150 187156 DEBUG nova.compute.manager [None req-7f2b4acb-0057-47d7-8288-a4b2615caddb 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.153 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[275e219e-958d-4fa0-bbc6-66e39414ed17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662200, 'reachable_time': 18152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237735, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.171 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[38340862-a111-4c34-baf3-aa151365bf77]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaa:7e40'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662200, 'tstamp': 662200}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237736, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.179 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.183 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.192 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab31361-8307-452d-8ec0-e9554b6ad582]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662200, 'reachable_time': 18152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237737, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.226 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.226 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401159.1500487, 220b7865-2248-43ba-865a-b2314b5a6e47 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.226 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] VM Started (Lifecycle Event)#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.226 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1708ebdd-0f7e-4ed9-bcb2-b724ffc658e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.255 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.259 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.301 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eaaa81fb-9f55-44e1-8abe-e14f2f67fd02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.303 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.304 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.304 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.307 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:59 np0005539504 NetworkManager[55210]: <info>  [1764401159.3084] manager: (tap240f16d8-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Nov 29 02:25:59 np0005539504 kernel: tap240f16d8-60: entered promiscuous mode
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.310 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.319 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.321 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:59 np0005539504 ovn_controller[95182]: 2025-11-29T07:25:59Z|00506|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.322 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.330 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.331 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bc6675e9-edfd-4727-97fe-f91f42168065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.333 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 02:25:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:25:59.334 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'env', 'PROCESS_TAG=haproxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/240f16d8-602b-4aa1-8edb-e3a8d3674e39.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 02:25:59 np0005539504 nova_compute[187152]: 2025-11-29 07:25:59.334 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:59 np0005539504 podman[237769]: 2025-11-29 07:25:59.675333378 +0000 UTC m=+0.024165168 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:26:00 np0005539504 podman[237769]: 2025-11-29 07:26:00.133262847 +0000 UTC m=+0.482094617 container create 0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:26:00 np0005539504 systemd[1]: Started libpod-conmon-0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b.scope.
Nov 29 02:26:00 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:26:00 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1aa9c211fe7319cf8b1803d72b0251a9ea6c6464f3137dd58a4de53bdac7a8d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:26:00 np0005539504 podman[237769]: 2025-11-29 07:26:00.250241561 +0000 UTC m=+0.599073351 container init 0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:26:00 np0005539504 podman[237769]: 2025-11-29 07:26:00.257199828 +0000 UTC m=+0.606031598 container start 0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:26:00 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237786]: [NOTICE]   (237790) : New worker (237792) forked
Nov 29 02:26:00 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237786]: [NOTICE]   (237790) : Loading success.
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.051 187156 DEBUG nova.compute.manager [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.052 187156 DEBUG oslo_concurrency.lockutils [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.052 187156 DEBUG oslo_concurrency.lockutils [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.052 187156 DEBUG oslo_concurrency.lockutils [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.052 187156 DEBUG nova.compute.manager [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.053 187156 WARNING nova.compute.manager [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state active and task_state None.
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.053 187156 DEBUG nova.compute.manager [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.053 187156 DEBUG oslo_concurrency.lockutils [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.053 187156 DEBUG oslo_concurrency.lockutils [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.053 187156 DEBUG oslo_concurrency.lockutils [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.054 187156 DEBUG nova.compute.manager [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.054 187156 WARNING nova.compute.manager [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state active and task_state None.
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.054 187156 DEBUG nova.compute.manager [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.054 187156 DEBUG oslo_concurrency.lockutils [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.054 187156 DEBUG oslo_concurrency.lockutils [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.054 187156 DEBUG oslo_concurrency.lockutils [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.055 187156 DEBUG nova.compute.manager [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.055 187156 WARNING nova.compute.manager [req-0819aede-0f4b-404d-ac10-ffc8fc45dc29 req-acd2024e-d264-46e0-ae31-d26d8750946a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state active and task_state None.
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.266 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:26:01 np0005539504 nova_compute[187152]: 2025-11-29 07:26:01.987 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.092 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.092 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.128 187156 DEBUG nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.268 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.269 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.277 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.277 187156 INFO nova.compute.claims [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Claim successful on node compute-1.ctlplane.example.com
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.451 187156 DEBUG nova.compute.provider_tree [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.469 187156 DEBUG nova.scheduler.client.report [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.648 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.379s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.649 187156 DEBUG nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.790 187156 DEBUG nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.790 187156 DEBUG nova.network.neutron [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.819 187156 INFO nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.843 187156 DEBUG nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.981 187156 DEBUG nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.982 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.983 187156 INFO nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Creating image(s)
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.984 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.984 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.985 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:26:04 np0005539504 nova_compute[187152]: 2025-11-29 07:26:04.999 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.055 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.056 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.057 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.068 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.117 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.118 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.151 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.152 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.153 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.209 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.210 187156 DEBUG nova.virt.disk.api [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Checking if we can resize image /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.211 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.270 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.271 187156 DEBUG nova.virt.disk.api [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Cannot resize image /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.271 187156 DEBUG nova.objects.instance [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'migration_context' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.288 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.288 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Ensure instance console log exists: /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.289 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.289 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.289 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.574 187156 DEBUG nova.policy [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:26:05 np0005539504 nova_compute[187152]: 2025-11-29 07:26:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:06 np0005539504 nova_compute[187152]: 2025-11-29 07:26:06.260 187156 DEBUG nova.network.neutron [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Successfully created port: e473974d-ee78-43f1-9e3a-6baa18151417 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:26:06 np0005539504 nova_compute[187152]: 2025-11-29 07:26:06.269 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:06 np0005539504 podman[237817]: 2025-11-29 07:26:06.770059788 +0000 UTC m=+0.083353633 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 29 02:26:06 np0005539504 nova_compute[187152]: 2025-11-29 07:26:06.952 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401151.9503036, 3b61bf63-8328-4d31-93e5-0a19ca27cd63 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:26:06 np0005539504 nova_compute[187152]: 2025-11-29 07:26:06.953 187156 INFO nova.compute.manager [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:26:06 np0005539504 nova_compute[187152]: 2025-11-29 07:26:06.977 187156 DEBUG nova.compute.manager [None req-77c58762-4037-4186-9295-4b0e651310d0 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:06 np0005539504 nova_compute[187152]: 2025-11-29 07:26:06.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:07 np0005539504 nova_compute[187152]: 2025-11-29 07:26:07.518 187156 DEBUG nova.network.neutron [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Successfully updated port: e473974d-ee78-43f1-9e3a-6baa18151417 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:26:07 np0005539504 nova_compute[187152]: 2025-11-29 07:26:07.558 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:07 np0005539504 nova_compute[187152]: 2025-11-29 07:26:07.559 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquired lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:07 np0005539504 nova_compute[187152]: 2025-11-29 07:26:07.559 187156 DEBUG nova.network.neutron [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:26:07 np0005539504 nova_compute[187152]: 2025-11-29 07:26:07.648 187156 DEBUG nova.compute.manager [req-027b5438-e5e5-4844-b6f4-4b04a755f71d req-7215372e-a83f-4cab-84c5-c8b874b73556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-changed-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:07 np0005539504 nova_compute[187152]: 2025-11-29 07:26:07.649 187156 DEBUG nova.compute.manager [req-027b5438-e5e5-4844-b6f4-4b04a755f71d req-7215372e-a83f-4cab-84c5-c8b874b73556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Refreshing instance network info cache due to event network-changed-e473974d-ee78-43f1-9e3a-6baa18151417. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:26:07 np0005539504 nova_compute[187152]: 2025-11-29 07:26:07.649 187156 DEBUG oslo_concurrency.lockutils [req-027b5438-e5e5-4844-b6f4-4b04a755f71d req-7215372e-a83f-4cab-84c5-c8b874b73556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:07 np0005539504 nova_compute[187152]: 2025-11-29 07:26:07.843 187156 DEBUG nova.network.neutron [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:26:08 np0005539504 nova_compute[187152]: 2025-11-29 07:26:08.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.774 187156 DEBUG nova.network.neutron [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Updating instance_info_cache with network_info: [{"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.802 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Releasing lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.802 187156 DEBUG nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance network_info: |[{"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.803 187156 DEBUG oslo_concurrency.lockutils [req-027b5438-e5e5-4844-b6f4-4b04a755f71d req-7215372e-a83f-4cab-84c5-c8b874b73556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.804 187156 DEBUG nova.network.neutron [req-027b5438-e5e5-4844-b6f4-4b04a755f71d req-7215372e-a83f-4cab-84c5-c8b874b73556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Refreshing network info cache for port e473974d-ee78-43f1-9e3a-6baa18151417 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.809 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Start _get_guest_xml network_info=[{"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.816 187156 WARNING nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.822 187156 DEBUG nova.virt.libvirt.host [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.824 187156 DEBUG nova.virt.libvirt.host [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.828 187156 DEBUG nova.virt.libvirt.host [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.829 187156 DEBUG nova.virt.libvirt.host [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.831 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.832 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.833 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.833 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.834 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.834 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.835 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.836 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.836 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.837 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.837 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.838 187156 DEBUG nova.virt.hardware [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.845 187156 DEBUG nova.virt.libvirt.vif [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:26:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1152076978',display_name='tempest-ServerStableDeviceRescueTest-server-1152076978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1152076978',id=130,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-ujcqh7p9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name
='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:26:04Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=f63eb3ac-909c-46ed-b7ee-2e4f04b0998c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.846 187156 DEBUG nova.network.os_vif_util [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.848 187156 DEBUG nova.network.os_vif_util [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=e473974d-ee78-43f1-9e3a-6baa18151417,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape473974d-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.849 187156 DEBUG nova.objects.instance [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.866 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <uuid>f63eb3ac-909c-46ed-b7ee-2e4f04b0998c</uuid>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <name>instance-00000082</name>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1152076978</nova:name>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:26:09</nova:creationTime>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:        <nova:user uuid="5be41a8530314f83bbecbb74b9276f2d">tempest-ServerStableDeviceRescueTest-2012111838-project-member</nova:user>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:        <nova:project uuid="ac3bb322fa744e099b38e08abe12d0e2">tempest-ServerStableDeviceRescueTest-2012111838</nova:project>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:        <nova:port uuid="e473974d-ee78-43f1-9e3a-6baa18151417">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <entry name="serial">f63eb3ac-909c-46ed-b7ee-2e4f04b0998c</entry>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <entry name="uuid">f63eb3ac-909c-46ed-b7ee-2e4f04b0998c</entry>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:71:58:b9"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <target dev="tape473974d-ee"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/console.log" append="off"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:26:09 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:26:09 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:26:09 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:26:09 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.868 187156 DEBUG nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Preparing to wait for external event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.868 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.868 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.869 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.870 187156 DEBUG nova.virt.libvirt.vif [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:26:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1152076978',display_name='tempest-ServerStableDeviceRescueTest-server-1152076978',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1152076978',id=130,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-ujcqh7p9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner
_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:26:04Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=f63eb3ac-909c-46ed-b7ee-2e4f04b0998c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.870 187156 DEBUG nova.network.os_vif_util [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.871 187156 DEBUG nova.network.os_vif_util [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=e473974d-ee78-43f1-9e3a-6baa18151417,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape473974d-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.871 187156 DEBUG os_vif [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=e473974d-ee78-43f1-9e3a-6baa18151417,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape473974d-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.872 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.873 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.873 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.877 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.878 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape473974d-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.878 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape473974d-ee, col_values=(('external_ids', {'iface-id': 'e473974d-ee78-43f1-9e3a-6baa18151417', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:58:b9', 'vm-uuid': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.903 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:09 np0005539504 NetworkManager[55210]: <info>  [1764401169.9044] manager: (tape473974d-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.906 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.910 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.911 187156 INFO os_vif [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=e473974d-ee78-43f1-9e3a-6baa18151417,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape473974d-ee')#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.983 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.984 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.984 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No VIF found with MAC fa:16:3e:71:58:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:26:09 np0005539504 nova_compute[187152]: 2025-11-29 07:26:09.985 187156 INFO nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Using config drive#033[00m
Nov 29 02:26:10 np0005539504 nova_compute[187152]: 2025-11-29 07:26:10.556 187156 INFO nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Creating config drive at /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config#033[00m
Nov 29 02:26:10 np0005539504 nova_compute[187152]: 2025-11-29 07:26:10.562 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn8t7rhb0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:10 np0005539504 nova_compute[187152]: 2025-11-29 07:26:10.685 187156 DEBUG oslo_concurrency.processutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn8t7rhb0" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:10 np0005539504 kernel: tape473974d-ee: entered promiscuous mode
Nov 29 02:26:10 np0005539504 NetworkManager[55210]: <info>  [1764401170.7708] manager: (tape473974d-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/235)
Nov 29 02:26:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:10Z|00507|binding|INFO|Claiming lport e473974d-ee78-43f1-9e3a-6baa18151417 for this chassis.
Nov 29 02:26:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:10Z|00508|binding|INFO|e473974d-ee78-43f1-9e3a-6baa18151417: Claiming fa:16:3e:71:58:b9 10.100.0.13
Nov 29 02:26:10 np0005539504 nova_compute[187152]: 2025-11-29 07:26:10.773 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.784 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:58:b9 10.100.0.13'], port_security=['fa:16:3e:71:58:b9 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=e473974d-ee78-43f1-9e3a-6baa18151417) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.785 104164 INFO neutron.agent.ovn.metadata.agent [-] Port e473974d-ee78-43f1-9e3a-6baa18151417 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 bound to our chassis#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.788 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39#033[00m
Nov 29 02:26:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:10Z|00509|binding|INFO|Setting lport e473974d-ee78-43f1-9e3a-6baa18151417 ovn-installed in OVS
Nov 29 02:26:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:10Z|00510|binding|INFO|Setting lport e473974d-ee78-43f1-9e3a-6baa18151417 up in Southbound
Nov 29 02:26:10 np0005539504 nova_compute[187152]: 2025-11-29 07:26:10.798 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:10 np0005539504 systemd-udevd[237859]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.807 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7b37dcb9-979c-4bb6-af66-19a119ec55c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:10 np0005539504 systemd-machined[153423]: New machine qemu-67-instance-00000082.
Nov 29 02:26:10 np0005539504 NetworkManager[55210]: <info>  [1764401170.8336] device (tape473974d-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:26:10 np0005539504 systemd[1]: Started Virtual Machine qemu-67-instance-00000082.
Nov 29 02:26:10 np0005539504 NetworkManager[55210]: <info>  [1764401170.8371] device (tape473974d-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.839 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0924638b-562b-49b4-850d-9051021a6486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.843 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[27666896-1b54-4677-a7f4-a264435cfa1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.875 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[bfa6eaa8-e97c-422a-a0aa-3fb62feeb9ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.896 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[563bacd3-98ed-407a-ad0b-231456eecc0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662200, 'reachable_time': 18152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237875, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.912 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a80b33d8-f410-48ac-845b-475b8aab5168]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662213, 'tstamp': 662213}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237876, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662217, 'tstamp': 662217}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237876, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.914 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:10 np0005539504 nova_compute[187152]: 2025-11-29 07:26:10.915 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:10 np0005539504 nova_compute[187152]: 2025-11-29 07:26:10.916 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.919 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.919 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.919 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:10.920 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:26:10 np0005539504 nova_compute[187152]: 2025-11-29 07:26:10.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:10 np0005539504 nova_compute[187152]: 2025-11-29 07:26:10.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.272 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.695 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401171.6944294, f63eb3ac-909c-46ed-b7ee-2e4f04b0998c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.696 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] VM Started (Lifecycle Event)#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.908 187156 DEBUG nova.compute.manager [req-cc98df1f-745c-4021-9511-42f075f36a5e req-61faab51-b0d5-4d62-8bca-60eb96e8143d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.909 187156 DEBUG oslo_concurrency.lockutils [req-cc98df1f-745c-4021-9511-42f075f36a5e req-61faab51-b0d5-4d62-8bca-60eb96e8143d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.910 187156 DEBUG oslo_concurrency.lockutils [req-cc98df1f-745c-4021-9511-42f075f36a5e req-61faab51-b0d5-4d62-8bca-60eb96e8143d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.911 187156 DEBUG oslo_concurrency.lockutils [req-cc98df1f-745c-4021-9511-42f075f36a5e req-61faab51-b0d5-4d62-8bca-60eb96e8143d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.911 187156 DEBUG nova.compute.manager [req-cc98df1f-745c-4021-9511-42f075f36a5e req-61faab51-b0d5-4d62-8bca-60eb96e8143d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Processing event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.913 187156 DEBUG nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.917 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.922 187156 INFO nova.virt.libvirt.driver [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance spawned successfully.#033[00m
Nov 29 02:26:11 np0005539504 nova_compute[187152]: 2025-11-29 07:26:11.922 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.404 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.407 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:26:12 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:12Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:25:aa 10.100.0.4
Nov 29 02:26:12 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:12Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:25:aa 10.100.0.4
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.678 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.679 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.680 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.680 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.681 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.681 187156 DEBUG nova.virt.libvirt.driver [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.730 187156 DEBUG nova.network.neutron [req-027b5438-e5e5-4844-b6f4-4b04a755f71d req-7215372e-a83f-4cab-84c5-c8b874b73556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Updated VIF entry in instance network info cache for port e473974d-ee78-43f1-9e3a-6baa18151417. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:26:12 np0005539504 nova_compute[187152]: 2025-11-29 07:26:12.731 187156 DEBUG nova.network.neutron [req-027b5438-e5e5-4844-b6f4-4b04a755f71d req-7215372e-a83f-4cab-84c5-c8b874b73556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Updating instance_info_cache with network_info: [{"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:14 np0005539504 nova_compute[187152]: 2025-11-29 07:26:14.479 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:26:14 np0005539504 nova_compute[187152]: 2025-11-29 07:26:14.480 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401171.6959486, f63eb3ac-909c-46ed-b7ee-2e4f04b0998c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:26:14 np0005539504 nova_compute[187152]: 2025-11-29 07:26:14.480 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:26:14 np0005539504 nova_compute[187152]: 2025-11-29 07:26:14.825 187156 DEBUG oslo_concurrency.lockutils [req-027b5438-e5e5-4844-b6f4-4b04a755f71d req-7215372e-a83f-4cab-84c5-c8b874b73556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:14 np0005539504 nova_compute[187152]: 2025-11-29 07:26:14.903 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:14 np0005539504 nova_compute[187152]: 2025-11-29 07:26:14.918 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:14 np0005539504 nova_compute[187152]: 2025-11-29 07:26:14.928 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401171.9166243, f63eb3ac-909c-46ed-b7ee-2e4f04b0998c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:26:14 np0005539504 nova_compute[187152]: 2025-11-29 07:26:14.928 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:26:14 np0005539504 nova_compute[187152]: 2025-11-29 07:26:14.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:15 np0005539504 nova_compute[187152]: 2025-11-29 07:26:15.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:16 np0005539504 nova_compute[187152]: 2025-11-29 07:26:16.310 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:17 np0005539504 podman[237886]: 2025-11-29 07:26:17.741244216 +0000 UTC m=+0.077360233 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:26:17 np0005539504 podman[237887]: 2025-11-29 07:26:17.745210703 +0000 UTC m=+0.069746800 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Nov 29 02:26:17 np0005539504 podman[237888]: 2025-11-29 07:26:17.781214578 +0000 UTC m=+0.094294848 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:26:19 np0005539504 nova_compute[187152]: 2025-11-29 07:26:19.953 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.010 187156 DEBUG nova.compute.manager [req-d74a9d98-58ab-48be-a3db-a754a6710f55 req-68be48a8-9894-464f-8d28-d6e397dfe903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.011 187156 DEBUG oslo_concurrency.lockutils [req-d74a9d98-58ab-48be-a3db-a754a6710f55 req-68be48a8-9894-464f-8d28-d6e397dfe903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.011 187156 DEBUG oslo_concurrency.lockutils [req-d74a9d98-58ab-48be-a3db-a754a6710f55 req-68be48a8-9894-464f-8d28-d6e397dfe903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.012 187156 DEBUG oslo_concurrency.lockutils [req-d74a9d98-58ab-48be-a3db-a754a6710f55 req-68be48a8-9894-464f-8d28-d6e397dfe903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.012 187156 DEBUG nova.compute.manager [req-d74a9d98-58ab-48be-a3db-a754a6710f55 req-68be48a8-9894-464f-8d28-d6e397dfe903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.013 187156 WARNING nova.compute.manager [req-d74a9d98-58ab-48be-a3db-a754a6710f55 req-68be48a8-9894-464f-8d28-d6e397dfe903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.054 187156 INFO nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Took 15.07 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.054 187156 DEBUG nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.072 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.072 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.073 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.073 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.081 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:20 np0005539504 nova_compute[187152]: 2025-11-29 07:26:20.088 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:26:21 np0005539504 nova_compute[187152]: 2025-11-29 07:26:21.312 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:22 np0005539504 nova_compute[187152]: 2025-11-29 07:26:22.662 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:26:22 np0005539504 podman[237955]: 2025-11-29 07:26:22.815945166 +0000 UTC m=+0.154395947 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:26:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:22.966 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:22.968 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:22.969 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:23 np0005539504 podman[237982]: 2025-11-29 07:26:23.769487953 +0000 UTC m=+0.112058253 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:26:24 np0005539504 nova_compute[187152]: 2025-11-29 07:26:24.881 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:25 np0005539504 nova_compute[187152]: 2025-11-29 07:26:25.010 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.146 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json" returned: 0 in 1.265s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.147 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.226 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.238 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.323 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.325 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.357 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.411 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.418 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.493 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.495 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.558 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.763 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.764 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5183MB free_disk=73.09905624389648GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.764 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:26 np0005539504 nova_compute[187152]: 2025-11-29 07:26:26.764 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:29 np0005539504 nova_compute[187152]: 2025-11-29 07:26:29.620 187156 INFO nova.compute.manager [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Took 25.39 seconds to build instance.#033[00m
Nov 29 02:26:29 np0005539504 nova_compute[187152]: 2025-11-29 07:26:29.695 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance a047dabb-8e55-4bea-92aa-20b191da7b54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:26:29 np0005539504 nova_compute[187152]: 2025-11-29 07:26:29.695 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 220b7865-2248-43ba-865a-b2314b5a6e47 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:26:29 np0005539504 nova_compute[187152]: 2025-11-29 07:26:29.695 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance f63eb3ac-909c-46ed-b7ee-2e4f04b0998c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:26:29 np0005539504 nova_compute[187152]: 2025-11-29 07:26:29.696 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:26:29 np0005539504 nova_compute[187152]: 2025-11-29 07:26:29.696 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:26:29 np0005539504 podman[238038]: 2025-11-29 07:26:29.749417716 +0000 UTC m=+0.080246951 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:26:29 np0005539504 nova_compute[187152]: 2025-11-29 07:26:29.922 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:26:30 np0005539504 nova_compute[187152]: 2025-11-29 07:26:30.083 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:30 np0005539504 nova_compute[187152]: 2025-11-29 07:26:30.746 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:26:30 np0005539504 nova_compute[187152]: 2025-11-29 07:26:30.838 187156 DEBUG oslo_concurrency.lockutils [None req-b3ff39f7-0d50-4538-8c72-253425c7f9a8 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.746s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:31 np0005539504 nova_compute[187152]: 2025-11-29 07:26:31.352 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:33 np0005539504 nova_compute[187152]: 2025-11-29 07:26:33.356 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:26:33 np0005539504 nova_compute[187152]: 2025-11-29 07:26:33.357 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:34 np0005539504 nova_compute[187152]: 2025-11-29 07:26:34.358 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:35 np0005539504 nova_compute[187152]: 2025-11-29 07:26:35.099 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:35 np0005539504 nova_compute[187152]: 2025-11-29 07:26:35.287 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:35 np0005539504 nova_compute[187152]: 2025-11-29 07:26:35.287 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:26:35 np0005539504 nova_compute[187152]: 2025-11-29 07:26:35.287 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:26:36 np0005539504 nova_compute[187152]: 2025-11-29 07:26:36.229 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:36 np0005539504 nova_compute[187152]: 2025-11-29 07:26:36.230 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:36 np0005539504 nova_compute[187152]: 2025-11-29 07:26:36.230 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:26:36 np0005539504 nova_compute[187152]: 2025-11-29 07:26:36.231 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid a047dabb-8e55-4bea-92aa-20b191da7b54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:36Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:58:b9 10.100.0.13
Nov 29 02:26:36 np0005539504 nova_compute[187152]: 2025-11-29 07:26:36.387 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:36Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:58:b9 10.100.0.13
Nov 29 02:26:37 np0005539504 nova_compute[187152]: 2025-11-29 07:26:37.406 187156 DEBUG nova.compute.manager [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:37 np0005539504 podman[238066]: 2025-11-29 07:26:37.742830473 +0000 UTC m=+0.073593722 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:26:37 np0005539504 nova_compute[187152]: 2025-11-29 07:26:37.797 187156 INFO nova.compute.manager [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] instance snapshotting#033[00m
Nov 29 02:26:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:37.841 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:26:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:37.847 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:26:37 np0005539504 nova_compute[187152]: 2025-11-29 07:26:37.849 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:38 np0005539504 nova_compute[187152]: 2025-11-29 07:26:38.340 187156 INFO nova.virt.libvirt.driver [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Beginning live snapshot process#033[00m
Nov 29 02:26:38 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 02:26:38 np0005539504 nova_compute[187152]: 2025-11-29 07:26:38.723 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:38 np0005539504 nova_compute[187152]: 2025-11-29 07:26:38.807 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json -f qcow2" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:38 np0005539504 nova_compute[187152]: 2025-11-29 07:26:38.808 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:38.851 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:38 np0005539504 nova_compute[187152]: 2025-11-29 07:26:38.866 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:38 np0005539504 nova_compute[187152]: 2025-11-29 07:26:38.891 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:38 np0005539504 nova_compute[187152]: 2025-11-29 07:26:38.950 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:38 np0005539504 nova_compute[187152]: 2025-11-29 07:26:38.952 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpnocgvper/4b36428dde9642aebb0613e6ce2c7d6b.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:39 np0005539504 nova_compute[187152]: 2025-11-29 07:26:39.138 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpnocgvper/4b36428dde9642aebb0613e6ce2c7d6b.delta 1073741824" returned: 0 in 0.186s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:39 np0005539504 nova_compute[187152]: 2025-11-29 07:26:39.141 187156 INFO nova.virt.libvirt.driver [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 29 02:26:39 np0005539504 nova_compute[187152]: 2025-11-29 07:26:39.204 187156 DEBUG nova.virt.libvirt.guest [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] COPY block job progress, current cursor: 0 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:26:39 np0005539504 nova_compute[187152]: 2025-11-29 07:26:39.708 187156 DEBUG nova.virt.libvirt.guest [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] COPY block job progress, current cursor: 75235328 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:26:39 np0005539504 nova_compute[187152]: 2025-11-29 07:26:39.712 187156 INFO nova.virt.libvirt.driver [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 29 02:26:40 np0005539504 nova_compute[187152]: 2025-11-29 07:26:40.102 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:40 np0005539504 nova_compute[187152]: 2025-11-29 07:26:40.210 187156 DEBUG nova.privsep.utils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:26:40 np0005539504 nova_compute[187152]: 2025-11-29 07:26:40.211 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpnocgvper/4b36428dde9642aebb0613e6ce2c7d6b.delta /var/lib/nova/instances/snapshots/tmpnocgvper/4b36428dde9642aebb0613e6ce2c7d6b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:41 np0005539504 nova_compute[187152]: 2025-11-29 07:26:41.391 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:41 np0005539504 nova_compute[187152]: 2025-11-29 07:26:41.816 187156 DEBUG oslo_concurrency.processutils [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpnocgvper/4b36428dde9642aebb0613e6ce2c7d6b.delta /var/lib/nova/instances/snapshots/tmpnocgvper/4b36428dde9642aebb0613e6ce2c7d6b" returned: 0 in 1.605s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:41 np0005539504 nova_compute[187152]: 2025-11-29 07:26:41.823 187156 INFO nova.virt.libvirt.driver [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:26:43 np0005539504 nova_compute[187152]: 2025-11-29 07:26:43.340 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updating instance_info_cache with network_info: [{"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:43 np0005539504 nova_compute[187152]: 2025-11-29 07:26:43.367 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:43 np0005539504 nova_compute[187152]: 2025-11-29 07:26:43.368 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:26:43 np0005539504 nova_compute[187152]: 2025-11-29 07:26:43.368 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:43 np0005539504 nova_compute[187152]: 2025-11-29 07:26:43.368 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.296 187156 DEBUG oslo_concurrency.lockutils [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.297 187156 DEBUG oslo_concurrency.lockutils [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.297 187156 DEBUG oslo_concurrency.lockutils [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.298 187156 DEBUG oslo_concurrency.lockutils [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.298 187156 DEBUG oslo_concurrency.lockutils [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.311 187156 INFO nova.compute.manager [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Terminating instance#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.323 187156 DEBUG nova.compute.manager [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:26:44 np0005539504 kernel: tap117c88c8-f8 (unregistering): left promiscuous mode
Nov 29 02:26:44 np0005539504 NetworkManager[55210]: <info>  [1764401204.3566] device (tap117c88c8-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:26:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:44Z|00511|binding|INFO|Releasing lport 117c88c8-f8df-49f6-aa22-1c554973f1ad from this chassis (sb_readonly=0)
Nov 29 02:26:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:44Z|00512|binding|INFO|Setting lport 117c88c8-f8df-49f6-aa22-1c554973f1ad down in Southbound
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.364 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:44Z|00513|binding|INFO|Removing iface tap117c88c8-f8 ovn-installed in OVS
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.366 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.377 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:ae:fa 10.100.0.5'], port_security=['fa:16:3e:6f:ae:fa 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a047dabb-8e55-4bea-92aa-20b191da7b54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dcb73a4e-e43c-4221-95d7-295071c2bea0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5c6bb94-c536-451b-a4cb-db984bf0cbdf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=117c88c8-f8df-49f6-aa22-1c554973f1ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.379 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.379 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 117c88c8-f8df-49f6-aa22-1c554973f1ad in datapath ae86c83f-be5a-4cd0-9064-11898ee2fcef unbound from our chassis#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.381 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae86c83f-be5a-4cd0-9064-11898ee2fcef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.383 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[832d26c0-7616-4335-a3ee-f4435cea55a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.384 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef namespace which is not needed anymore#033[00m
Nov 29 02:26:44 np0005539504 kernel: tapab95b3bf-94 (unregistering): left promiscuous mode
Nov 29 02:26:44 np0005539504 NetworkManager[55210]: <info>  [1764401204.3970] device (tapab95b3bf-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:26:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:44Z|00514|binding|INFO|Releasing lport ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 from this chassis (sb_readonly=0)
Nov 29 02:26:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:44Z|00515|binding|INFO|Setting lport ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 down in Southbound
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.406 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:44Z|00516|binding|INFO|Removing iface tapab95b3bf-94 ovn-installed in OVS
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.409 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.424 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.421 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:0e:3d 2001:db8::f816:3eff:fe0d:e3d'], port_security=['fa:16:3e:0d:0e:3d 2001:db8::f816:3eff:fe0d:e3d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe0d:e3d/64', 'neutron:device_id': 'a047dabb-8e55-4bea-92aa-20b191da7b54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3d94aff-5439-43d3-a356-7aafae582344', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dcb73a4e-e43c-4221-95d7-295071c2bea0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=890f979e-778b-42a4-aff1-be3795cfb05f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:26:44 np0005539504 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Nov 29 02:26:44 np0005539504 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007c.scope: Consumed 17.980s CPU time.
Nov 29 02:26:44 np0005539504 systemd-machined[153423]: Machine qemu-62-instance-0000007c terminated.
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[236510]: [NOTICE]   (236514) : haproxy version is 2.8.14-c23fe91
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[236510]: [NOTICE]   (236514) : path to executable is /usr/sbin/haproxy
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[236510]: [WARNING]  (236514) : Exiting Master process...
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[236510]: [WARNING]  (236514) : Exiting Master process...
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[236510]: [ALERT]    (236514) : Current worker (236516) exited with code 143 (Terminated)
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[236510]: [WARNING]  (236514) : All workers exited. Exiting... (0)
Nov 29 02:26:44 np0005539504 systemd[1]: libpod-0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23.scope: Deactivated successfully.
Nov 29 02:26:44 np0005539504 podman[238144]: 2025-11-29 07:26:44.53705129 +0000 UTC m=+0.055307283 container died 0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:26:44 np0005539504 NetworkManager[55210]: <info>  [1764401204.5604] manager: (tapab95b3bf-94): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Nov 29 02:26:44 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23-userdata-shm.mount: Deactivated successfully.
Nov 29 02:26:44 np0005539504 systemd[1]: var-lib-containers-storage-overlay-d27864e328470048392ae6b197bb2117a006b53ebef2ee204cd193f0ca4074a2-merged.mount: Deactivated successfully.
Nov 29 02:26:44 np0005539504 podman[238144]: 2025-11-29 07:26:44.580888974 +0000 UTC m=+0.099144967 container cleanup 0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:26:44 np0005539504 systemd[1]: libpod-conmon-0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23.scope: Deactivated successfully.
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.616 187156 INFO nova.virt.libvirt.driver [-] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Instance destroyed successfully.#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.617 187156 DEBUG nova.objects.instance [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid a047dabb-8e55-4bea-92aa-20b191da7b54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.632 187156 DEBUG nova.virt.libvirt.vif [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:24:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-107191372',display_name='tempest-TestGettingAddress-server-107191372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-107191372',id=124,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-umysq0th',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:06Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=a047dabb-8e55-4bea-92aa-20b191da7b54,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.633 187156 DEBUG nova.network.os_vif_util [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.634 187156 DEBUG nova.network.os_vif_util [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ae:fa,bridge_name='br-int',has_traffic_filtering=True,id=117c88c8-f8df-49f6-aa22-1c554973f1ad,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap117c88c8-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.634 187156 DEBUG os_vif [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ae:fa,bridge_name='br-int',has_traffic_filtering=True,id=117c88c8-f8df-49f6-aa22-1c554973f1ad,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap117c88c8-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.636 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.636 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap117c88c8-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.641 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.648 187156 INFO os_vif [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ae:fa,bridge_name='br-int',has_traffic_filtering=True,id=117c88c8-f8df-49f6-aa22-1c554973f1ad,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap117c88c8-f8')#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.649 187156 DEBUG nova.virt.libvirt.vif [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:24:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-107191372',display_name='tempest-TestGettingAddress-server-107191372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-107191372',id=124,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-umysq0th',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:06Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=a047dabb-8e55-4bea-92aa-20b191da7b54,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.650 187156 DEBUG nova.network.os_vif_util [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.651 187156 DEBUG nova.network.os_vif_util [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:0e:3d,bridge_name='br-int',has_traffic_filtering=True,id=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab95b3bf-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.652 187156 DEBUG os_vif [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:0e:3d,bridge_name='br-int',has_traffic_filtering=True,id=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab95b3bf-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.655 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab95b3bf-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.657 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 podman[238197]: 2025-11-29 07:26:44.657634801 +0000 UTC m=+0.047647018 container remove 0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.659 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.662 187156 INFO os_vif [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:0e:3d,bridge_name='br-int',has_traffic_filtering=True,id=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab95b3bf-94')#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.663 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6ea941-f9e7-48b2-b588-07262b158ce7]: (4, ('Sat Nov 29 07:26:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef (0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23)\n0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23\nSat Nov 29 07:26:44 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef (0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23)\n0af3e9162d7f7086bf6f1497c76e30d3e19a9a61db009620f2b2fae600c7de23\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.664 187156 INFO nova.virt.libvirt.driver [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Deleting instance files /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54_del#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.665 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fcea64a4-95c9-4563-b8cd-5cce54eb2657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.665 187156 INFO nova.virt.libvirt.driver [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Deletion of /var/lib/nova/instances/a047dabb-8e55-4bea-92aa-20b191da7b54_del complete#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.666 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae86c83f-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:44 np0005539504 kernel: tapae86c83f-b0: left promiscuous mode
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.670 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 nova_compute[187152]: 2025-11-29 07:26:44.689 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.693 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0c1b5c-0655-428e-a844-c120e32f430a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.710 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8aab953d-f595-442b-84f4-04baaf52e7df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.712 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8842ceaa-23ee-4926-887e-710ff2e22a12]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.730 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4fae736d-30ce-4e93-a797-0559587432e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656876, 'reachable_time': 15447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238219, 'error': None, 'target': 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.734 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.735 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[f018060d-3428-4d61-807a-5a97e14ca98d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.736 104164 INFO neutron.agent.ovn.metadata.agent [-] Port ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 in datapath a3d94aff-5439-43d3-a356-7aafae582344 unbound from our chassis#033[00m
Nov 29 02:26:44 np0005539504 systemd[1]: run-netns-ovnmeta\x2dae86c83f\x2dbe5a\x2d4cd0\x2d9064\x2d11898ee2fcef.mount: Deactivated successfully.
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.740 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3d94aff-5439-43d3-a356-7aafae582344, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.741 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9b06ea21-127a-436b-b6cf-db542ca911c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:44.742 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 namespace which is not needed anymore#033[00m
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[236581]: [NOTICE]   (236585) : haproxy version is 2.8.14-c23fe91
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[236581]: [NOTICE]   (236585) : path to executable is /usr/sbin/haproxy
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[236581]: [WARNING]  (236585) : Exiting Master process...
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[236581]: [WARNING]  (236585) : Exiting Master process...
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[236581]: [ALERT]    (236585) : Current worker (236587) exited with code 143 (Terminated)
Nov 29 02:26:44 np0005539504 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[236581]: [WARNING]  (236585) : All workers exited. Exiting... (0)
Nov 29 02:26:44 np0005539504 systemd[1]: libpod-a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da.scope: Deactivated successfully.
Nov 29 02:26:44 np0005539504 podman[238235]: 2025-11-29 07:26:44.946784228 +0000 UTC m=+0.121443895 container died a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:26:45 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da-userdata-shm.mount: Deactivated successfully.
Nov 29 02:26:45 np0005539504 systemd[1]: var-lib-containers-storage-overlay-38cfa709225e42a9452ba85f8dd7070023ccfa6d1e86e17544f3e6cf9f15ed91-merged.mount: Deactivated successfully.
Nov 29 02:26:45 np0005539504 podman[238235]: 2025-11-29 07:26:45.022357452 +0000 UTC m=+0.197017119 container cleanup a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:26:45 np0005539504 systemd[1]: libpod-conmon-a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da.scope: Deactivated successfully.
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.047 187156 INFO nova.compute.manager [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Took 0.72 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.048 187156 DEBUG oslo.service.loopingcall [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.048 187156 DEBUG nova.compute.manager [-] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.049 187156 DEBUG nova.network.neutron [-] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:26:45 np0005539504 podman[238266]: 2025-11-29 07:26:45.087122248 +0000 UTC m=+0.042460400 container remove a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:26:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:45.093 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0e1b97-72f0-4a2d-8861-7cb732b4075e]: (4, ('Sat Nov 29 07:26:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 (a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da)\na8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da\nSat Nov 29 07:26:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 (a8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da)\na8dadde18891350fe1d9b8c91da3d99191cbb3f681ce5115855ad31d9b17e2da\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:45.096 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bcf86edf-3aca-440e-b1af-e4be33cfe98d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:45.098 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3d94aff-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:45 np0005539504 kernel: tapa3d94aff-50: left promiscuous mode
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.101 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:45.107 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50fedfd4-e008-4e69-8790-970008e2baa4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.115 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:45.123 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a900c854-373b-4b45-a14a-139b4a2f21f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:45.125 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[85e2468f-4076-4c53-9354-4dff0ee09fe0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:45.144 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e0b3eeae-8582-414b-85d5-c2ddc4022ca3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 656967, 'reachable_time': 22808, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238281, 'error': None, 'target': 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:45.147 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:26:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:45.147 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[5d26387c-4013-43d3-abc4-e3c92bf81e2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.202 187156 DEBUG nova.compute.manager [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-changed-117c88c8-f8df-49f6-aa22-1c554973f1ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.203 187156 DEBUG nova.compute.manager [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Refreshing instance network info cache due to event network-changed-117c88c8-f8df-49f6-aa22-1c554973f1ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.203 187156 DEBUG oslo_concurrency.lockutils [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.203 187156 DEBUG oslo_concurrency.lockutils [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.204 187156 DEBUG nova.network.neutron [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Refreshing network info cache for port 117c88c8-f8df-49f6-aa22-1c554973f1ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.366 187156 DEBUG nova.compute.manager [req-56bd3cbf-cff0-48a1-8d84-6b2b5a3ae563 req-aa229163-5dfa-4f36-9400-11548ccd9af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-unplugged-117c88c8-f8df-49f6-aa22-1c554973f1ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.367 187156 DEBUG oslo_concurrency.lockutils [req-56bd3cbf-cff0-48a1-8d84-6b2b5a3ae563 req-aa229163-5dfa-4f36-9400-11548ccd9af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.367 187156 DEBUG oslo_concurrency.lockutils [req-56bd3cbf-cff0-48a1-8d84-6b2b5a3ae563 req-aa229163-5dfa-4f36-9400-11548ccd9af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.367 187156 DEBUG oslo_concurrency.lockutils [req-56bd3cbf-cff0-48a1-8d84-6b2b5a3ae563 req-aa229163-5dfa-4f36-9400-11548ccd9af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.367 187156 DEBUG nova.compute.manager [req-56bd3cbf-cff0-48a1-8d84-6b2b5a3ae563 req-aa229163-5dfa-4f36-9400-11548ccd9af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] No waiting events found dispatching network-vif-unplugged-117c88c8-f8df-49f6-aa22-1c554973f1ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.368 187156 DEBUG nova.compute.manager [req-56bd3cbf-cff0-48a1-8d84-6b2b5a3ae563 req-aa229163-5dfa-4f36-9400-11548ccd9af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-unplugged-117c88c8-f8df-49f6-aa22-1c554973f1ad for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:26:45 np0005539504 systemd[1]: run-netns-ovnmeta\x2da3d94aff\x2d5439\x2d43d3\x2da356\x2d7aafae582344.mount: Deactivated successfully.
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.773 187156 INFO nova.virt.libvirt.driver [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Snapshot image upload complete#033[00m
Nov 29 02:26:45 np0005539504 nova_compute[187152]: 2025-11-29 07:26:45.773 187156 INFO nova.compute.manager [None req-985c786a-7aa0-40fb-bc81-3f85a6a28063 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Took 7.97 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 02:26:46 np0005539504 nova_compute[187152]: 2025-11-29 07:26:46.394 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.520 187156 DEBUG nova.compute.manager [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-unplugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.521 187156 DEBUG oslo_concurrency.lockutils [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.521 187156 DEBUG oslo_concurrency.lockutils [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.522 187156 DEBUG oslo_concurrency.lockutils [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.522 187156 DEBUG nova.compute.manager [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] No waiting events found dispatching network-vif-unplugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.522 187156 DEBUG nova.compute.manager [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-unplugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.522 187156 DEBUG nova.compute.manager [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-plugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.523 187156 DEBUG oslo_concurrency.lockutils [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.523 187156 DEBUG oslo_concurrency.lockutils [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.523 187156 DEBUG oslo_concurrency.lockutils [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.524 187156 DEBUG nova.compute.manager [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] No waiting events found dispatching network-vif-plugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.524 187156 WARNING nova.compute.manager [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received unexpected event network-vif-plugged-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.524 187156 DEBUG nova.compute.manager [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-deleted-117c88c8-f8df-49f6-aa22-1c554973f1ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.525 187156 INFO nova.compute.manager [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Neutron deleted interface 117c88c8-f8df-49f6-aa22-1c554973f1ad; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.526 187156 DEBUG nova.network.neutron [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updating instance_info_cache with network_info: [{"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.558 187156 DEBUG nova.compute.manager [req-bf439f6b-2b35-4927-81ca-af8454bb09f8 req-45d773a2-c677-43e5-b140-5a72957cb8e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Detach interface failed, port_id=117c88c8-f8df-49f6-aa22-1c554973f1ad, reason: Instance a047dabb-8e55-4bea-92aa-20b191da7b54 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.654 187156 DEBUG nova.compute.manager [req-b0ebf5de-58bc-40eb-a994-535793e073fd req-c94068fb-dbe4-43e3-ada5-df057bd2f1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-plugged-117c88c8-f8df-49f6-aa22-1c554973f1ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.655 187156 DEBUG oslo_concurrency.lockutils [req-b0ebf5de-58bc-40eb-a994-535793e073fd req-c94068fb-dbe4-43e3-ada5-df057bd2f1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.655 187156 DEBUG oslo_concurrency.lockutils [req-b0ebf5de-58bc-40eb-a994-535793e073fd req-c94068fb-dbe4-43e3-ada5-df057bd2f1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.656 187156 DEBUG oslo_concurrency.lockutils [req-b0ebf5de-58bc-40eb-a994-535793e073fd req-c94068fb-dbe4-43e3-ada5-df057bd2f1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.656 187156 DEBUG nova.compute.manager [req-b0ebf5de-58bc-40eb-a994-535793e073fd req-c94068fb-dbe4-43e3-ada5-df057bd2f1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] No waiting events found dispatching network-vif-plugged-117c88c8-f8df-49f6-aa22-1c554973f1ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:47 np0005539504 nova_compute[187152]: 2025-11-29 07:26:47.656 187156 WARNING nova.compute.manager [req-b0ebf5de-58bc-40eb-a994-535793e073fd req-c94068fb-dbe4-43e3-ada5-df057bd2f1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received unexpected event network-vif-plugged-117c88c8-f8df-49f6-aa22-1c554973f1ad for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:26:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:47.976 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000082', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'hostId': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:26:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:47.979 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'hostId': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:26:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:47.979 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.012 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.014 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.021 187156 DEBUG nova.network.neutron [-] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.043 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.write.requests volume: 37 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.044 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ebad560-f707-4bd0-b830-55ed9e3532bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-vda', 'timestamp': '2025-11-29T07:26:47.979899', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3f4409e-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': 'ee96ede778c5c5b489ceef2808980768ae071072daeaaedd38d5e4f94078d274'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 
'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-sda', 'timestamp': '2025-11-29T07:26:47.979899', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3f46416-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': 'e83634dc0c7c0454609efa41aa3d602ccb5d4a59c6383f2265f417d73583ef63'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 37, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-vda', 'timestamp': '2025-11-29T07:26:47.979899', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3f8eb9e-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': '33bbdd462fdb7730ed30ddddd8b33f694d3713326aa08bf220dc78d94c956caf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-sda', 'timestamp': '2025-11-29T07:26:47.979899', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3f8ff80-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': 'bbb2995cb1cb3d04e039e463071d89fc17fd37594edf67200782aa5399720bd3'}]}, 'timestamp': '2025-11-29 07:26:48.044960', '_unique_id': '766019b9af8c4706939aa4faca4b2f5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.048 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.051 187156 INFO nova.compute.manager [-] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Took 3.00 seconds to deallocate network for instance.
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.062 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.062 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.074 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.075 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '583e7e05-ec17-4af8-ab77-8847fb3b1017', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-vda', 'timestamp': '2025-11-29T07:26:48.050570', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3fbb8c4-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.985294705, 'message_signature': '8bc7ab00efbfaa4bfd83964741d7cb888dd5c1130891be219d50546215b714ee'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 
'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-sda', 'timestamp': '2025-11-29T07:26:48.050570', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3fbc7d8-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.985294705, 'message_signature': '04ed30ad5f68a9409514f2b2f7fc67ff6f0ac6f2a22c279d2abe65b739d32d10'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-vda', 'timestamp': '2025-11-29T07:26:48.050570', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3fd969e-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.997816041, 'message_signature': '6e04a114bab63b018f933e0f9c9d5bd48f4e1c6219aa6d5555f93a70e9c8e320'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-sda', 'timestamp': '2025-11-29T07:26:48.050570', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3fda2c4-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.997816041, 'message_signature': '8ff3aed875793a911c115e1d737410674aea801df9b4c5bc6291cb1660f5a30c'}]}, 'timestamp': '2025-11-29 07:26:48.075251', '_unique_id': '5edb0016baf445e5b3e09137f7a1ecbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.079 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f63eb3ac-909c-46ed-b7ee-2e4f04b0998c / tape473974d-ee inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.080 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.082 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 220b7865-2248-43ba-865a-b2314b5a6e47 / tap86264ec7-05 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.082 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd764932e-43ce-4dcd-a97b-cb2bb8030f5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.077294', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c3fe6dda-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': '04fb765531cca504a1c4dd34ee8aa29c264e3c229c537eefa71edc9a2711d97c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.077294', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c3fedacc-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': 'c4d3352bb920ba388585d5dbb878b2ca65c41a8b1b5c4ab5737a3b8fe0f67c61'}]}, 'timestamp': '2025-11-29 07:26:48.083295', '_unique_id': 'd80885d269554477b7b53a5a2d2d99f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.085 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.read.bytes volume: 30296576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.085 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.085 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.read.bytes volume: 32057344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.085 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f77acc5-ccbd-49df-9706-f53e31ec93db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30296576, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-vda', 'timestamp': '2025-11-29T07:26:48.085195', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3ff31ac-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': '1c46bb430f8fd1412758e1d6758d317ae48c43b5e364fa8b34cef78233f651e6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': 
None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-sda', 'timestamp': '2025-11-29T07:26:48.085195', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3ff3c4c-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': '069ca5525ec98ebce6f742fab3de712800afc90ae219d885270945849b447ec4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32057344, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-vda', 'timestamp': '2025-11-29T07:26:48.085195', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3ff4598-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': '79c36a330d35d9c43fca84af82946ba0d7adea8c9284c3581f05ca9847021926'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-sda', 'timestamp': '2025-11-29T07:26:48.085195', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3ff4f5c-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': 'b81bfc90e508bec05091964870adfd594086d0e86912bd3e3f84886cf2309c11'}]}, 'timestamp': '2025-11-29 07:26:48.086237', '_unique_id': '7866b297a5934038b025253647285916'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.087 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.087 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef199b02-e744-4386-8741-6a360aa6b92d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.087638', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c3ff9070-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': '9a80d2efcbb247fe21dc2b7720c788c338ebb6eb68c566525e94151d7d12da35'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 
'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.087638', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c3ff98e0-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': '0ff345faa8a5657756b60491bbd452a41b9d7468bfe26f2ef27cd7035ce6aa1f'}]}, 'timestamp': '2025-11-29 07:26:48.088105', '_unique_id': '4bbc68d9bb8d4981948b91146fc3c148'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.089 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.089 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.incoming.bytes volume: 1520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9993bc24-27e7-4947-b1d0-75fe31345020', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.089544', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c3ffdb48-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': 'ed006194a1b5463d488766f2969b803e5c3d5ad50eae7842068209bcb2373c7a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1520, 
'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.089544', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c3ffe462-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': '83038ced202eb8c9633f20137893612f9da73f6e684f8d8537c7bea2c704907a'}]}, 'timestamp': '2025-11-29 07:26:48.090021', '_unique_id': '5e8d2584133f4069a0e478c5a039d04d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.091 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.091 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fc234e2-07ee-4e20-8db3-915b9256f8e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.091470', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c40025f8-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': 'ee396f3e61e34173038e4a2cb0bdbcfa94f0200541fd7749a4db5e6c092f22c1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.091470', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c4002ecc-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': '306a21c92a4bb6950af481ab3e824d2696646aa8491692f462ed0fe2509efa50'}]}, 'timestamp': '2025-11-29 07:26:48.091925', '_unique_id': '7c09211606f34b4fa4114552d2febabc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.093 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.read.latency volume: 201616363 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.093 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.read.latency volume: 30730916 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.093 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.read.latency volume: 268138971 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.093 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.read.latency volume: 18112126 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c2db632-5396-4938-a63a-67efb2e9a8a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 201616363, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-vda', 'timestamp': '2025-11-29T07:26:48.093064', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4006414-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': '8b2130fda1da609238ac8664b76cc916af1f5eb776822e8d4d69f5b50020ff3e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30730916, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 
'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-sda', 'timestamp': '2025-11-29T07:26:48.093064', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4006cfc-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': '216104639134f351cad873acde8be1cb5101e5edc11ab59852bb8892771e19c9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 268138971, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-vda', 'timestamp': '2025-11-29T07:26:48.093064', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4007512-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': 'ec05fd84d2eb859e29e97f36f549cc797a3bcd7855879cf3a604075f2e626a9a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18112126, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-sda', 'timestamp': '2025-11-29T07:26:48.093064', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4007cb0-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': '150f56869adc68234e6945b420b64bc8ea89067f95a0b497740e88bba3dd0bc8'}]}, 'timestamp': '2025-11-29 07:26:48.093910', '_unique_id': 'a4efc94732904db08ecc02a09f98f641'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.110 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/memory.usage volume: 46.66015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.126 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/memory.usage volume: 42.26953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06949168-8279-4c41-b8da-3d1fee6745e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.66015625, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'timestamp': '2025-11-29T07:26:48.095140', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c4031a38-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.044884672, 'message_signature': 'ab44e4c52e3800ac9c357dfd99558af727399da1fa3e912446f59b9cf50c73b3'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.26953125, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 
'220b7865-2248-43ba-865a-b2314b5a6e47', 'timestamp': '2025-11-29T07:26:48.095140', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c4059d58-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.061499888, 'message_signature': 'c2d9ab40d680e360ede6a3a1dc93eefc4375893ff18c796c1dad630493e5ec46'}]}, 'timestamp': '2025-11-29 07:26:48.127666', '_unique_id': '7c7c32cfe4274748ac22e04d967857de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.129 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.write.bytes volume: 72847360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.129 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.130 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.write.bytes volume: 339968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.130 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70c008f8-9078-477a-aeeb-2baddecad121', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72847360, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-vda', 'timestamp': '2025-11-29T07:26:48.129715', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c405fce4-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': 'f01678864fd8902575270490ace9d8f174642855f51c801fd72c1ad3e6e5a86b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 
'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-sda', 'timestamp': '2025-11-29T07:26:48.129715', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c40605ae-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': 'b9c162cb04d940ece1403922179f33a4488a001c19c9db45727f1442d332fd0e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 339968, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-vda', 'timestamp': '2025-11-29T07:26:48.129715', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4060f0e-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': '7cefbe91b94a9bbf84c545b9d3e5858b63baac4d79a4493f6a9f479079a0d015'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-sda', 'timestamp': '2025-11-29T07:26:48.129715', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4061882-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': '85effaf09a3806d62378533d9af009bcd7191ff1bffde6e356982c171dfaa554'}]}, 'timestamp': '2025-11-29 07:26:48.130693', '_unique_id': '129c01d906cc49b3a4dbef995f0978e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.132 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.132 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12624f25-bd9e-4d98-a569-f4bab1236a00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.132402', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c4066634-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': '0ca3a71145b039b3ac381388b73ebae555324867069be751a3515aa0108418ce'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.132402', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c406703e-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': '34547dc41fa48200b0bb166969082d5d890b84f0a9008d29a49a543992e3e05a'}]}, 'timestamp': '2025-11-29 07:26:48.132937', '_unique_id': 'e9fd3ba3a26e415d9185dc8ed92345a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.133 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.134 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.134 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1152076978>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1803543286>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1152076978>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1803543286>]
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.134 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.135 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.135 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.135 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a42a732d-ae99-4588-950b-ed69ec23cae2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-vda', 'timestamp': '2025-11-29T07:26:48.134746', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c406c16a-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.985294705, 'message_signature': '02795cb4690feeea06322df997a4007a1401de0f68fb673c9212b962bb898d7b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 
'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-sda', 'timestamp': '2025-11-29T07:26:48.134746', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c406c9d0-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.985294705, 'message_signature': '32f55b2910ae5faf4a51f16985c87fb9fbc23e7ff31d54246291550aff74b7a2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-vda', 'timestamp': '2025-11-29T07:26:48.134746', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c406d1a0-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.997816041, 'message_signature': 'a3eb6c37df14d186d2c31b6a394109ede038c236e296dc6459a252fdb668453b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-sda', 'timestamp': '2025-11-29T07:26:48.134746', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c406dc68-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.997816041, 'message_signature': '0fbaf18765b7c450c469870a3b7294d53bcc3688b578239c4d19a6e017536f35'}]}, 'timestamp': '2025-11-29 07:26:48.135716', '_unique_id': 'ef2ef5945aac49bd869a7eff113499f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1152076978>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1803543286>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1152076978>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1803543286>]
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1152076978>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1803543286>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1152076978>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1803543286>]
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1152076978>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1803543286>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1152076978>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1803543286>]
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.138 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.138 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b3a6902-c20f-4759-b005-a8f8eda74f30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.138073', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c4074414-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': 'f4ab73d666fd7c62d132d1927982b8376c8f62a67465e2f42c33af571df394fd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
16, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.138073', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c4074e00-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': '633e4ec9a61563fb44843a192ff0b6083a7192a5440637c783363742d68bd2f9'}]}, 'timestamp': '2025-11-29 07:26:48.138600', '_unique_id': 'a35d4a3180204d8c8ed84aea2e4eac24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.139 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.read.requests volume: 1088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.140 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.140 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.read.requests volume: 1213 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.140 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0270c283-e6dc-4113-800d-f5fb3d49fbef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1088, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-vda', 'timestamp': '2025-11-29T07:26:48.139781', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c40784d8-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': 'f7667162bdf7a1482453ea1be9b59f64582653f3fd474fe7c14fbed0ebb2a284'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 
'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-sda', 'timestamp': '2025-11-29T07:26:48.139781', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4078cf8-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': 'bd78e4cba04da4a19057c9c0f3a8777a986b620519495946c87a219e023a96e0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1213, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-vda', 'timestamp': '2025-11-29T07:26:48.139781', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4079540-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': '8328326460c94cc95b47bb81c51a1d0fc87335c1145d2c618e4fb5b981be61cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-sda', 'timestamp': '2025-11-29T07:26:48.139781', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4079e3c-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': 'f2a5989ee6c2714307ace5f342dfc4def9b95ef561e694b6f618b9ade8833628'}]}, 'timestamp': '2025-11-29 07:26:48.140645', '_unique_id': '9e0bb32af2274cf1846c0dbd774ec55e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.141 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e108361b-8bfb-4fc9-9f33-b1ab8e64dc29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.141841', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c407d596-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': '75ddcd79b6ab602da71c8d721b0884e45b78c57becc16729e44eb42081a310b4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.141841', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c407df0a-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': '0d3344738650f3c1943cf51dd9de53c8358a97532c80d86c9b14480fdf571d3d'}]}, 'timestamp': '2025-11-29 07:26:48.142344', '_unique_id': '612a11d3de2144af87da9b5db1688c1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.143 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.143 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.144 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.144 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f55292b6-a2cc-483c-81fd-28c96b9eb48e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-vda', 'timestamp': '2025-11-29T07:26:48.143640', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4081cae-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.985294705, 'message_signature': 'c4e7cefd217779aaf68fe4fdbceadc4dba4bee4d7e01acee2d51e1b275cc7787'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 
'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-sda', 'timestamp': '2025-11-29T07:26:48.143640', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4082758-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.985294705, 'message_signature': '2713dc4196b59f7fa3447b5634ade6a96f40b5e1ad528856cd14acc1ca881dfe'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-vda', 'timestamp': '2025-11-29T07:26:48.143640', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c40834a0-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.997816041, 'message_signature': '09b8eddca584b21c651918e5ae5d07d34f407c004f0a9a39316a04be4e23536f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-sda', 'timestamp': '2025-11-29T07:26:48.143640', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c408412a-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.997816041, 'message_signature': '2d7886ad81af9bf6933e0ad7abd826c24efdef7196d56e98a9d192e3562b0f35'}]}, 'timestamp': '2025-11-29 07:26:48.144860', '_unique_id': 'cd1dd5ac4dcc4b67be8e89bbcd1022ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.146 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.146 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/cpu volume: 12300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.147 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/cpu volume: 12260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba97a70e-6881-4a86-8389-aa739e622b8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12300000000, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'timestamp': '2025-11-29T07:26:48.146826', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c4089c1a-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.044884672, 'message_signature': '27cfcb2e83ad462b9e54d1a6d935cfa05c3bf690cfe428ed60beaf427cfa61dc'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12260000000, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 
'220b7865-2248-43ba-865a-b2314b5a6e47', 'timestamp': '2025-11-29T07:26:48.146826', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c408aa84-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.061499888, 'message_signature': 'fdd2eae764e8d616b38e518f7ccdc77263e0ad6ef96d65115bc0428e60877a68'}]}, 'timestamp': '2025-11-29 07:26:48.147611', '_unique_id': 'e5711b14f6d54700ae6d83a0512e9c63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.149 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.write.latency volume: 40864890008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.150 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.150 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.write.latency volume: 35243136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.151 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fa8d090-dc01-4d9e-8666-7474aa53d243', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40864890008, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-vda', 'timestamp': '2025-11-29T07:26:48.149884', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c40913c0-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': '58634f98f4e8dfa983aec72923ec6901b4616b6a847faf3fb4e72f72b0cd0f20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 
'project_name': None, 'resource_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-sda', 'timestamp': '2025-11-29T07:26:48.149884', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'instance-00000082', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4092590-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.914588942, 'message_signature': '31d7e3b7516980cc45053b2bc915b383775332f06801b330409f0891f174ac52'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35243136, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-vda', 'timestamp': '2025-11-29T07:26:48.149884', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c40933a0-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': 'd7c054d50cde8f4e8b07b82d51fc1e2d7ad676d1cc02f507a38aa37b5b8d73b9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '220b7865-2248-43ba-865a-b2314b5a6e47-sda', 'timestamp': '2025-11-29T07:26:48.149884', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'instance-0000007e', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4093f94-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6670.949374514, 'message_signature': 'ebb8821a6cb089788d8dac288b5f49561dd147780039a9f640ba86cb4780790c'}]}, 'timestamp': '2025-11-29 07:26:48.151448', '_unique_id': 'c58d35ee4c0c4a52a3573bd6f9313cca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.154 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.154 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0463a231-c68d-47a6-acb9-17a9ac540154', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.154023', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c409b596-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': 'f0cfa6f81f7eda6a8716608af72abb4c2d4dc242dd99410dec031b5f8165e59e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
13, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.154023', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c409c720-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': 'e1db7467d87d3fb0da3803f614b42dafb47946dac423e6936763c6b1715da101'}]}, 'timestamp': '2025-11-29 07:26:48.154876', '_unique_id': 'f372b0da4f1445f9ab050aba17a8d53a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.156 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.157 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ca62297-897f-4aa3-bcae-3c26acabea83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.156902', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c40a256c-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': '7e8b05717714810993759574aa17a940e91be4ae329da108bef45f754c16e20b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.156902', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c40a3692-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': '158dcda0458bfb2c9f818d8f7d56250b56201094f2f396dc0e19f1286f2ce567'}]}, 'timestamp': '2025-11-29 07:26:48.157743', '_unique_id': 'ea08ea9c485b47969cad90d0b9cf8036'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.159 12 DEBUG ceilometer.compute.pollsters [-] f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.160 12 DEBUG ceilometer.compute.pollsters [-] 220b7865-2248-43ba-865a-b2314b5a6e47/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54524542-40f1-47c1-9e8e-42a02b8d05e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-00000082-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-tape473974d-ee', 'timestamp': '2025-11-29T07:26:48.159725', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1152076978', 'name': 'tape473974d-ee', 'instance_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:58:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape473974d-ee'}, 'message_id': 'c40a9308-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.012032232, 'message_signature': '9deb1f55fffed14658e07c8fe6f2473a36293eb745bbf02e5074e23d5697c9ed'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007e-220b7865-2248-43ba-865a-b2314b5a6e47-tap86264ec7-05', 'timestamp': '2025-11-29T07:26:48.159725', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1803543286', 'name': 'tap86264ec7-05', 'instance_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'instance_type': 'm1.nano', 'host': '95a8b22f0717a93a133cb0c7b9d0a7f7de90f9aefcabea9dff75ec25', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:25:aa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86264ec7-05'}, 'message_id': 'c40a9f88-ccf4-11f0-8a11-fa163ea726b4', 'monotonic_time': 6671.015364982, 'message_signature': '198886adcceb949bb671ad145572e952f62b6a05000d63763200d22b27ad9ee8'}]}, 'timestamp': '2025-11-29 07:26:48.160447', '_unique_id': 'fe7ab73993c649c4a3c0644d238032f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:26:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:26:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.166 187156 DEBUG oslo_concurrency.lockutils [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.167 187156 DEBUG oslo_concurrency.lockutils [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.195 187156 INFO nova.compute.manager [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Rescuing#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.196 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.196 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquired lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.196 187156 DEBUG nova.network.neutron [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.332 187156 DEBUG nova.compute.provider_tree [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.356 187156 DEBUG nova.scheduler.client.report [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.392 187156 DEBUG oslo_concurrency.lockutils [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.396 187156 DEBUG nova.network.neutron [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updated VIF entry in instance network info cache for port 117c88c8-f8df-49f6-aa22-1c554973f1ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.396 187156 DEBUG nova.network.neutron [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Updating instance_info_cache with network_info: [{"id": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "address": "fa:16:3e:6f:ae:fa", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap117c88c8-f8", "ovs_interfaceid": "117c88c8-f8df-49f6-aa22-1c554973f1ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "address": "fa:16:3e:0d:0e:3d", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:e3d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab95b3bf-94", "ovs_interfaceid": "ab95b3bf-94ed-4d6d-bf40-ce3672f08a71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.419 187156 DEBUG oslo_concurrency.lockutils [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-a047dabb-8e55-4bea-92aa-20b191da7b54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.434 187156 INFO nova.scheduler.client.report [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance a047dabb-8e55-4bea-92aa-20b191da7b54#033[00m
Nov 29 02:26:48 np0005539504 nova_compute[187152]: 2025-11-29 07:26:48.587 187156 DEBUG oslo_concurrency.lockutils [None req-b4a2133a-be43-43e8-8df3-3f2c1bfda848 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "a047dabb-8e55-4bea-92aa-20b191da7b54" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:48 np0005539504 podman[238284]: 2025-11-29 07:26:48.738282377 +0000 UTC m=+0.068383342 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:26:48 np0005539504 podman[238282]: 2025-11-29 07:26:48.743030325 +0000 UTC m=+0.070526931 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:26:48 np0005539504 podman[238283]: 2025-11-29 07:26:48.768406725 +0000 UTC m=+0.099204089 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7)
Nov 29 02:26:49 np0005539504 nova_compute[187152]: 2025-11-29 07:26:49.660 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:49 np0005539504 nova_compute[187152]: 2025-11-29 07:26:49.745 187156 DEBUG nova.compute.manager [req-a91cbbf3-bc35-4898-a8e5-cc7af1d0cc07 req-7d5cc0f9-e3a9-4764-a038-5abd5b94746f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Received event network-vif-deleted-ab95b3bf-94ed-4d6d-bf40-ce3672f08a71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:49 np0005539504 nova_compute[187152]: 2025-11-29 07:26:49.745 187156 INFO nova.compute.manager [req-a91cbbf3-bc35-4898-a8e5-cc7af1d0cc07 req-7d5cc0f9-e3a9-4764-a038-5abd5b94746f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Neutron deleted interface ab95b3bf-94ed-4d6d-bf40-ce3672f08a71; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:26:49 np0005539504 nova_compute[187152]: 2025-11-29 07:26:49.745 187156 DEBUG nova.network.neutron [req-a91cbbf3-bc35-4898-a8e5-cc7af1d0cc07 req-7d5cc0f9-e3a9-4764-a038-5abd5b94746f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Nov 29 02:26:49 np0005539504 nova_compute[187152]: 2025-11-29 07:26:49.750 187156 DEBUG nova.compute.manager [req-a91cbbf3-bc35-4898-a8e5-cc7af1d0cc07 req-7d5cc0f9-e3a9-4764-a038-5abd5b94746f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Detach interface failed, port_id=ab95b3bf-94ed-4d6d-bf40-ce3672f08a71, reason: Instance a047dabb-8e55-4bea-92aa-20b191da7b54 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:26:51 np0005539504 nova_compute[187152]: 2025-11-29 07:26:51.396 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:51 np0005539504 nova_compute[187152]: 2025-11-29 07:26:51.589 187156 DEBUG nova.network.neutron [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Updating instance_info_cache with network_info: [{"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:51 np0005539504 nova_compute[187152]: 2025-11-29 07:26:51.784 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Releasing lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:52 np0005539504 nova_compute[187152]: 2025-11-29 07:26:52.137 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:26:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:53Z|00517|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 02:26:53 np0005539504 nova_compute[187152]: 2025-11-29 07:26:53.684 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:53 np0005539504 podman[238344]: 2025-11-29 07:26:53.71632094 +0000 UTC m=+0.058178351 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:26:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:53Z|00518|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 02:26:53 np0005539504 nova_compute[187152]: 2025-11-29 07:26:53.908 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:54 np0005539504 kernel: tape473974d-ee (unregistering): left promiscuous mode
Nov 29 02:26:54 np0005539504 NetworkManager[55210]: <info>  [1764401214.3587] device (tape473974d-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:26:54 np0005539504 nova_compute[187152]: 2025-11-29 07:26:54.367 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:54Z|00519|binding|INFO|Releasing lport e473974d-ee78-43f1-9e3a-6baa18151417 from this chassis (sb_readonly=0)
Nov 29 02:26:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:54Z|00520|binding|INFO|Setting lport e473974d-ee78-43f1-9e3a-6baa18151417 down in Southbound
Nov 29 02:26:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:26:54Z|00521|binding|INFO|Removing iface tape473974d-ee ovn-installed in OVS
Nov 29 02:26:54 np0005539504 nova_compute[187152]: 2025-11-29 07:26:54.372 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:54 np0005539504 nova_compute[187152]: 2025-11-29 07:26:54.403 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:54 np0005539504 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000082.scope: Deactivated successfully.
Nov 29 02:26:54 np0005539504 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000082.scope: Consumed 15.404s CPU time.
Nov 29 02:26:54 np0005539504 systemd-machined[153423]: Machine qemu-67-instance-00000082 terminated.
Nov 29 02:26:54 np0005539504 podman[238368]: 2025-11-29 07:26:54.511394741 +0000 UTC m=+0.125124194 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:26:54 np0005539504 nova_compute[187152]: 2025-11-29 07:26:54.662 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:54.956 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:58:b9 10.100.0.13'], port_security=['fa:16:3e:71:58:b9 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=e473974d-ee78-43f1-9e3a-6baa18151417) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:26:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:54.957 104164 INFO neutron.agent.ovn.metadata.agent [-] Port e473974d-ee78-43f1-9e3a-6baa18151417 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis#033[00m
Nov 29 02:26:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:54.959 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39#033[00m
Nov 29 02:26:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:54.976 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7692230b-8bb4-4b95-9dd3-5a3c514a662b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.017 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[49609a4e-4e27-4941-afc1-22590ea054c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.022 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c16e1abc-f366-4559-ac4b-afb851d54bb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.057 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2bd036-9f93-4f8c-9082-300b4f1069ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.087 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7a8b82a8-6027-435f-a048-0d60291e99bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662200, 'reachable_time': 18152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238421, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.113 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2df4a7a1-855e-4907-bb54-ce736ca45a8a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662213, 'tstamp': 662213}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238422, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662217, 'tstamp': 662217}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238422, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.115 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:55 np0005539504 nova_compute[187152]: 2025-11-29 07:26:55.117 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:55 np0005539504 nova_compute[187152]: 2025-11-29 07:26:55.123 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.124 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.124 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.124 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:26:55.125 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:26:55 np0005539504 nova_compute[187152]: 2025-11-29 07:26:55.156 187156 INFO nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:26:55 np0005539504 nova_compute[187152]: 2025-11-29 07:26:55.163 187156 INFO nova.virt.libvirt.driver [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance destroyed successfully.#033[00m
Nov 29 02:26:55 np0005539504 nova_compute[187152]: 2025-11-29 07:26:55.164 187156 DEBUG nova.objects.instance [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'numa_topology' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:55 np0005539504 nova_compute[187152]: 2025-11-29 07:26:55.914 187156 INFO nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Attempting a stable device rescue#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.335 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.340 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.341 187156 INFO nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Creating image(s)#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.342 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.342 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.343 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.343 187156 DEBUG nova.objects.instance [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.359 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "fa0e20a5a8d2535a22db09211fcaa6a093d1698f" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.360 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "fa0e20a5a8d2535a22db09211fcaa6a093d1698f" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:56 np0005539504 nova_compute[187152]: 2025-11-29 07:26:56.447 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:57 np0005539504 nova_compute[187152]: 2025-11-29 07:26:57.366 187156 DEBUG nova.compute.manager [req-caeba0f1-d2fe-4f63-bb7d-4543ac6ce6ea req-ded4097f-5762-4022-b604-f37709ffb664 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-unplugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:57 np0005539504 nova_compute[187152]: 2025-11-29 07:26:57.366 187156 DEBUG oslo_concurrency.lockutils [req-caeba0f1-d2fe-4f63-bb7d-4543ac6ce6ea req-ded4097f-5762-4022-b604-f37709ffb664 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:57 np0005539504 nova_compute[187152]: 2025-11-29 07:26:57.366 187156 DEBUG oslo_concurrency.lockutils [req-caeba0f1-d2fe-4f63-bb7d-4543ac6ce6ea req-ded4097f-5762-4022-b604-f37709ffb664 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:57 np0005539504 nova_compute[187152]: 2025-11-29 07:26:57.367 187156 DEBUG oslo_concurrency.lockutils [req-caeba0f1-d2fe-4f63-bb7d-4543ac6ce6ea req-ded4097f-5762-4022-b604-f37709ffb664 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:57 np0005539504 nova_compute[187152]: 2025-11-29 07:26:57.367 187156 DEBUG nova.compute.manager [req-caeba0f1-d2fe-4f63-bb7d-4543ac6ce6ea req-ded4097f-5762-4022-b604-f37709ffb664 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-unplugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:57 np0005539504 nova_compute[187152]: 2025-11-29 07:26:57.367 187156 WARNING nova.compute.manager [req-caeba0f1-d2fe-4f63-bb7d-4543ac6ce6ea req-ded4097f-5762-4022-b604-f37709ffb664 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-unplugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:27:00 np0005539504 nova_compute[187152]: 2025-11-29 07:27:00.652 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:00 np0005539504 nova_compute[187152]: 2025-11-29 07:27:00.653 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401204.6140988, a047dabb-8e55-4bea-92aa-20b191da7b54 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:00 np0005539504 nova_compute[187152]: 2025-11-29 07:27:00.653 187156 INFO nova.compute.manager [-] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:27:00 np0005539504 nova_compute[187152]: 2025-11-29 07:27:00.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:00 np0005539504 nova_compute[187152]: 2025-11-29 07:27:00.729 187156 DEBUG nova.compute.manager [None req-1acc1a0c-4418-41fe-a93b-41d8bd77cfdb - - - - - -] [instance: a047dabb-8e55-4bea-92aa-20b191da7b54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:00 np0005539504 podman[238424]: 2025-11-29 07:27:00.7431419 +0000 UTC m=+0.080221460 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:27:01 np0005539504 nova_compute[187152]: 2025-11-29 07:27:01.450 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.082 187156 DEBUG nova.compute.manager [req-7a94fd27-6f93-4b28-bbb9-f39561a10f5e req-741bf124-7b63-474b-9c86-be571730ca23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.083 187156 DEBUG oslo_concurrency.lockutils [req-7a94fd27-6f93-4b28-bbb9-f39561a10f5e req-741bf124-7b63-474b-9c86-be571730ca23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.083 187156 DEBUG oslo_concurrency.lockutils [req-7a94fd27-6f93-4b28-bbb9-f39561a10f5e req-741bf124-7b63-474b-9c86-be571730ca23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.084 187156 DEBUG oslo_concurrency.lockutils [req-7a94fd27-6f93-4b28-bbb9-f39561a10f5e req-741bf124-7b63-474b-9c86-be571730ca23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.084 187156 DEBUG nova.compute.manager [req-7a94fd27-6f93-4b28-bbb9-f39561a10f5e req-741bf124-7b63-474b-9c86-be571730ca23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.085 187156 WARNING nova.compute.manager [req-7a94fd27-6f93-4b28-bbb9-f39561a10f5e req-741bf124-7b63-474b-9c86-be571730ca23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.643 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.709 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f.part --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.711 187156 DEBUG nova.virt.images [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] 9c7bd300-d736-4800-be24-71b67f5bfa9b was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.711 187156 DEBUG nova.privsep.utils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:27:03 np0005539504 nova_compute[187152]: 2025-11-29 07:27:03.712 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f.part /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.638 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f.part /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f.converted" returned: 0 in 1.926s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.649 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.669 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.714 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f.converted --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.715 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "fa0e20a5a8d2535a22db09211fcaa6a093d1698f" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 9.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.732 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "fa0e20a5a8d2535a22db09211fcaa6a093d1698f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.733 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "fa0e20a5a8d2535a22db09211fcaa6a093d1698f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.748 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.827 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.828 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f,backing_fmt=raw /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:05 np0005539504 nova_compute[187152]: 2025-11-29 07:27:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:06 np0005539504 nova_compute[187152]: 2025-11-29 07:27:06.478 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:06 np0005539504 nova_compute[187152]: 2025-11-29 07:27:06.562 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f,backing_fmt=raw /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.rescue" returned: 0 in 0.734s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:06 np0005539504 nova_compute[187152]: 2025-11-29 07:27:06.563 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "fa0e20a5a8d2535a22db09211fcaa6a093d1698f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:06 np0005539504 nova_compute[187152]: 2025-11-29 07:27:06.564 187156 DEBUG nova.objects.instance [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'migration_context' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.022 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.029 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Start _get_guest_xml network_info=[{"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "vif_mac": "fa:16:3e:71:58:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '9c7bd300-d736-4800-be24-71b67f5bfa9b', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.031 187156 DEBUG nova.objects.instance [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'resources' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.309 187156 WARNING nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.321 187156 DEBUG nova.virt.libvirt.host [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.323 187156 DEBUG nova.virt.libvirt.host [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.326 187156 DEBUG nova.virt.libvirt.host [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.328 187156 DEBUG nova.virt.libvirt.host [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.329 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.330 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.330 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.330 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.330 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.331 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.331 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.331 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.332 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.332 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.332 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.332 187156 DEBUG nova.virt.hardware [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:27:07 np0005539504 nova_compute[187152]: 2025-11-29 07:27:07.333 187156 DEBUG nova.objects.instance [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:08 np0005539504 nova_compute[187152]: 2025-11-29 07:27:08.010 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:08 np0005539504 nova_compute[187152]: 2025-11-29 07:27:08.069 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:08 np0005539504 nova_compute[187152]: 2025-11-29 07:27:08.070 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:08 np0005539504 nova_compute[187152]: 2025-11-29 07:27:08.071 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:08 np0005539504 nova_compute[187152]: 2025-11-29 07:27:08.073 187156 DEBUG oslo_concurrency.lockutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:08 np0005539504 nova_compute[187152]: 2025-11-29 07:27:08.075 187156 DEBUG nova.virt.libvirt.vif [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:26:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1152076978',display_name='tempest-ServerStableDeviceRescueTest-server-1152076978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1152076978',id=130,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:26:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-ujcqh7p9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:26:45Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=f63eb3ac-909c-46ed-b7ee-2e4f04b0998c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "vif_mac": "fa:16:3e:71:58:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:27:08 np0005539504 nova_compute[187152]: 2025-11-29 07:27:08.076 187156 DEBUG nova.network.os_vif_util [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "vif_mac": "fa:16:3e:71:58:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:27:08 np0005539504 nova_compute[187152]: 2025-11-29 07:27:08.079 187156 DEBUG nova.network.os_vif_util [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=e473974d-ee78-43f1-9e3a-6baa18151417,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape473974d-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:27:08 np0005539504 nova_compute[187152]: 2025-11-29 07:27:08.081 187156 DEBUG nova.objects.instance [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:08 np0005539504 podman[238468]: 2025-11-29 07:27:08.744825808 +0000 UTC m=+0.079944033 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.346 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <uuid>f63eb3ac-909c-46ed-b7ee-2e4f04b0998c</uuid>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <name>instance-00000082</name>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1152076978</nova:name>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:27:07</nova:creationTime>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:        <nova:user uuid="5be41a8530314f83bbecbb74b9276f2d">tempest-ServerStableDeviceRescueTest-2012111838-project-member</nova:user>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:        <nova:project uuid="ac3bb322fa744e099b38e08abe12d0e2">tempest-ServerStableDeviceRescueTest-2012111838</nova:project>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:        <nova:port uuid="e473974d-ee78-43f1-9e3a-6baa18151417">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <entry name="serial">f63eb3ac-909c-46ed-b7ee-2e4f04b0998c</entry>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <entry name="uuid">f63eb3ac-909c-46ed-b7ee-2e4f04b0998c</entry>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.rescue"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <target dev="vdb" bus="virtio"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <boot order="1"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:71:58:b9"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <target dev="tape473974d-ee"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/console.log" append="off"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:27:09 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:27:09 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:27:09 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:27:09 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.358 187156 INFO nova.virt.libvirt.driver [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance destroyed successfully.#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.657 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401214.6542516, f63eb3ac-909c-46ed-b7ee-2e4f04b0998c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.658 187156 INFO nova.compute.manager [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.778 187156 DEBUG nova.compute.manager [None req-e1474050-eb35-4744-bc12-88c14e91e71f - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.782 187156 DEBUG nova.compute.manager [None req-e1474050-eb35-4744-bc12-88c14e91e71f - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.791 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.792 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.792 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.792 187156 DEBUG nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No VIF found with MAC fa:16:3e:71:58:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.793 187156 INFO nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Using config drive#033[00m
Nov 29 02:27:09 np0005539504 nova_compute[187152]: 2025-11-29 07:27:09.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:10 np0005539504 nova_compute[187152]: 2025-11-29 07:27:10.211 187156 INFO nova.compute.manager [None req-e1474050-eb35-4744-bc12-88c14e91e71f - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 29 02:27:10 np0005539504 nova_compute[187152]: 2025-11-29 07:27:10.673 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:10 np0005539504 nova_compute[187152]: 2025-11-29 07:27:10.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:10 np0005539504 nova_compute[187152]: 2025-11-29 07:27:10.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:27:11 np0005539504 nova_compute[187152]: 2025-11-29 07:27:11.435 187156 DEBUG nova.objects.instance [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:11 np0005539504 nova_compute[187152]: 2025-11-29 07:27:11.481 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:11 np0005539504 nova_compute[187152]: 2025-11-29 07:27:11.982 187156 DEBUG nova.objects.instance [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'keypairs' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:13 np0005539504 nova_compute[187152]: 2025-11-29 07:27:13.853 187156 INFO nova.virt.libvirt.driver [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Creating config drive at /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config.rescue#033[00m
Nov 29 02:27:13 np0005539504 nova_compute[187152]: 2025-11-29 07:27:13.859 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7n1zdipg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.007 187156 DEBUG oslo_concurrency.processutils [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7n1zdipg" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:14 np0005539504 kernel: tape473974d-ee: entered promiscuous mode
Nov 29 02:27:14 np0005539504 NetworkManager[55210]: <info>  [1764401234.1261] manager: (tape473974d-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/237)
Nov 29 02:27:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:14Z|00522|binding|INFO|Claiming lport e473974d-ee78-43f1-9e3a-6baa18151417 for this chassis.
Nov 29 02:27:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:14Z|00523|binding|INFO|e473974d-ee78-43f1-9e3a-6baa18151417: Claiming fa:16:3e:71:58:b9 10.100.0.13
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.129 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:14Z|00524|binding|INFO|Setting lport e473974d-ee78-43f1-9e3a-6baa18151417 ovn-installed in OVS
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.156 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.159 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:14 np0005539504 systemd-udevd[238521]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:27:14 np0005539504 NetworkManager[55210]: <info>  [1764401234.1839] device (tape473974d-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:27:14 np0005539504 NetworkManager[55210]: <info>  [1764401234.1880] device (tape473974d-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:27:14 np0005539504 systemd-machined[153423]: New machine qemu-68-instance-00000082.
Nov 29 02:27:14 np0005539504 systemd[1]: Started Virtual Machine qemu-68-instance-00000082.
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.575 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401234.574024, f63eb3ac-909c-46ed-b7ee-2e4f04b0998c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.575 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.758 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:58:b9 10.100.0.13'], port_security=['fa:16:3e:71:58:b9 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=e473974d-ee78-43f1-9e3a-6baa18151417) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:27:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:14Z|00525|binding|INFO|Setting lport e473974d-ee78-43f1-9e3a-6baa18151417 up in Southbound
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.760 104164 INFO neutron.agent.ovn.metadata.agent [-] Port e473974d-ee78-43f1-9e3a-6baa18151417 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 bound to our chassis#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.762 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39#033[00m
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.789 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.790 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9c7c5e9d-8e9e-44e4-a53c-c854e524457c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.793 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.830 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[37bf9cc2-022c-4dfa-9d03-d3c935a8acec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.836 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[01d0653d-a59d-40ea-aca3-ba811c6db4a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.865 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1ddfd392-d304-4993-a42b-cc63244caa1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.883 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4e8e2a4f-c51a-4b7a-bea1-af0d969612da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662200, 'reachable_time': 18152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238544, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.898 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e63b9190-3936-402d-8f16-4aa8f1133064]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662213, 'tstamp': 662213}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238545, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662217, 'tstamp': 662217}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238545, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.900 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.902 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.903 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.905 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.906 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.906 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:14.906 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.912 187156 DEBUG nova.compute.manager [None req-e15633e2-5846-4af2-bb69-f1d2d5b2d86e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:14 np0005539504 nova_compute[187152]: 2025-11-29 07:27:14.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:15 np0005539504 nova_compute[187152]: 2025-11-29 07:27:15.574 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 29 02:27:15 np0005539504 nova_compute[187152]: 2025-11-29 07:27:15.575 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401234.5778823, f63eb3ac-909c-46ed-b7ee-2e4f04b0998c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:15 np0005539504 nova_compute[187152]: 2025-11-29 07:27:15.576 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] VM Started (Lifecycle Event)#033[00m
Nov 29 02:27:15 np0005539504 nova_compute[187152]: 2025-11-29 07:27:15.676 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:15 np0005539504 nova_compute[187152]: 2025-11-29 07:27:15.776 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:15 np0005539504 nova_compute[187152]: 2025-11-29 07:27:15.781 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:27:15 np0005539504 nova_compute[187152]: 2025-11-29 07:27:15.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.018 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.019 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.019 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.019 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.356 187156 DEBUG nova.compute.manager [req-6adaa8a3-d97a-4c9d-8a8a-c326778003cf req-287eb3d9-8070-45b9-802d-1186d526aade 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.357 187156 DEBUG oslo_concurrency.lockutils [req-6adaa8a3-d97a-4c9d-8a8a-c326778003cf req-287eb3d9-8070-45b9-802d-1186d526aade 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.358 187156 DEBUG oslo_concurrency.lockutils [req-6adaa8a3-d97a-4c9d-8a8a-c326778003cf req-287eb3d9-8070-45b9-802d-1186d526aade 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.358 187156 DEBUG oslo_concurrency.lockutils [req-6adaa8a3-d97a-4c9d-8a8a-c326778003cf req-287eb3d9-8070-45b9-802d-1186d526aade 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.358 187156 DEBUG nova.compute.manager [req-6adaa8a3-d97a-4c9d-8a8a-c326778003cf req-287eb3d9-8070-45b9-802d-1186d526aade 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.358 187156 WARNING nova.compute.manager [req-6adaa8a3-d97a-4c9d-8a8a-c326778003cf req-287eb3d9-8070-45b9-802d-1186d526aade 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state rescued and task_state None.#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.483 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.645 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.715 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.716 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.771 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.772 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.831 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.rescue --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.833 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.895 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c/disk.rescue --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.903 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.977 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:16 np0005539504 nova_compute[187152]: 2025-11-29 07:27:16.978 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:17 np0005539504 nova_compute[187152]: 2025-11-29 07:27:17.032 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:17 np0005539504 nova_compute[187152]: 2025-11-29 07:27:17.211 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:27:17 np0005539504 nova_compute[187152]: 2025-11-29 07:27:17.214 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5399MB free_disk=73.03278350830078GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:27:17 np0005539504 nova_compute[187152]: 2025-11-29 07:27:17.214 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:17 np0005539504 nova_compute[187152]: 2025-11-29 07:27:17.214 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.111 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 220b7865-2248-43ba-865a-b2314b5a6e47 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.112 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance f63eb3ac-909c-46ed-b7ee-2e4f04b0998c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.112 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.113 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.373 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.489 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.594 187156 DEBUG nova.compute.manager [req-e05ec05f-1728-47fd-953a-91b245ff37e1 req-ee98b9bd-a93b-4aad-a2e3-9f70069089e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.597 187156 DEBUG oslo_concurrency.lockutils [req-e05ec05f-1728-47fd-953a-91b245ff37e1 req-ee98b9bd-a93b-4aad-a2e3-9f70069089e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.598 187156 DEBUG oslo_concurrency.lockutils [req-e05ec05f-1728-47fd-953a-91b245ff37e1 req-ee98b9bd-a93b-4aad-a2e3-9f70069089e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.598 187156 DEBUG oslo_concurrency.lockutils [req-e05ec05f-1728-47fd-953a-91b245ff37e1 req-ee98b9bd-a93b-4aad-a2e3-9f70069089e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.599 187156 DEBUG nova.compute.manager [req-e05ec05f-1728-47fd-953a-91b245ff37e1 req-ee98b9bd-a93b-4aad-a2e3-9f70069089e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.599 187156 WARNING nova.compute.manager [req-e05ec05f-1728-47fd-953a-91b245ff37e1 req-ee98b9bd-a93b-4aad-a2e3-9f70069089e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state rescued and task_state None.#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.603 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:27:18 np0005539504 nova_compute[187152]: 2025-11-29 07:27:18.604 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:19 np0005539504 podman[238567]: 2025-11-29 07:27:19.736251617 +0000 UTC m=+0.063969815 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 02:27:19 np0005539504 podman[238566]: 2025-11-29 07:27:19.753103508 +0000 UTC m=+0.080913449 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Nov 29 02:27:19 np0005539504 podman[238565]: 2025-11-29 07:27:19.753199801 +0000 UTC m=+0.081010492 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:27:20 np0005539504 nova_compute[187152]: 2025-11-29 07:27:20.678 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:21 np0005539504 nova_compute[187152]: 2025-11-29 07:27:21.485 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:21 np0005539504 nova_compute[187152]: 2025-11-29 07:27:21.605 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:21 np0005539504 nova_compute[187152]: 2025-11-29 07:27:21.606 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:27:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:22.968 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:22.968 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:22.969 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:23 np0005539504 nova_compute[187152]: 2025-11-29 07:27:23.635 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:27:23 np0005539504 nova_compute[187152]: 2025-11-29 07:27:23.636 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:27:23 np0005539504 nova_compute[187152]: 2025-11-29 07:27:23.636 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:27:24 np0005539504 podman[238624]: 2025-11-29 07:27:24.788732041 +0000 UTC m=+0.091403969 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:27:24 np0005539504 podman[238625]: 2025-11-29 07:27:24.800048274 +0000 UTC m=+0.113965494 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:27:24 np0005539504 nova_compute[187152]: 2025-11-29 07:27:24.988 187156 INFO nova.compute.manager [None req-4b5c64ea-af8d-4385-ace2-0167556c6fa7 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Unrescuing#033[00m
Nov 29 02:27:24 np0005539504 nova_compute[187152]: 2025-11-29 07:27:24.989 187156 DEBUG oslo_concurrency.lockutils [None req-4b5c64ea-af8d-4385-ace2-0167556c6fa7 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:27:24 np0005539504 nova_compute[187152]: 2025-11-29 07:27:24.989 187156 DEBUG oslo_concurrency.lockutils [None req-4b5c64ea-af8d-4385-ace2-0167556c6fa7 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquired lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:27:24 np0005539504 nova_compute[187152]: 2025-11-29 07:27:24.989 187156 DEBUG nova.network.neutron [None req-4b5c64ea-af8d-4385-ace2-0167556c6fa7 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:27:25 np0005539504 nova_compute[187152]: 2025-11-29 07:27:25.683 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:26 np0005539504 nova_compute[187152]: 2025-11-29 07:27:26.542 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:28Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:58:b9 10.100.0.13
Nov 29 02:27:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:28Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:58:b9 10.100.0.13
Nov 29 02:27:29 np0005539504 nova_compute[187152]: 2025-11-29 07:27:29.538 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Updating instance_info_cache with network_info: [{"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:29 np0005539504 nova_compute[187152]: 2025-11-29 07:27:29.584 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-220b7865-2248-43ba-865a-b2314b5a6e47" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:27:29 np0005539504 nova_compute[187152]: 2025-11-29 07:27:29.585 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:27:29 np0005539504 nova_compute[187152]: 2025-11-29 07:27:29.585 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:29 np0005539504 nova_compute[187152]: 2025-11-29 07:27:29.585 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:29 np0005539504 nova_compute[187152]: 2025-11-29 07:27:29.586 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:29 np0005539504 nova_compute[187152]: 2025-11-29 07:27:29.586 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:27:29 np0005539504 nova_compute[187152]: 2025-11-29 07:27:29.603 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:29 np0005539504 nova_compute[187152]: 2025-11-29 07:27:29.642 187156 DEBUG nova.network.neutron [None req-4b5c64ea-af8d-4385-ace2-0167556c6fa7 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Updating instance_info_cache with network_info: [{"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.140 187156 DEBUG oslo_concurrency.lockutils [None req-4b5c64ea-af8d-4385-ace2-0167556c6fa7 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Releasing lock "refresh_cache-f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.141 187156 DEBUG nova.objects.instance [None req-4b5c64ea-af8d-4385-ace2-0167556c6fa7 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'flavor' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:30 np0005539504 kernel: tape473974d-ee (unregistering): left promiscuous mode
Nov 29 02:27:30 np0005539504 NetworkManager[55210]: <info>  [1764401250.4890] device (tape473974d-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.503 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:30 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:30Z|00526|binding|INFO|Releasing lport e473974d-ee78-43f1-9e3a-6baa18151417 from this chassis (sb_readonly=0)
Nov 29 02:27:30 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:30Z|00527|binding|INFO|Setting lport e473974d-ee78-43f1-9e3a-6baa18151417 down in Southbound
Nov 29 02:27:30 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:30Z|00528|binding|INFO|Removing iface tape473974d-ee ovn-installed in OVS
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.511 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.522 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.525 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:58:b9 10.100.0.13'], port_security=['fa:16:3e:71:58:b9 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=e473974d-ee78-43f1-9e3a-6baa18151417) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.528 104164 INFO neutron.agent.ovn.metadata.agent [-] Port e473974d-ee78-43f1-9e3a-6baa18151417 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.530 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.555 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1e30a450-0860-46b8-ba27-d54807bae73a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:30 np0005539504 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000082.scope: Deactivated successfully.
Nov 29 02:27:30 np0005539504 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000082.scope: Consumed 13.359s CPU time.
Nov 29 02:27:30 np0005539504 systemd-machined[153423]: Machine qemu-68-instance-00000082 terminated.
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.599 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[bb6bd227-9081-42d6-b4b2-63413d05d988]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.603 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[cb8515f1-b0cf-45e5-99ee-b0247c65080a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.639 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ed27de75-1453-406f-ae00-fbb17db349aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.664 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[48ace90b-ba56-451e-ba9c-7d49b970c4cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662200, 'reachable_time': 18152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238686, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.687 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.687 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a505528e-8438-406f-ad71-92d790f78996]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662213, 'tstamp': 662213}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238689, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662217, 'tstamp': 662217}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238689, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.689 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.691 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.696 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.697 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.698 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.698 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:30.699 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.751 187156 INFO nova.virt.libvirt.driver [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance destroyed successfully.#033[00m
Nov 29 02:27:30 np0005539504 nova_compute[187152]: 2025-11-29 07:27:30.752 187156 DEBUG nova.objects.instance [None req-4b5c64ea-af8d-4385-ace2-0167556c6fa7 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'numa_topology' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:31 np0005539504 nova_compute[187152]: 2025-11-29 07:27:31.543 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:31 np0005539504 systemd-udevd[238678]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:27:31 np0005539504 kernel: tape473974d-ee: entered promiscuous mode
Nov 29 02:27:31 np0005539504 NetworkManager[55210]: <info>  [1764401251.6739] manager: (tape473974d-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Nov 29 02:27:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:31Z|00529|binding|INFO|Claiming lport e473974d-ee78-43f1-9e3a-6baa18151417 for this chassis.
Nov 29 02:27:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:31Z|00530|binding|INFO|e473974d-ee78-43f1-9e3a-6baa18151417: Claiming fa:16:3e:71:58:b9 10.100.0.13
Nov 29 02:27:31 np0005539504 nova_compute[187152]: 2025-11-29 07:27:31.678 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:31 np0005539504 NetworkManager[55210]: <info>  [1764401251.6851] device (tape473974d-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:27:31 np0005539504 NetworkManager[55210]: <info>  [1764401251.6857] device (tape473974d-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:27:31 np0005539504 podman[238711]: 2025-11-29 07:27:31.68741458 +0000 UTC m=+0.068259640 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:27:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:31Z|00531|binding|INFO|Setting lport e473974d-ee78-43f1-9e3a-6baa18151417 ovn-installed in OVS
Nov 29 02:27:31 np0005539504 nova_compute[187152]: 2025-11-29 07:27:31.691 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:31 np0005539504 systemd-machined[153423]: New machine qemu-69-instance-00000082.
Nov 29 02:27:31 np0005539504 systemd[1]: Started Virtual Machine qemu-69-instance-00000082.
Nov 29 02:27:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:31Z|00532|binding|INFO|Setting lport e473974d-ee78-43f1-9e3a-6baa18151417 up in Southbound
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.850 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:58:b9 10.100.0.13'], port_security=['fa:16:3e:71:58:b9 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=e473974d-ee78-43f1-9e3a-6baa18151417) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.851 104164 INFO neutron.agent.ovn.metadata.agent [-] Port e473974d-ee78-43f1-9e3a-6baa18151417 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 bound to our chassis#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.852 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.868 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[852bac61-592e-4b96-bd4c-8834d2322cb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.896 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f31e1629-0e6a-4d48-a813-cabcbc6603fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.899 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[6a117b2d-5c9d-4977-b012-14a5ab9ea93f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.928 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[8090d829-8b5e-4c71-8f9e-396c0d957e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.943 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0eda8d-0b07-4b57-b388-91ccf1fad96a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662200, 'reachable_time': 18152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238759, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.963 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7264cf1f-1e8d-4ae6-b89e-9282520f9748]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662213, 'tstamp': 662213}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238760, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662217, 'tstamp': 662217}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238760, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.965 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:31 np0005539504 nova_compute[187152]: 2025-11-29 07:27:31.966 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:31 np0005539504 nova_compute[187152]: 2025-11-29 07:27:31.967 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.967 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.968 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.968 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:31.968 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:32 np0005539504 nova_compute[187152]: 2025-11-29 07:27:32.367 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for f63eb3ac-909c-46ed-b7ee-2e4f04b0998c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:27:32 np0005539504 nova_compute[187152]: 2025-11-29 07:27:32.367 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401252.3668118, f63eb3ac-909c-46ed-b7ee-2e4f04b0998c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:32 np0005539504 nova_compute[187152]: 2025-11-29 07:27:32.368 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:27:32 np0005539504 nova_compute[187152]: 2025-11-29 07:27:32.370 187156 DEBUG nova.compute.manager [None req-4b5c64ea-af8d-4385-ace2-0167556c6fa7 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:33 np0005539504 nova_compute[187152]: 2025-11-29 07:27:33.857 187156 DEBUG nova.compute.manager [req-4f0b9a4a-afa8-440e-af46-9105fd665fd0 req-ecccbe94-e068-4bd8-8ef2-97d75a94ce42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-unplugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:33 np0005539504 nova_compute[187152]: 2025-11-29 07:27:33.859 187156 DEBUG oslo_concurrency.lockutils [req-4f0b9a4a-afa8-440e-af46-9105fd665fd0 req-ecccbe94-e068-4bd8-8ef2-97d75a94ce42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:33 np0005539504 nova_compute[187152]: 2025-11-29 07:27:33.860 187156 DEBUG oslo_concurrency.lockutils [req-4f0b9a4a-afa8-440e-af46-9105fd665fd0 req-ecccbe94-e068-4bd8-8ef2-97d75a94ce42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:33 np0005539504 nova_compute[187152]: 2025-11-29 07:27:33.860 187156 DEBUG oslo_concurrency.lockutils [req-4f0b9a4a-afa8-440e-af46-9105fd665fd0 req-ecccbe94-e068-4bd8-8ef2-97d75a94ce42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:33 np0005539504 nova_compute[187152]: 2025-11-29 07:27:33.861 187156 DEBUG nova.compute.manager [req-4f0b9a4a-afa8-440e-af46-9105fd665fd0 req-ecccbe94-e068-4bd8-8ef2-97d75a94ce42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-unplugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:33 np0005539504 nova_compute[187152]: 2025-11-29 07:27:33.862 187156 WARNING nova.compute.manager [req-4f0b9a4a-afa8-440e-af46-9105fd665fd0 req-ecccbe94-e068-4bd8-8ef2-97d75a94ce42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-unplugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state rescued and task_state unrescuing.#033[00m
Nov 29 02:27:34 np0005539504 nova_compute[187152]: 2025-11-29 07:27:34.231 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:34 np0005539504 nova_compute[187152]: 2025-11-29 07:27:34.235 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:27:34 np0005539504 nova_compute[187152]: 2025-11-29 07:27:34.840 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401252.3683882, f63eb3ac-909c-46ed-b7ee-2e4f04b0998c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:34 np0005539504 nova_compute[187152]: 2025-11-29 07:27:34.841 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] VM Started (Lifecycle Event)#033[00m
Nov 29 02:27:35 np0005539504 nova_compute[187152]: 2025-11-29 07:27:35.129 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:35 np0005539504 nova_compute[187152]: 2025-11-29 07:27:35.137 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:27:35 np0005539504 nova_compute[187152]: 2025-11-29 07:27:35.691 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.058 187156 DEBUG nova.compute.manager [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.059 187156 DEBUG oslo_concurrency.lockutils [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.059 187156 DEBUG oslo_concurrency.lockutils [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.060 187156 DEBUG oslo_concurrency.lockutils [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.060 187156 DEBUG nova.compute.manager [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.060 187156 WARNING nova.compute.manager [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.060 187156 DEBUG nova.compute.manager [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.061 187156 DEBUG oslo_concurrency.lockutils [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.061 187156 DEBUG oslo_concurrency.lockutils [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.061 187156 DEBUG oslo_concurrency.lockutils [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.061 187156 DEBUG nova.compute.manager [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.061 187156 WARNING nova.compute.manager [req-4f15fdc7-534d-4767-b776-265925143d06 req-34e7e1b2-c492-465b-b3e4-134714862452 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:27:36 np0005539504 nova_compute[187152]: 2025-11-29 07:27:36.583 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.238 187156 DEBUG nova.compute.manager [req-8be0e627-f7a2-45b9-851c-ed115693d039 req-46f059cd-5e06-42e5-b78c-f345b2be82dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.239 187156 DEBUG oslo_concurrency.lockutils [req-8be0e627-f7a2-45b9-851c-ed115693d039 req-46f059cd-5e06-42e5-b78c-f345b2be82dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.239 187156 DEBUG oslo_concurrency.lockutils [req-8be0e627-f7a2-45b9-851c-ed115693d039 req-46f059cd-5e06-42e5-b78c-f345b2be82dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.240 187156 DEBUG oslo_concurrency.lockutils [req-8be0e627-f7a2-45b9-851c-ed115693d039 req-46f059cd-5e06-42e5-b78c-f345b2be82dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.240 187156 DEBUG nova.compute.manager [req-8be0e627-f7a2-45b9-851c-ed115693d039 req-46f059cd-5e06-42e5-b78c-f345b2be82dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.240 187156 WARNING nova.compute.manager [req-8be0e627-f7a2-45b9-851c-ed115693d039 req-46f059cd-5e06-42e5-b78c-f345b2be82dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.483 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.483 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.485 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.695 187156 DEBUG oslo_concurrency.lockutils [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.696 187156 DEBUG oslo_concurrency.lockutils [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.697 187156 DEBUG oslo_concurrency.lockutils [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.697 187156 DEBUG oslo_concurrency.lockutils [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.697 187156 DEBUG oslo_concurrency.lockutils [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.720 187156 INFO nova.compute.manager [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Terminating instance#033[00m
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.731 187156 DEBUG nova.compute.manager [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:27:38 np0005539504 kernel: tape473974d-ee (unregistering): left promiscuous mode
Nov 29 02:27:38 np0005539504 NetworkManager[55210]: <info>  [1764401258.7529] device (tape473974d-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.802 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:38Z|00533|binding|INFO|Releasing lport e473974d-ee78-43f1-9e3a-6baa18151417 from this chassis (sb_readonly=0)
Nov 29 02:27:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:38Z|00534|binding|INFO|Setting lport e473974d-ee78-43f1-9e3a-6baa18151417 down in Southbound
Nov 29 02:27:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:38Z|00535|binding|INFO|Removing iface tape473974d-ee ovn-installed in OVS
Nov 29 02:27:38 np0005539504 nova_compute[187152]: 2025-11-29 07:27:38.816 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:38 np0005539504 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000082.scope: Deactivated successfully.
Nov 29 02:27:38 np0005539504 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000082.scope: Consumed 7.167s CPU time.
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.844 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:58:b9 10.100.0.13'], port_security=['fa:16:3e:71:58:b9 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f63eb3ac-909c-46ed-b7ee-2e4f04b0998c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '8', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=e473974d-ee78-43f1-9e3a-6baa18151417) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.846 104164 INFO neutron.agent.ovn.metadata.agent [-] Port e473974d-ee78-43f1-9e3a-6baa18151417 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis#033[00m
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.847 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39#033[00m
Nov 29 02:27:38 np0005539504 systemd-machined[153423]: Machine qemu-69-instance-00000082 terminated.
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.869 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7604fc1a-4aa6-4a80-9855-ae29f4d1aafe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:38 np0005539504 podman[238773]: 2025-11-29 07:27:38.901665603 +0000 UTC m=+0.080632681 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.921 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f428c175-172a-498a-8eae-1034489b7913]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.926 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[a288415e-d07d-46db-86e4-2704ddf83834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:38 np0005539504 NetworkManager[55210]: <info>  [1764401258.9575] manager: (tape473974d-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/239)
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.968 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[398abb41-faf9-4145-97a8-5a1b9e4f11ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:38.995 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[533dc6a1-afc1-4229-90c6-8a6471f17788]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662200, 'reachable_time': 18152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238813, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:39.016 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[60728de2-7422-4dbe-8f9f-dd827aa87629]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662213, 'tstamp': 662213}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238821, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap240f16d8-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662217, 'tstamp': 662217}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238821, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.016 187156 INFO nova.virt.libvirt.driver [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Instance destroyed successfully.#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.017 187156 DEBUG nova.objects.instance [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'resources' on Instance uuid f63eb3ac-909c-46ed-b7ee-2e4f04b0998c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:39.018 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.021 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:39.026 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:39.026 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:39.027 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:39.027 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:39.488 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.538 187156 DEBUG nova.virt.libvirt.vif [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:26:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1152076978',display_name='tempest-ServerStableDeviceRescueTest-server-1152076978',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1152076978',id=130,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:27:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-ujcqh7p9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:27:34Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=f63eb3ac-909c-46ed-b7ee-2e4f04b0998c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.539 187156 DEBUG nova.network.os_vif_util [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "e473974d-ee78-43f1-9e3a-6baa18151417", "address": "fa:16:3e:71:58:b9", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape473974d-ee", "ovs_interfaceid": "e473974d-ee78-43f1-9e3a-6baa18151417", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.540 187156 DEBUG nova.network.os_vif_util [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=e473974d-ee78-43f1-9e3a-6baa18151417,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape473974d-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.540 187156 DEBUG os_vif [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=e473974d-ee78-43f1-9e3a-6baa18151417,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape473974d-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.542 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.542 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape473974d-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.544 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.546 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.546 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.551 187156 INFO os_vif [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=e473974d-ee78-43f1-9e3a-6baa18151417,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape473974d-ee')#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.551 187156 INFO nova.virt.libvirt.driver [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Deleting instance files /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c_del#033[00m
Nov 29 02:27:39 np0005539504 nova_compute[187152]: 2025-11-29 07:27:39.552 187156 INFO nova.virt.libvirt.driver [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Deletion of /var/lib/nova/instances/f63eb3ac-909c-46ed-b7ee-2e4f04b0998c_del complete#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.504 187156 DEBUG nova.compute.manager [req-de1eb5aa-f379-44c3-9d49-5b78ebec8a10 req-f03afd3e-9d77-4570-9361-709865c30b67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-unplugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.505 187156 DEBUG oslo_concurrency.lockutils [req-de1eb5aa-f379-44c3-9d49-5b78ebec8a10 req-f03afd3e-9d77-4570-9361-709865c30b67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.505 187156 DEBUG oslo_concurrency.lockutils [req-de1eb5aa-f379-44c3-9d49-5b78ebec8a10 req-f03afd3e-9d77-4570-9361-709865c30b67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.505 187156 DEBUG oslo_concurrency.lockutils [req-de1eb5aa-f379-44c3-9d49-5b78ebec8a10 req-f03afd3e-9d77-4570-9361-709865c30b67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.506 187156 DEBUG nova.compute.manager [req-de1eb5aa-f379-44c3-9d49-5b78ebec8a10 req-f03afd3e-9d77-4570-9361-709865c30b67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-unplugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.506 187156 DEBUG nova.compute.manager [req-de1eb5aa-f379-44c3-9d49-5b78ebec8a10 req-f03afd3e-9d77-4570-9361-709865c30b67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-unplugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.642 187156 INFO nova.compute.manager [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Took 1.91 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.644 187156 DEBUG oslo.service.loopingcall [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.644 187156 DEBUG nova.compute.manager [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:27:40 np0005539504 nova_compute[187152]: 2025-11-29 07:27:40.644 187156 DEBUG nova.network.neutron [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:27:41 np0005539504 nova_compute[187152]: 2025-11-29 07:27:41.586 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:42 np0005539504 nova_compute[187152]: 2025-11-29 07:27:42.389 187156 DEBUG nova.network.neutron [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:42 np0005539504 nova_compute[187152]: 2025-11-29 07:27:42.633 187156 DEBUG nova.compute.manager [req-b0437ffc-d8d2-4470-babf-f5bb8113dbbd req-6e7d7276-55fb-45c4-8514-9097c16e603e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-deleted-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:42 np0005539504 nova_compute[187152]: 2025-11-29 07:27:42.634 187156 INFO nova.compute.manager [req-b0437ffc-d8d2-4470-babf-f5bb8113dbbd req-6e7d7276-55fb-45c4-8514-9097c16e603e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Neutron deleted interface e473974d-ee78-43f1-9e3a-6baa18151417; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:27:42 np0005539504 nova_compute[187152]: 2025-11-29 07:27:42.634 187156 DEBUG nova.network.neutron [req-b0437ffc-d8d2-4470-babf-f5bb8113dbbd req-6e7d7276-55fb-45c4-8514-9097c16e603e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:42 np0005539504 nova_compute[187152]: 2025-11-29 07:27:42.640 187156 INFO nova.compute.manager [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Took 2.00 seconds to deallocate network for instance.#033[00m
Nov 29 02:27:42 np0005539504 nova_compute[187152]: 2025-11-29 07:27:42.740 187156 DEBUG nova.compute.manager [req-b0437ffc-d8d2-4470-babf-f5bb8113dbbd req-6e7d7276-55fb-45c4-8514-9097c16e603e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Detach interface failed, port_id=e473974d-ee78-43f1-9e3a-6baa18151417, reason: Instance f63eb3ac-909c-46ed-b7ee-2e4f04b0998c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.006 187156 DEBUG nova.compute.manager [req-32cd636d-e790-4479-897a-4755efb78fed req-6ffd8203-6f45-4915-9e9f-46681db72ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.007 187156 DEBUG oslo_concurrency.lockutils [req-32cd636d-e790-4479-897a-4755efb78fed req-6ffd8203-6f45-4915-9e9f-46681db72ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.007 187156 DEBUG oslo_concurrency.lockutils [req-32cd636d-e790-4479-897a-4755efb78fed req-6ffd8203-6f45-4915-9e9f-46681db72ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.007 187156 DEBUG oslo_concurrency.lockutils [req-32cd636d-e790-4479-897a-4755efb78fed req-6ffd8203-6f45-4915-9e9f-46681db72ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.008 187156 DEBUG nova.compute.manager [req-32cd636d-e790-4479-897a-4755efb78fed req-6ffd8203-6f45-4915-9e9f-46681db72ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] No waiting events found dispatching network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.008 187156 WARNING nova.compute.manager [req-32cd636d-e790-4479-897a-4755efb78fed req-6ffd8203-6f45-4915-9e9f-46681db72ef8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Received unexpected event network-vif-plugged-e473974d-ee78-43f1-9e3a-6baa18151417 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.056 187156 DEBUG oslo_concurrency.lockutils [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.057 187156 DEBUG oslo_concurrency.lockutils [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.288 187156 DEBUG nova.compute.provider_tree [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.491 187156 DEBUG nova.scheduler.client.report [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.856 187156 DEBUG oslo_concurrency.lockutils [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:43 np0005539504 nova_compute[187152]: 2025-11-29 07:27:43.917 187156 INFO nova.scheduler.client.report [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Deleted allocations for instance f63eb3ac-909c-46ed-b7ee-2e4f04b0998c#033[00m
Nov 29 02:27:44 np0005539504 nova_compute[187152]: 2025-11-29 07:27:44.194 187156 DEBUG oslo_concurrency.lockutils [None req-76aae3a9-90af-4d11-b9fe-4a7f8bc9e02c 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f63eb3ac-909c-46ed-b7ee-2e4f04b0998c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.497s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:44 np0005539504 nova_compute[187152]: 2025-11-29 07:27:44.545 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:44 np0005539504 nova_compute[187152]: 2025-11-29 07:27:44.969 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:44 np0005539504 nova_compute[187152]: 2025-11-29 07:27:44.970 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:27:44 np0005539504 nova_compute[187152]: 2025-11-29 07:27:44.994 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:27:46 np0005539504 nova_compute[187152]: 2025-11-29 07:27:46.624 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:46 np0005539504 nova_compute[187152]: 2025-11-29 07:27:46.970 187156 DEBUG oslo_concurrency.lockutils [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:46 np0005539504 nova_compute[187152]: 2025-11-29 07:27:46.971 187156 DEBUG oslo_concurrency.lockutils [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:46 np0005539504 nova_compute[187152]: 2025-11-29 07:27:46.971 187156 DEBUG oslo_concurrency.lockutils [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:46 np0005539504 nova_compute[187152]: 2025-11-29 07:27:46.972 187156 DEBUG oslo_concurrency.lockutils [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:46 np0005539504 nova_compute[187152]: 2025-11-29 07:27:46.972 187156 DEBUG oslo_concurrency.lockutils [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:46 np0005539504 nova_compute[187152]: 2025-11-29 07:27:46.987 187156 INFO nova.compute.manager [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Terminating instance#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.004 187156 DEBUG nova.compute.manager [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:27:47 np0005539504 kernel: tap86264ec7-05 (unregistering): left promiscuous mode
Nov 29 02:27:47 np0005539504 NetworkManager[55210]: <info>  [1764401267.0311] device (tap86264ec7-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:27:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:47Z|00536|binding|INFO|Releasing lport 86264ec7-05bf-4512-ac97-016779ba241a from this chassis (sb_readonly=0)
Nov 29 02:27:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:47Z|00537|binding|INFO|Setting lport 86264ec7-05bf-4512-ac97-016779ba241a down in Southbound
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.034 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:27:47Z|00538|binding|INFO|Removing iface tap86264ec7-05 ovn-installed in OVS
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.038 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.049 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:25:aa 10.100.0.4'], port_security=['fa:16:3e:03:25:aa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '220b7865-2248-43ba-865a-b2314b5a6e47', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '8', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=86264ec7-05bf-4512-ac97-016779ba241a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.054 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 86264ec7-05bf-4512-ac97-016779ba241a in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.057 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.059 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e9236368-40e9-4729-b105-33e99caeb99b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.060 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace which is not needed anymore#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.067 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:47 np0005539504 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Nov 29 02:27:47 np0005539504 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000007e.scope: Consumed 17.710s CPU time.
Nov 29 02:27:47 np0005539504 systemd-machined[153423]: Machine qemu-66-instance-0000007e terminated.
Nov 29 02:27:47 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237786]: [NOTICE]   (237790) : haproxy version is 2.8.14-c23fe91
Nov 29 02:27:47 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237786]: [NOTICE]   (237790) : path to executable is /usr/sbin/haproxy
Nov 29 02:27:47 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237786]: [WARNING]  (237790) : Exiting Master process...
Nov 29 02:27:47 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237786]: [WARNING]  (237790) : Exiting Master process...
Nov 29 02:27:47 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237786]: [ALERT]    (237790) : Current worker (237792) exited with code 143 (Terminated)
Nov 29 02:27:47 np0005539504 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[237786]: [WARNING]  (237790) : All workers exited. Exiting... (0)
Nov 29 02:27:47 np0005539504 systemd[1]: libpod-0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b.scope: Deactivated successfully.
Nov 29 02:27:47 np0005539504 podman[238851]: 2025-11-29 07:27:47.239776016 +0000 UTC m=+0.058318044 container died 0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:27:47 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b-userdata-shm.mount: Deactivated successfully.
Nov 29 02:27:47 np0005539504 systemd[1]: var-lib-containers-storage-overlay-1aa9c211fe7319cf8b1803d72b0251a9ea6c6464f3137dd58a4de53bdac7a8d9-merged.mount: Deactivated successfully.
Nov 29 02:27:47 np0005539504 podman[238851]: 2025-11-29 07:27:47.287628618 +0000 UTC m=+0.106170636 container cleanup 0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.294 187156 INFO nova.virt.libvirt.driver [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Instance destroyed successfully.#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.295 187156 DEBUG nova.objects.instance [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'resources' on Instance uuid 220b7865-2248-43ba-865a-b2314b5a6e47 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:47 np0005539504 systemd[1]: libpod-conmon-0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b.scope: Deactivated successfully.
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.312 187156 DEBUG nova.virt.libvirt.vif [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1803543286',display_name='tempest-ServerStableDeviceRescueTest-server-1803543286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1803543286',id=126,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-9i8if1ob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:59Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=220b7865-2248-43ba-865a-b2314b5a6e47,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.313 187156 DEBUG nova.network.os_vif_util [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "86264ec7-05bf-4512-ac97-016779ba241a", "address": "fa:16:3e:03:25:aa", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86264ec7-05", "ovs_interfaceid": "86264ec7-05bf-4512-ac97-016779ba241a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.313 187156 DEBUG nova.network.os_vif_util [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:25:aa,bridge_name='br-int',has_traffic_filtering=True,id=86264ec7-05bf-4512-ac97-016779ba241a,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86264ec7-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.314 187156 DEBUG os_vif [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:25:aa,bridge_name='br-int',has_traffic_filtering=True,id=86264ec7-05bf-4512-ac97-016779ba241a,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86264ec7-05') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.316 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.316 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86264ec7-05, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.320 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.323 187156 INFO os_vif [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:25:aa,bridge_name='br-int',has_traffic_filtering=True,id=86264ec7-05bf-4512-ac97-016779ba241a,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86264ec7-05')#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.324 187156 INFO nova.virt.libvirt.driver [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Deleting instance files /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47_del#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.325 187156 INFO nova.virt.libvirt.driver [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Deletion of /var/lib/nova/instances/220b7865-2248-43ba-865a-b2314b5a6e47_del complete#033[00m
Nov 29 02:27:47 np0005539504 podman[238898]: 2025-11-29 07:27:47.364647281 +0000 UTC m=+0.049518597 container remove 0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.370 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8f308c45-0489-4615-ba1e-346ec61694b8]: (4, ('Sat Nov 29 07:27:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b)\n0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b\nSat Nov 29 07:27:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b)\n0d28c72bafed9ec3bbbe0c410155426c77d67d873fb41eada2b412a2d51b449b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.373 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[31245eba-e6f1-4f1f-a8e5-adb1309756cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.374 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.376 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:47 np0005539504 kernel: tap240f16d8-60: left promiscuous mode
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.400 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.405 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[90edc8a3-bf8a-4696-97e3-3e3f189a90d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.425 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[44beb888-3763-452d-beb7-0d8436f1547d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.426 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7475b665-8ad7-4aa4-9ac0-263c73ebd817]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.432 187156 INFO nova.compute.manager [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.432 187156 DEBUG oslo.service.loopingcall [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.433 187156 DEBUG nova.compute.manager [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.433 187156 DEBUG nova.network.neutron [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.448 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eb162b08-4914-4433-ac4b-e508f74cae83]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662190, 'reachable_time': 17545, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238914, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.451 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:27:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:27:47.452 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[648ee7ed-dd18-4821-825d-f6faa496f9d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:47 np0005539504 systemd[1]: run-netns-ovnmeta\x2d240f16d8\x2d602b\x2d4aa1\x2d8edb\x2de3a8d3674e39.mount: Deactivated successfully.
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.595 187156 DEBUG nova.compute.manager [req-c6632be1-c992-43d1-a65e-7747e839b26d req-dc45b324-752a-402f-944b-c12092cab44d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-unplugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.597 187156 DEBUG oslo_concurrency.lockutils [req-c6632be1-c992-43d1-a65e-7747e839b26d req-dc45b324-752a-402f-944b-c12092cab44d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.597 187156 DEBUG oslo_concurrency.lockutils [req-c6632be1-c992-43d1-a65e-7747e839b26d req-dc45b324-752a-402f-944b-c12092cab44d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.598 187156 DEBUG oslo_concurrency.lockutils [req-c6632be1-c992-43d1-a65e-7747e839b26d req-dc45b324-752a-402f-944b-c12092cab44d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.598 187156 DEBUG nova.compute.manager [req-c6632be1-c992-43d1-a65e-7747e839b26d req-dc45b324-752a-402f-944b-c12092cab44d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-unplugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:47 np0005539504 nova_compute[187152]: 2025-11-29 07:27:47.598 187156 DEBUG nova.compute.manager [req-c6632be1-c992-43d1-a65e-7747e839b26d req-dc45b324-752a-402f-944b-c12092cab44d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-unplugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:27:50 np0005539504 nova_compute[187152]: 2025-11-29 07:27:50.716 187156 DEBUG nova.compute.manager [req-3ce50b0a-613c-4e07-9e89-bf0a4bf9ea37 req-50e8aeb0-7a1b-4c92-b36a-2ed9858793a3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:50 np0005539504 nova_compute[187152]: 2025-11-29 07:27:50.717 187156 DEBUG oslo_concurrency.lockutils [req-3ce50b0a-613c-4e07-9e89-bf0a4bf9ea37 req-50e8aeb0-7a1b-4c92-b36a-2ed9858793a3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:50 np0005539504 nova_compute[187152]: 2025-11-29 07:27:50.717 187156 DEBUG oslo_concurrency.lockutils [req-3ce50b0a-613c-4e07-9e89-bf0a4bf9ea37 req-50e8aeb0-7a1b-4c92-b36a-2ed9858793a3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:50 np0005539504 nova_compute[187152]: 2025-11-29 07:27:50.717 187156 DEBUG oslo_concurrency.lockutils [req-3ce50b0a-613c-4e07-9e89-bf0a4bf9ea37 req-50e8aeb0-7a1b-4c92-b36a-2ed9858793a3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:50 np0005539504 nova_compute[187152]: 2025-11-29 07:27:50.718 187156 DEBUG nova.compute.manager [req-3ce50b0a-613c-4e07-9e89-bf0a4bf9ea37 req-50e8aeb0-7a1b-4c92-b36a-2ed9858793a3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] No waiting events found dispatching network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:50 np0005539504 nova_compute[187152]: 2025-11-29 07:27:50.718 187156 WARNING nova.compute.manager [req-3ce50b0a-613c-4e07-9e89-bf0a4bf9ea37 req-50e8aeb0-7a1b-4c92-b36a-2ed9858793a3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received unexpected event network-vif-plugged-86264ec7-05bf-4512-ac97-016779ba241a for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:27:50 np0005539504 podman[238917]: 2025-11-29 07:27:50.726521252 +0000 UTC m=+0.061559842 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Nov 29 02:27:50 np0005539504 podman[238916]: 2025-11-29 07:27:50.750863984 +0000 UTC m=+0.081920337 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:27:50 np0005539504 podman[238915]: 2025-11-29 07:27:50.773237974 +0000 UTC m=+0.104297117 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.365 187156 DEBUG nova.network.neutron [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.429 187156 INFO nova.compute.manager [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Took 4.00 seconds to deallocate network for instance.#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.507 187156 DEBUG nova.compute.manager [req-f83cf3aa-bd82-4a91-972d-ee5359bbe47b req-e317f009-d62e-4ffd-bbaa-d73533be5b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Received event network-vif-deleted-86264ec7-05bf-4512-ac97-016779ba241a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.519 187156 DEBUG oslo_concurrency.lockutils [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.519 187156 DEBUG oslo_concurrency.lockutils [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.598 187156 DEBUG nova.compute.provider_tree [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.627 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.644 187156 DEBUG nova.scheduler.client.report [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.684 187156 DEBUG oslo_concurrency.lockutils [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.744 187156 INFO nova.scheduler.client.report [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Deleted allocations for instance 220b7865-2248-43ba-865a-b2314b5a6e47#033[00m
Nov 29 02:27:51 np0005539504 nova_compute[187152]: 2025-11-29 07:27:51.872 187156 DEBUG oslo_concurrency.lockutils [None req-62d4fb89-7eec-450e-aaaa-2bc686fd3249 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "220b7865-2248-43ba-865a-b2314b5a6e47" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:52 np0005539504 nova_compute[187152]: 2025-11-29 07:27:52.320 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.105 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "e53352bf-9c18-47a3-887b-ec5808266bd1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.106 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.145 187156 DEBUG nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.342 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.343 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.350 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.350 187156 INFO nova.compute.claims [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.580 187156 DEBUG nova.compute.provider_tree [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.602 187156 DEBUG nova.scheduler.client.report [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.651 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.652 187156 DEBUG nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.745 187156 DEBUG nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.746 187156 DEBUG nova.network.neutron [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:27:53 np0005539504 nova_compute[187152]: 2025-11-29 07:27:53.776 187156 INFO nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.014 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401259.0126047, f63eb3ac-909c-46ed-b7ee-2e4f04b0998c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.015 187156 INFO nova.compute.manager [-] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.045 187156 DEBUG nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.096 187156 DEBUG nova.compute.manager [None req-cdceb40e-3970-4cc7-b866-e8a693598cde - - - - - -] [instance: f63eb3ac-909c-46ed-b7ee-2e4f04b0998c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.321 187156 DEBUG nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.324 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.325 187156 INFO nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Creating image(s)#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.326 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "/var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.327 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "/var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.328 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "/var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.358 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.457 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.459 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.460 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.475 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.575 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.577 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.611 187156 DEBUG nova.policy [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.634 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.635 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.636 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.707 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.709 187156 DEBUG nova.virt.disk.api [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Checking if we can resize image /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.710 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.783 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.784 187156 DEBUG nova.virt.disk.api [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Cannot resize image /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.784 187156 DEBUG nova.objects.instance [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'migration_context' on Instance uuid e53352bf-9c18-47a3-887b-ec5808266bd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.803 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.804 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Ensure instance console log exists: /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.804 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.804 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:54 np0005539504 nova_compute[187152]: 2025-11-29 07:27:54.805 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:55 np0005539504 podman[238992]: 2025-11-29 07:27:55.745633664 +0000 UTC m=+0.079109901 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:27:55 np0005539504 podman[238993]: 2025-11-29 07:27:55.787627439 +0000 UTC m=+0.116447861 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:27:56 np0005539504 nova_compute[187152]: 2025-11-29 07:27:56.630 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:57 np0005539504 nova_compute[187152]: 2025-11-29 07:27:57.322 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:58 np0005539504 nova_compute[187152]: 2025-11-29 07:27:58.068 187156 DEBUG nova.network.neutron [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Successfully created port: d83ec027-8cd9-43f6-9b97-5c3794b908e3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:27:58 np0005539504 nova_compute[187152]: 2025-11-29 07:27:58.962 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:59 np0005539504 nova_compute[187152]: 2025-11-29 07:27:59.936 187156 DEBUG nova.network.neutron [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Successfully updated port: d83ec027-8cd9-43f6-9b97-5c3794b908e3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:27:59 np0005539504 nova_compute[187152]: 2025-11-29 07:27:59.960 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "refresh_cache-e53352bf-9c18-47a3-887b-ec5808266bd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:27:59 np0005539504 nova_compute[187152]: 2025-11-29 07:27:59.961 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquired lock "refresh_cache-e53352bf-9c18-47a3-887b-ec5808266bd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:27:59 np0005539504 nova_compute[187152]: 2025-11-29 07:27:59.961 187156 DEBUG nova.network.neutron [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:28:00 np0005539504 nova_compute[187152]: 2025-11-29 07:28:00.262 187156 DEBUG nova.compute.manager [req-b848a3df-12a4-4f1c-b3fc-50c58905a156 req-44a4430d-580c-4880-814a-10a76494ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Received event network-changed-d83ec027-8cd9-43f6-9b97-5c3794b908e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:00 np0005539504 nova_compute[187152]: 2025-11-29 07:28:00.262 187156 DEBUG nova.compute.manager [req-b848a3df-12a4-4f1c-b3fc-50c58905a156 req-44a4430d-580c-4880-814a-10a76494ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Refreshing instance network info cache due to event network-changed-d83ec027-8cd9-43f6-9b97-5c3794b908e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:28:00 np0005539504 nova_compute[187152]: 2025-11-29 07:28:00.263 187156 DEBUG oslo_concurrency.lockutils [req-b848a3df-12a4-4f1c-b3fc-50c58905a156 req-44a4430d-580c-4880-814a-10a76494ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e53352bf-9c18-47a3-887b-ec5808266bd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:00 np0005539504 nova_compute[187152]: 2025-11-29 07:28:00.491 187156 DEBUG nova.network.neutron [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:28:01 np0005539504 nova_compute[187152]: 2025-11-29 07:28:01.631 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.293 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401267.2917602, 220b7865-2248-43ba-865a-b2314b5a6e47 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.293 187156 INFO nova.compute.manager [-] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.311 187156 DEBUG nova.network.neutron [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Updating instance_info_cache with network_info: [{"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.351 187156 DEBUG nova.compute.manager [None req-e823e920-44ca-4901-97bb-f935f054e6c7 - - - - - -] [instance: 220b7865-2248-43ba-865a-b2314b5a6e47] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.369 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Releasing lock "refresh_cache-e53352bf-9c18-47a3-887b-ec5808266bd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.370 187156 DEBUG nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Instance network_info: |[{"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.370 187156 DEBUG oslo_concurrency.lockutils [req-b848a3df-12a4-4f1c-b3fc-50c58905a156 req-44a4430d-580c-4880-814a-10a76494ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e53352bf-9c18-47a3-887b-ec5808266bd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.371 187156 DEBUG nova.network.neutron [req-b848a3df-12a4-4f1c-b3fc-50c58905a156 req-44a4430d-580c-4880-814a-10a76494ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Refreshing network info cache for port d83ec027-8cd9-43f6-9b97-5c3794b908e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.374 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Start _get_guest_xml network_info=[{"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.377 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.383 187156 WARNING nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.392 187156 DEBUG nova.virt.libvirt.host [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.393 187156 DEBUG nova.virt.libvirt.host [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.397 187156 DEBUG nova.virt.libvirt.host [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.398 187156 DEBUG nova.virt.libvirt.host [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.399 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.400 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.400 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.401 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.401 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.401 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.401 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.402 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.402 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.402 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.403 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.403 187156 DEBUG nova.virt.hardware [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.409 187156 DEBUG nova.virt.libvirt.vif [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:27:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-881623161',display_name='tempest-ListServerFiltersTestJSON-instance-881623161',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-881623161',id=133,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-0d45dxkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:27:54Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=e53352bf-9c18-47a3-887b-ec5808266bd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.409 187156 DEBUG nova.network.os_vif_util [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.410 187156 DEBUG nova.network.os_vif_util [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:a1:8d,bridge_name='br-int',has_traffic_filtering=True,id=d83ec027-8cd9-43f6-9b97-5c3794b908e3,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83ec027-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.411 187156 DEBUG nova.objects.instance [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid e53352bf-9c18-47a3-887b-ec5808266bd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.428 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <uuid>e53352bf-9c18-47a3-887b-ec5808266bd1</uuid>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <name>instance-00000085</name>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <memory>196608</memory>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-881623161</nova:name>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:28:02</nova:creationTime>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.micro">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:        <nova:memory>192</nova:memory>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:        <nova:user uuid="3e2a40601ced4de78fe1767769f262c0">tempest-ListServerFiltersTestJSON-1571311845-project-member</nova:user>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:        <nova:project uuid="7843cfa993a1428aaaa660321ebba1ac">tempest-ListServerFiltersTestJSON-1571311845</nova:project>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:        <nova:port uuid="d83ec027-8cd9-43f6-9b97-5c3794b908e3">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <entry name="serial">e53352bf-9c18-47a3-887b-ec5808266bd1</entry>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <entry name="uuid">e53352bf-9c18-47a3-887b-ec5808266bd1</entry>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk.config"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:02:a1:8d"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <target dev="tapd83ec027-8c"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/console.log" append="off"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:28:02 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:28:02 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:28:02 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:28:02 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.430 187156 DEBUG nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Preparing to wait for external event network-vif-plugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.430 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.431 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.431 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.432 187156 DEBUG nova.virt.libvirt.vif [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:27:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-881623161',display_name='tempest-ListServerFiltersTestJSON-instance-881623161',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-881623161',id=133,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-0d45dxkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:27:54Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=e53352bf-9c18-47a3-887b-ec5808266bd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.432 187156 DEBUG nova.network.os_vif_util [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.433 187156 DEBUG nova.network.os_vif_util [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:a1:8d,bridge_name='br-int',has_traffic_filtering=True,id=d83ec027-8cd9-43f6-9b97-5c3794b908e3,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83ec027-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.433 187156 DEBUG os_vif [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:a1:8d,bridge_name='br-int',has_traffic_filtering=True,id=d83ec027-8cd9-43f6-9b97-5c3794b908e3,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83ec027-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.434 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.434 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.435 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.439 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.440 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd83ec027-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.441 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd83ec027-8c, col_values=(('external_ids', {'iface-id': 'd83ec027-8cd9-43f6-9b97-5c3794b908e3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:a1:8d', 'vm-uuid': 'e53352bf-9c18-47a3-887b-ec5808266bd1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.444 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:02 np0005539504 NetworkManager[55210]: <info>  [1764401282.4445] manager: (tapd83ec027-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.446 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.454 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.457 187156 INFO os_vif [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:a1:8d,bridge_name='br-int',has_traffic_filtering=True,id=d83ec027-8cd9-43f6-9b97-5c3794b908e3,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83ec027-8c')#033[00m
Nov 29 02:28:02 np0005539504 podman[239046]: 2025-11-29 07:28:02.565408577 +0000 UTC m=+0.064866459 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.586 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.588 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.588 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] No VIF found with MAC fa:16:3e:02:a1:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:28:02 np0005539504 nova_compute[187152]: 2025-11-29 07:28:02.589 187156 INFO nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Using config drive#033[00m
Nov 29 02:28:05 np0005539504 nova_compute[187152]: 2025-11-29 07:28:05.668 187156 INFO nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Creating config drive at /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk.config#033[00m
Nov 29 02:28:05 np0005539504 nova_compute[187152]: 2025-11-29 07:28:05.674 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6666gtio execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:05 np0005539504 nova_compute[187152]: 2025-11-29 07:28:05.810 187156 DEBUG oslo_concurrency.processutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6666gtio" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:05 np0005539504 kernel: tapd83ec027-8c: entered promiscuous mode
Nov 29 02:28:05 np0005539504 NetworkManager[55210]: <info>  [1764401285.8847] manager: (tapd83ec027-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Nov 29 02:28:05 np0005539504 nova_compute[187152]: 2025-11-29 07:28:05.884 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:05Z|00539|binding|INFO|Claiming lport d83ec027-8cd9-43f6-9b97-5c3794b908e3 for this chassis.
Nov 29 02:28:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:05Z|00540|binding|INFO|d83ec027-8cd9-43f6-9b97-5c3794b908e3: Claiming fa:16:3e:02:a1:8d 10.100.0.8
Nov 29 02:28:05 np0005539504 systemd-udevd[239082]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:28:05 np0005539504 nova_compute[187152]: 2025-11-29 07:28:05.917 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.918 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:a1:8d 10.100.0.8'], port_security=['fa:16:3e:02:a1:8d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e53352bf-9c18-47a3-887b-ec5808266bd1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=d83ec027-8cd9-43f6-9b97-5c3794b908e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.919 104164 INFO neutron.agent.ovn.metadata.agent [-] Port d83ec027-8cd9-43f6-9b97-5c3794b908e3 in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 bound to our chassis#033[00m
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.921 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28412826-5463-46e4-95cb-a7d788b1ab15#033[00m
Nov 29 02:28:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:05Z|00541|binding|INFO|Setting lport d83ec027-8cd9-43f6-9b97-5c3794b908e3 ovn-installed in OVS
Nov 29 02:28:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:05Z|00542|binding|INFO|Setting lport d83ec027-8cd9-43f6-9b97-5c3794b908e3 up in Southbound
Nov 29 02:28:05 np0005539504 nova_compute[187152]: 2025-11-29 07:28:05.924 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:05 np0005539504 nova_compute[187152]: 2025-11-29 07:28:05.926 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:05 np0005539504 NetworkManager[55210]: <info>  [1764401285.9304] device (tapd83ec027-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:28:05 np0005539504 NetworkManager[55210]: <info>  [1764401285.9311] device (tapd83ec027-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:28:05 np0005539504 nova_compute[187152]: 2025-11-29 07:28:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:05 np0005539504 systemd-machined[153423]: New machine qemu-70-instance-00000085.
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.937 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4b520ae0-e5cc-4b20-9373-3937a993610e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.939 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap28412826-51 in ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.940 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap28412826-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.941 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0bdb9ce4-8332-4c3d-b239-6a8ed4a9789d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.942 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[73b7c71a-d15c-43bf-8604-ebe7d82734c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:05 np0005539504 systemd[1]: Started Virtual Machine qemu-70-instance-00000085.
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.959 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[1df8af9f-4c44-4e73-ab2b-b70ef233ed71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:05.975 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0972c685-9edc-42b9-9a80-1aa9932caedd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.017 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[31af2d69-b1f4-4129-9fab-c3076a68a295]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.023 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e594ca45-9295-4a7a-bd73-dc2df3f909ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 NetworkManager[55210]: <info>  [1764401286.0248] manager: (tap28412826-50): new Veth device (/org/freedesktop/NetworkManager/Devices/242)
Nov 29 02:28:06 np0005539504 systemd-udevd[239088]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.067 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[7be33494-969f-419b-bb4b-ebaca4d9a172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.071 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[160fb209-0aaf-422a-bb6d-86da03c41151]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 NetworkManager[55210]: <info>  [1764401286.1042] device (tap28412826-50): carrier: link connected
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.112 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[d26e9d45-5736-4081-9902-dff2deb6cba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.136 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[05c7d38a-3001-4b64-890d-65267771c1d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28412826-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c0:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674898, 'reachable_time': 28829, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239118, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.155 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[abd12317-78af-48cb-8a52-e58a95c4bd6b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:c072'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 674898, 'tstamp': 674898}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239119, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.180 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5c34bb47-45e4-485f-9039-50a121557d96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28412826-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c0:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674898, 'reachable_time': 28829, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239120, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.220 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[64444a9f-3e15-44a8-8dac-4264c76e996e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.305 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[93b9242d-4aa1-4d30-a284-a4e9e31b968a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.307 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28412826-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.307 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.308 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28412826-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.310 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:06 np0005539504 NetworkManager[55210]: <info>  [1764401286.3107] manager: (tap28412826-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Nov 29 02:28:06 np0005539504 kernel: tap28412826-50: entered promiscuous mode
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.313 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28412826-50, col_values=(('external_ids', {'iface-id': '2abf732f-8f8c-470e-b6e2-def265b14d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.315 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:06 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:06Z|00543|binding|INFO|Releasing lport 2abf732f-8f8c-470e-b6e2-def265b14d70 from this chassis (sb_readonly=0)
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.316 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.318 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.320 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9d72e890-7f45-4e10-bf34-0fa81722df14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.320 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-28412826-5463-46e4-95cb-a7d788b1ab15
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 28412826-5463-46e4-95cb-a7d788b1ab15
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:28:06 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:06.321 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'env', 'PROCESS_TAG=haproxy-28412826-5463-46e4-95cb-a7d788b1ab15', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/28412826-5463-46e4-95cb-a7d788b1ab15.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.328 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.634 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.646 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401286.6456692, e53352bf-9c18-47a3-887b-ec5808266bd1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.646 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] VM Started (Lifecycle Event)#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.664 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.669 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401286.6484559, e53352bf-9c18-47a3-887b-ec5808266bd1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.670 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.691 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.698 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:06 np0005539504 nova_compute[187152]: 2025-11-29 07:28:06.728 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:28:06 np0005539504 podman[239158]: 2025-11-29 07:28:06.735729567 +0000 UTC m=+0.062114285 container create d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 29 02:28:06 np0005539504 systemd[1]: Started libpod-conmon-d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21.scope.
Nov 29 02:28:06 np0005539504 podman[239158]: 2025-11-29 07:28:06.703253867 +0000 UTC m=+0.029638595 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:28:06 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:28:06 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb6aef20ac78eb24e89f5a72c7a680c48e0d24aee05ada367e2960e769a20619/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:28:06 np0005539504 podman[239158]: 2025-11-29 07:28:06.838974373 +0000 UTC m=+0.165359131 container init d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:28:06 np0005539504 podman[239158]: 2025-11-29 07:28:06.847468811 +0000 UTC m=+0.173853549 container start d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:28:06 np0005539504 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[239173]: [NOTICE]   (239177) : New worker (239179) forked
Nov 29 02:28:06 np0005539504 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[239173]: [NOTICE]   (239177) : Loading success.
Nov 29 02:28:07 np0005539504 nova_compute[187152]: 2025-11-29 07:28:07.042 187156 DEBUG nova.network.neutron [req-b848a3df-12a4-4f1c-b3fc-50c58905a156 req-44a4430d-580c-4880-814a-10a76494ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Updated VIF entry in instance network info cache for port d83ec027-8cd9-43f6-9b97-5c3794b908e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:28:07 np0005539504 nova_compute[187152]: 2025-11-29 07:28:07.043 187156 DEBUG nova.network.neutron [req-b848a3df-12a4-4f1c-b3fc-50c58905a156 req-44a4430d-580c-4880-814a-10a76494ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Updating instance_info_cache with network_info: [{"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:07 np0005539504 nova_compute[187152]: 2025-11-29 07:28:07.063 187156 DEBUG oslo_concurrency.lockutils [req-b848a3df-12a4-4f1c-b3fc-50c58905a156 req-44a4430d-580c-4880-814a-10a76494ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e53352bf-9c18-47a3-887b-ec5808266bd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:07 np0005539504 nova_compute[187152]: 2025-11-29 07:28:07.445 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:08 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:08Z|00544|binding|INFO|Releasing lport 2abf732f-8f8c-470e-b6e2-def265b14d70 from this chassis (sb_readonly=0)
Nov 29 02:28:08 np0005539504 nova_compute[187152]: 2025-11-29 07:28:08.166 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:09 np0005539504 podman[239188]: 2025-11-29 07:28:09.746526782 +0000 UTC m=+0.069259707 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:28:11 np0005539504 nova_compute[187152]: 2025-11-29 07:28:11.638 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:11 np0005539504 nova_compute[187152]: 2025-11-29 07:28:11.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:12 np0005539504 nova_compute[187152]: 2025-11-29 07:28:12.448 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:12 np0005539504 nova_compute[187152]: 2025-11-29 07:28:12.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:12 np0005539504 nova_compute[187152]: 2025-11-29 07:28:12.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:28:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:14.686 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:46:33 2001:db8:0:1:f816:3eff:feb9:4633 2001:db8::f816:3eff:feb9:4633'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feb9:4633/64 2001:db8::f816:3eff:feb9:4633/64', 'neutron:device_id': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed5ad144-c783-4b67-a226-e0c5588d3535, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f0ce4da0-40ec-44ef-8179-4cbfad9b57f1) old=Port_Binding(mac=['fa:16:3e:b9:46:33 2001:db8::f816:3eff:feb9:4633'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb9:4633/64', 'neutron:device_id': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:14.690 104164 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 in datapath ff387e90-45c2-42d7-b536-fee4d2b6eb5e updated#033[00m
Nov 29 02:28:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:14.693 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff387e90-45c2-42d7-b536-fee4d2b6eb5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:28:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:14.695 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac8b2e4-2484-4be3-860a-d86e7ef67629]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:14 np0005539504 nova_compute[187152]: 2025-11-29 07:28:14.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.819 187156 DEBUG nova.compute.manager [req-ea0f36c9-a5cb-49d3-817b-a7ab1a0ce8c0 req-deeb8093-8d68-4921-ae6d-253f0b99495b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Received event network-vif-plugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.820 187156 DEBUG oslo_concurrency.lockutils [req-ea0f36c9-a5cb-49d3-817b-a7ab1a0ce8c0 req-deeb8093-8d68-4921-ae6d-253f0b99495b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.820 187156 DEBUG oslo_concurrency.lockutils [req-ea0f36c9-a5cb-49d3-817b-a7ab1a0ce8c0 req-deeb8093-8d68-4921-ae6d-253f0b99495b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.820 187156 DEBUG oslo_concurrency.lockutils [req-ea0f36c9-a5cb-49d3-817b-a7ab1a0ce8c0 req-deeb8093-8d68-4921-ae6d-253f0b99495b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.821 187156 DEBUG nova.compute.manager [req-ea0f36c9-a5cb-49d3-817b-a7ab1a0ce8c0 req-deeb8093-8d68-4921-ae6d-253f0b99495b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Processing event network-vif-plugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.822 187156 DEBUG nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.826 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401295.8267667, e53352bf-9c18-47a3-887b-ec5808266bd1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.827 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.829 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.834 187156 INFO nova.virt.libvirt.driver [-] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Instance spawned successfully.#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.834 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.854 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.861 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.865 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.865 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.866 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.866 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.866 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.867 187156 DEBUG nova.virt.libvirt.driver [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.897 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.974 187156 INFO nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Took 21.65 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:28:15 np0005539504 nova_compute[187152]: 2025-11-29 07:28:15.975 187156 DEBUG nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:16 np0005539504 nova_compute[187152]: 2025-11-29 07:28:16.073 187156 INFO nova.compute.manager [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Took 22.77 seconds to build instance.#033[00m
Nov 29 02:28:16 np0005539504 nova_compute[187152]: 2025-11-29 07:28:16.091 187156 DEBUG oslo_concurrency.lockutils [None req-769c84cb-b665-43e5-9b5f-968b97028e68 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:16 np0005539504 nova_compute[187152]: 2025-11-29 07:28:16.642 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.452 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.945 187156 DEBUG nova.compute.manager [req-462c5ee8-3cc9-46f5-b639-ef7ea5dfcc61 req-88965d87-eb9b-4640-9b34-d654cce62c1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Received event network-vif-plugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.946 187156 DEBUG oslo_concurrency.lockutils [req-462c5ee8-3cc9-46f5-b639-ef7ea5dfcc61 req-88965d87-eb9b-4640-9b34-d654cce62c1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.946 187156 DEBUG oslo_concurrency.lockutils [req-462c5ee8-3cc9-46f5-b639-ef7ea5dfcc61 req-88965d87-eb9b-4640-9b34-d654cce62c1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.946 187156 DEBUG oslo_concurrency.lockutils [req-462c5ee8-3cc9-46f5-b639-ef7ea5dfcc61 req-88965d87-eb9b-4640-9b34-d654cce62c1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.946 187156 DEBUG nova.compute.manager [req-462c5ee8-3cc9-46f5-b639-ef7ea5dfcc61 req-88965d87-eb9b-4640-9b34-d654cce62c1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] No waiting events found dispatching network-vif-plugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.947 187156 WARNING nova.compute.manager [req-462c5ee8-3cc9-46f5-b639-ef7ea5dfcc61 req-88965d87-eb9b-4640-9b34-d654cce62c1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Received unexpected event network-vif-plugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.971 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.972 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.973 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:17 np0005539504 nova_compute[187152]: 2025-11-29 07:28:17.974 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.089 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.178 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.187 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.291 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1/disk --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.519 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.523 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5544MB free_disk=73.09010314941406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.523 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.524 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.611 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance e53352bf-9c18-47a3-887b-ec5808266bd1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.611 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.612 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.664 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.684 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.713 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:28:18 np0005539504 nova_compute[187152]: 2025-11-29 07:28:18.713 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:20 np0005539504 nova_compute[187152]: 2025-11-29 07:28:20.715 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:20 np0005539504 nova_compute[187152]: 2025-11-29 07:28:20.717 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:28:20 np0005539504 nova_compute[187152]: 2025-11-29 07:28:20.742 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:28:20 np0005539504 nova_compute[187152]: 2025-11-29 07:28:20.744 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:21 np0005539504 nova_compute[187152]: 2025-11-29 07:28:21.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:21 np0005539504 podman[239220]: 2025-11-29 07:28:21.721490681 +0000 UTC m=+0.060171233 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:28:21 np0005539504 podman[239221]: 2025-11-29 07:28:21.742891535 +0000 UTC m=+0.074586799 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:28:21 np0005539504 podman[239222]: 2025-11-29 07:28:21.762235923 +0000 UTC m=+0.081583537 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:28:21 np0005539504 nova_compute[187152]: 2025-11-29 07:28:21.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:22 np0005539504 nova_compute[187152]: 2025-11-29 07:28:22.500 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:22.969 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:22.970 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:22.970 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:23 np0005539504 nova_compute[187152]: 2025-11-29 07:28:23.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:25 np0005539504 nova_compute[187152]: 2025-11-29 07:28:25.789 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:25 np0005539504 nova_compute[187152]: 2025-11-29 07:28:25.791 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:25 np0005539504 nova_compute[187152]: 2025-11-29 07:28:25.806 187156 DEBUG nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:28:25 np0005539504 podman[239284]: 2025-11-29 07:28:25.887566039 +0000 UTC m=+0.084722612 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:28:25 np0005539504 nova_compute[187152]: 2025-11-29 07:28:25.910 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:25 np0005539504 nova_compute[187152]: 2025-11-29 07:28:25.911 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:25 np0005539504 nova_compute[187152]: 2025-11-29 07:28:25.918 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:28:25 np0005539504 nova_compute[187152]: 2025-11-29 07:28:25.919 187156 INFO nova.compute.claims [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:28:25 np0005539504 podman[239309]: 2025-11-29 07:28:25.976107511 +0000 UTC m=+0.083011605 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.070 187156 DEBUG nova.compute.provider_tree [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.114 187156 DEBUG nova.scheduler.client.report [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.137 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.138 187156 DEBUG nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.192 187156 DEBUG nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.193 187156 DEBUG nova.network.neutron [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.216 187156 INFO nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.238 187156 DEBUG nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.376 187156 DEBUG nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.377 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.378 187156 INFO nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Creating image(s)#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.379 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "/var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.379 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "/var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.380 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "/var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.394 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.455 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.457 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.458 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.476 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.545 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.547 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.584 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.586 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.586 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.610 187156 DEBUG nova.policy [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.647 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.648 187156 DEBUG nova.virt.disk.api [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Checking if we can resize image /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.648 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.704 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.706 187156 DEBUG nova.virt.disk.api [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Cannot resize image /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.706 187156 DEBUG nova.objects.instance [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'migration_context' on Instance uuid 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.720 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.721 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Ensure instance console log exists: /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.722 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.722 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:26 np0005539504 nova_compute[187152]: 2025-11-29 07:28:26.723 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:27 np0005539504 nova_compute[187152]: 2025-11-29 07:28:27.502 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:27 np0005539504 nova_compute[187152]: 2025-11-29 07:28:27.721 187156 DEBUG nova.network.neutron [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Successfully created port: 93283ced-759a-40ca-bc21-9afde2a8218f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:28:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:28Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:02:a1:8d 10.100.0.8
Nov 29 02:28:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:28Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:02:a1:8d 10.100.0.8
Nov 29 02:28:29 np0005539504 nova_compute[187152]: 2025-11-29 07:28:29.722 187156 DEBUG nova.network.neutron [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Successfully updated port: 93283ced-759a-40ca-bc21-9afde2a8218f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:28:29 np0005539504 nova_compute[187152]: 2025-11-29 07:28:29.749 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "refresh_cache-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:29 np0005539504 nova_compute[187152]: 2025-11-29 07:28:29.750 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquired lock "refresh_cache-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:29 np0005539504 nova_compute[187152]: 2025-11-29 07:28:29.750 187156 DEBUG nova.network.neutron [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:28:29 np0005539504 nova_compute[187152]: 2025-11-29 07:28:29.985 187156 DEBUG nova.network.neutron [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.333 187156 DEBUG nova.network.neutron [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Updating instance_info_cache with network_info: [{"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.352 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Releasing lock "refresh_cache-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.353 187156 DEBUG nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Instance network_info: |[{"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.355 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Start _get_guest_xml network_info=[{"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.360 187156 WARNING nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.364 187156 DEBUG nova.virt.libvirt.host [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.365 187156 DEBUG nova.virt.libvirt.host [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.369 187156 DEBUG nova.virt.libvirt.host [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.370 187156 DEBUG nova.virt.libvirt.host [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.371 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.372 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.372 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.372 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.373 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.373 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.373 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.373 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.374 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.374 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.374 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.374 187156 DEBUG nova.virt.hardware [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.378 187156 DEBUG nova.virt.libvirt.vif [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-696958466',display_name='tempest-ServerRescueNegativeTestJSON-server-696958466',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-696958466',id=136,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d1e4f74add34e9b9a2084bd9586db0c',ramdisk_id='',reservation_id='r-t927j07o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1892401049',owner_user_name='tempest-ServerRescueNegativeTestJSON-1892401049-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:26Z,user_data=None,user_id='4863fb992d4c48de9a92f63ffb1174a8',uuid=1e3a87bf-6c8e-413e-af10-e61e32ad5d7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.378 187156 DEBUG nova.network.os_vif_util [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converting VIF {"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.379 187156 DEBUG nova.network.os_vif_util [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:7a:d3,bridge_name='br-int',has_traffic_filtering=True,id=93283ced-759a-40ca-bc21-9afde2a8218f,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93283ced-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.380 187156 DEBUG nova.objects.instance [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.398 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <uuid>1e3a87bf-6c8e-413e-af10-e61e32ad5d7d</uuid>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <name>instance-00000088</name>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-696958466</nova:name>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:28:31</nova:creationTime>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:        <nova:user uuid="4863fb992d4c48de9a92f63ffb1174a8">tempest-ServerRescueNegativeTestJSON-1892401049-project-member</nova:user>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:        <nova:project uuid="5d1e4f74add34e9b9a2084bd9586db0c">tempest-ServerRescueNegativeTestJSON-1892401049</nova:project>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:        <nova:port uuid="93283ced-759a-40ca-bc21-9afde2a8218f">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <entry name="serial">1e3a87bf-6c8e-413e-af10-e61e32ad5d7d</entry>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <entry name="uuid">1e3a87bf-6c8e-413e-af10-e61e32ad5d7d</entry>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.config"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:68:7a:d3"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <target dev="tap93283ced-75"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/console.log" append="off"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:28:31 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:28:31 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:28:31 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:28:31 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.400 187156 DEBUG nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Preparing to wait for external event network-vif-plugged-93283ced-759a-40ca-bc21-9afde2a8218f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.401 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.401 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.401 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.402 187156 DEBUG nova.virt.libvirt.vif [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-696958466',display_name='tempest-ServerRescueNegativeTestJSON-server-696958466',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-696958466',id=136,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d1e4f74add34e9b9a2084bd9586db0c',ramdisk_id='',reservation_id='r-t927j07o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1892401049',owner_us
er_name='tempest-ServerRescueNegativeTestJSON-1892401049-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:26Z,user_data=None,user_id='4863fb992d4c48de9a92f63ffb1174a8',uuid=1e3a87bf-6c8e-413e-af10-e61e32ad5d7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.403 187156 DEBUG nova.network.os_vif_util [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converting VIF {"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.403 187156 DEBUG nova.network.os_vif_util [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:7a:d3,bridge_name='br-int',has_traffic_filtering=True,id=93283ced-759a-40ca-bc21-9afde2a8218f,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93283ced-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.404 187156 DEBUG os_vif [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:7a:d3,bridge_name='br-int',has_traffic_filtering=True,id=93283ced-759a-40ca-bc21-9afde2a8218f,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93283ced-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.404 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.405 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.406 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.411 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.411 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93283ced-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.412 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap93283ced-75, col_values=(('external_ids', {'iface-id': '93283ced-759a-40ca-bc21-9afde2a8218f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:7a:d3', 'vm-uuid': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.414 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539504 NetworkManager[55210]: <info>  [1764401311.4155] manager: (tap93283ced-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.417 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.421 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.423 187156 INFO os_vif [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:7a:d3,bridge_name='br-int',has_traffic_filtering=True,id=93283ced-759a-40ca-bc21-9afde2a8218f,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93283ced-75')#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.484 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.485 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.486 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No VIF found with MAC fa:16:3e:68:7a:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.487 187156 INFO nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Using config drive#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.586 187156 DEBUG nova.compute.manager [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Received event network-changed-93283ced-759a-40ca-bc21-9afde2a8218f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.587 187156 DEBUG nova.compute.manager [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Refreshing instance network info cache due to event network-changed-93283ced-759a-40ca-bc21-9afde2a8218f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.588 187156 DEBUG oslo_concurrency.lockutils [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.589 187156 DEBUG oslo_concurrency.lockutils [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.589 187156 DEBUG nova.network.neutron [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Refreshing network info cache for port 93283ced-759a-40ca-bc21-9afde2a8218f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.646 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.911 187156 INFO nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Creating config drive at /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.config#033[00m
Nov 29 02:28:31 np0005539504 nova_compute[187152]: 2025-11-29 07:28:31.917 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0m3f5qde execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:32 np0005539504 nova_compute[187152]: 2025-11-29 07:28:32.062 187156 DEBUG oslo_concurrency.processutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0m3f5qde" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:32 np0005539504 kernel: tap93283ced-75: entered promiscuous mode
Nov 29 02:28:32 np0005539504 NetworkManager[55210]: <info>  [1764401312.1549] manager: (tap93283ced-75): new Tun device (/org/freedesktop/NetworkManager/Devices/245)
Nov 29 02:28:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:32Z|00545|binding|INFO|Claiming lport 93283ced-759a-40ca-bc21-9afde2a8218f for this chassis.
Nov 29 02:28:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:32Z|00546|binding|INFO|93283ced-759a-40ca-bc21-9afde2a8218f: Claiming fa:16:3e:68:7a:d3 10.100.0.14
Nov 29 02:28:32 np0005539504 nova_compute[187152]: 2025-11-29 07:28:32.155 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:32 np0005539504 nova_compute[187152]: 2025-11-29 07:28:32.160 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:32 np0005539504 nova_compute[187152]: 2025-11-29 07:28:32.167 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.174 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:7a:d3 10.100.0.14'], port_security=['fa:16:3e:68:7a:d3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-008329a1-d4dc-4cfb-be68-95f658d9813d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b04b69be-f431-4979-89c6-4e231888644a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7e71bac-297c-4031-8579-254c834f5859, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=93283ced-759a-40ca-bc21-9afde2a8218f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.176 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 93283ced-759a-40ca-bc21-9afde2a8218f in datapath 008329a1-d4dc-4cfb-be68-95f658d9813d bound to our chassis#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.177 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 008329a1-d4dc-4cfb-be68-95f658d9813d#033[00m
Nov 29 02:28:32 np0005539504 systemd-udevd[239390]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.243 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dc989ba8-6e02-4d01-9e3a-8dcdb790807b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.245 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap008329a1-d1 in ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.246 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap008329a1-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.246 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b659120a-9c76-47ad-bd16-f3a34edcc125]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.247 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[db681477-bc7d-407e-82f8-594c9f0b143f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 systemd-machined[153423]: New machine qemu-71-instance-00000088.
Nov 29 02:28:32 np0005539504 NetworkManager[55210]: <info>  [1764401312.2566] device (tap93283ced-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:28:32 np0005539504 NetworkManager[55210]: <info>  [1764401312.2578] device (tap93283ced-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:28:32 np0005539504 systemd[1]: Started Virtual Machine qemu-71-instance-00000088.
Nov 29 02:28:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:32Z|00547|binding|INFO|Setting lport 93283ced-759a-40ca-bc21-9afde2a8218f ovn-installed in OVS
Nov 29 02:28:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:32Z|00548|binding|INFO|Setting lport 93283ced-759a-40ca-bc21-9afde2a8218f up in Southbound
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.273 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d819a2-01fe-42e3-a3d7-0b740c078c30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 nova_compute[187152]: 2025-11-29 07:28:32.273 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.286 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[952eefc1-379e-452b-b08d-78c4d9948dfe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.333 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bacd2f-b440-4667-8d98-561ad3d4a3e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 systemd-udevd[239394]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:28:32 np0005539504 NetworkManager[55210]: <info>  [1764401312.3420] manager: (tap008329a1-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/246)
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.340 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2fbbd5da-0e1c-4110-adfd-1181aa93b247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.392 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[eff89d41-3b86-4359-a09e-03325cb8a27c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.397 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[37f2d5a0-ef56-4627-ac96-e1a2b754f457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 NetworkManager[55210]: <info>  [1764401312.4276] device (tap008329a1-d0): carrier: link connected
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.437 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[7c98f6d6-f198-466d-9c8d-05725610a466]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.458 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fee91f7b-2085-48df-a0fc-df2f47c8f585]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap008329a1-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:0d:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677530, 'reachable_time': 32531, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239423, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.481 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6aa265-0e1f-4251-818b-6b0eb9e80b0b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:da9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 677530, 'tstamp': 677530}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239424, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.506 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[580ab11e-afb4-496d-ac2e-d6aa39a70195]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap008329a1-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:0d:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677530, 'reachable_time': 32531, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239425, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.548 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d06bd86e-e975-42d3-a798-4b42856a0130]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.623 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3221ce65-c57f-433b-8ec0-9c1ecb2cee06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.625 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap008329a1-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.626 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.626 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap008329a1-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:32 np0005539504 nova_compute[187152]: 2025-11-29 07:28:32.629 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:32 np0005539504 kernel: tap008329a1-d0: entered promiscuous mode
Nov 29 02:28:32 np0005539504 NetworkManager[55210]: <info>  [1764401312.6300] manager: (tap008329a1-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Nov 29 02:28:32 np0005539504 nova_compute[187152]: 2025-11-29 07:28:32.632 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.639 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap008329a1-d0, col_values=(('external_ids', {'iface-id': 'd36011d9-2f3d-4616-b3ba-40f6405df460'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:32Z|00549|binding|INFO|Releasing lport d36011d9-2f3d-4616-b3ba-40f6405df460 from this chassis (sb_readonly=0)
Nov 29 02:28:32 np0005539504 nova_compute[187152]: 2025-11-29 07:28:32.641 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.643 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/008329a1-d4dc-4cfb-be68-95f658d9813d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/008329a1-d4dc-4cfb-be68-95f658d9813d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.644 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[40e507b4-58aa-4354-b925-24f7bdedc664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.645 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-008329a1-d4dc-4cfb-be68-95f658d9813d
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/008329a1-d4dc-4cfb-be68-95f658d9813d.pid.haproxy
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 008329a1-d4dc-4cfb-be68-95f658d9813d
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:28:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:32.646 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'env', 'PROCESS_TAG=haproxy-008329a1-d4dc-4cfb-be68-95f658d9813d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/008329a1-d4dc-4cfb-be68-95f658d9813d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:28:32 np0005539504 nova_compute[187152]: 2025-11-29 07:28:32.660 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:32 np0005539504 podman[239432]: 2025-11-29 07:28:32.769736782 +0000 UTC m=+0.090172687 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.029 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401313.0279114, 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.029 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] VM Started (Lifecycle Event)#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.065 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.069 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401313.0282068, 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.070 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:28:33 np0005539504 podman[239484]: 2025-11-29 07:28:33.074127267 +0000 UTC m=+0.057497812 container create 068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.098 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.106 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:33 np0005539504 systemd[1]: Started libpod-conmon-068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1.scope.
Nov 29 02:28:33 np0005539504 podman[239484]: 2025-11-29 07:28:33.039293033 +0000 UTC m=+0.022663608 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.137 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:28:33 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:28:33 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d285e81b70b288fccf791228e02e7cd5bebf335101d655b9a1f6d98e0d25770/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:28:33 np0005539504 podman[239484]: 2025-11-29 07:28:33.177858386 +0000 UTC m=+0.161228961 container init 068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:28:33 np0005539504 podman[239484]: 2025-11-29 07:28:33.183829496 +0000 UTC m=+0.167200051 container start 068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:28:33 np0005539504 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[239499]: [NOTICE]   (239503) : New worker (239505) forked
Nov 29 02:28:33 np0005539504 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[239499]: [NOTICE]   (239503) : Loading success.
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.410 187156 DEBUG nova.network.neutron [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Updated VIF entry in instance network info cache for port 93283ced-759a-40ca-bc21-9afde2a8218f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.411 187156 DEBUG nova.network.neutron [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Updating instance_info_cache with network_info: [{"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.436 187156 DEBUG oslo_concurrency.lockutils [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.500 187156 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Received event network-vif-plugged-93283ced-759a-40ca-bc21-9afde2a8218f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.501 187156 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.502 187156 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.502 187156 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.503 187156 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Processing event network-vif-plugged-93283ced-759a-40ca-bc21-9afde2a8218f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.503 187156 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Received event network-vif-plugged-93283ced-759a-40ca-bc21-9afde2a8218f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.503 187156 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.504 187156 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.504 187156 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.505 187156 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] No waiting events found dispatching network-vif-plugged-93283ced-759a-40ca-bc21-9afde2a8218f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.505 187156 WARNING nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Received unexpected event network-vif-plugged-93283ced-759a-40ca-bc21-9afde2a8218f for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.506 187156 DEBUG nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.511 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401313.511271, 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.512 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.515 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.519 187156 INFO nova.virt.libvirt.driver [-] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Instance spawned successfully.#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.519 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.550 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.561 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.566 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.567 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.568 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.568 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.569 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.569 187156 DEBUG nova.virt.libvirt.driver [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.607 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.716 187156 INFO nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Took 7.34 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.717 187156 DEBUG nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.846 187156 INFO nova.compute.manager [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Took 7.98 seconds to build instance.#033[00m
Nov 29 02:28:33 np0005539504 nova_compute[187152]: 2025-11-29 07:28:33.878 187156 DEBUG oslo_concurrency.lockutils [None req-99b704cb-5674-4541-ab76-a20b172306b4 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:36 np0005539504 nova_compute[187152]: 2025-11-29 07:28:36.415 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:36 np0005539504 nova_compute[187152]: 2025-11-29 07:28:36.648 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:38 np0005539504 nova_compute[187152]: 2025-11-29 07:28:38.124 187156 DEBUG oslo_concurrency.lockutils [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "e53352bf-9c18-47a3-887b-ec5808266bd1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:38 np0005539504 nova_compute[187152]: 2025-11-29 07:28:38.125 187156 DEBUG oslo_concurrency.lockutils [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:38 np0005539504 nova_compute[187152]: 2025-11-29 07:28:38.125 187156 DEBUG oslo_concurrency.lockutils [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:38 np0005539504 nova_compute[187152]: 2025-11-29 07:28:38.125 187156 DEBUG oslo_concurrency.lockutils [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:38 np0005539504 nova_compute[187152]: 2025-11-29 07:28:38.125 187156 DEBUG oslo_concurrency.lockutils [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:39 np0005539504 nova_compute[187152]: 2025-11-29 07:28:39.249 187156 INFO nova.compute.manager [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Terminating instance#033[00m
Nov 29 02:28:39 np0005539504 nova_compute[187152]: 2025-11-29 07:28:39.355 187156 DEBUG nova.compute.manager [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:28:39 np0005539504 kernel: tapd83ec027-8c (unregistering): left promiscuous mode
Nov 29 02:28:39 np0005539504 NetworkManager[55210]: <info>  [1764401319.3850] device (tapd83ec027-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:28:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:39Z|00550|binding|INFO|Releasing lport d83ec027-8cd9-43f6-9b97-5c3794b908e3 from this chassis (sb_readonly=0)
Nov 29 02:28:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:39Z|00551|binding|INFO|Setting lport d83ec027-8cd9-43f6-9b97-5c3794b908e3 down in Southbound
Nov 29 02:28:39 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:39Z|00552|binding|INFO|Removing iface tapd83ec027-8c ovn-installed in OVS
Nov 29 02:28:39 np0005539504 nova_compute[187152]: 2025-11-29 07:28:39.399 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:39 np0005539504 nova_compute[187152]: 2025-11-29 07:28:39.414 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:39.418 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:a1:8d 10.100.0.8'], port_security=['fa:16:3e:02:a1:8d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e53352bf-9c18-47a3-887b-ec5808266bd1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=d83ec027-8cd9-43f6-9b97-5c3794b908e3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:39 np0005539504 nova_compute[187152]: 2025-11-29 07:28:39.418 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:39.422 104164 INFO neutron.agent.ovn.metadata.agent [-] Port d83ec027-8cd9-43f6-9b97-5c3794b908e3 in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 unbound from our chassis#033[00m
Nov 29 02:28:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:39.424 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28412826-5463-46e4-95cb-a7d788b1ab15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:28:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:39.426 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[efcad457-51c4-4987-92e8-b1e237126f2d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:39.427 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 namespace which is not needed anymore#033[00m
Nov 29 02:28:39 np0005539504 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000085.scope: Deactivated successfully.
Nov 29 02:28:39 np0005539504 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000085.scope: Consumed 14.143s CPU time.
Nov 29 02:28:39 np0005539504 systemd-machined[153423]: Machine qemu-70-instance-00000085 terminated.
Nov 29 02:28:39 np0005539504 nova_compute[187152]: 2025-11-29 07:28:39.635 187156 INFO nova.virt.libvirt.driver [-] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Instance destroyed successfully.#033[00m
Nov 29 02:28:39 np0005539504 nova_compute[187152]: 2025-11-29 07:28:39.636 187156 DEBUG nova.objects.instance [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'resources' on Instance uuid e53352bf-9c18-47a3-887b-ec5808266bd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:40 np0005539504 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[239173]: [NOTICE]   (239177) : haproxy version is 2.8.14-c23fe91
Nov 29 02:28:40 np0005539504 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[239173]: [NOTICE]   (239177) : path to executable is /usr/sbin/haproxy
Nov 29 02:28:40 np0005539504 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[239173]: [WARNING]  (239177) : Exiting Master process...
Nov 29 02:28:40 np0005539504 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[239173]: [WARNING]  (239177) : Exiting Master process...
Nov 29 02:28:40 np0005539504 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[239173]: [ALERT]    (239177) : Current worker (239179) exited with code 143 (Terminated)
Nov 29 02:28:40 np0005539504 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[239173]: [WARNING]  (239177) : All workers exited. Exiting... (0)
Nov 29 02:28:40 np0005539504 systemd[1]: libpod-d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21.scope: Deactivated successfully.
Nov 29 02:28:40 np0005539504 podman[239536]: 2025-11-29 07:28:40.617031145 +0000 UTC m=+1.072904907 container died d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.795 187156 DEBUG nova.virt.libvirt.vif [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:27:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-881623161',display_name='tempest-ListServerFiltersTestJSON-instance-881623161',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-881623161',id=133,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:28:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-0d45dxkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:28:16Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=e53352bf-9c18-47a3-887b-ec5808266bd1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.796 187156 DEBUG nova.network.os_vif_util [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "address": "fa:16:3e:02:a1:8d", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83ec027-8c", "ovs_interfaceid": "d83ec027-8cd9-43f6-9b97-5c3794b908e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.797 187156 DEBUG nova.network.os_vif_util [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:a1:8d,bridge_name='br-int',has_traffic_filtering=True,id=d83ec027-8cd9-43f6-9b97-5c3794b908e3,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83ec027-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.797 187156 DEBUG os_vif [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:a1:8d,bridge_name='br-int',has_traffic_filtering=True,id=d83ec027-8cd9-43f6-9b97-5c3794b908e3,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83ec027-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.800 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.801 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd83ec027-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.806 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.809 187156 INFO os_vif [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:a1:8d,bridge_name='br-int',has_traffic_filtering=True,id=d83ec027-8cd9-43f6-9b97-5c3794b908e3,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83ec027-8c')#033[00m
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.810 187156 INFO nova.virt.libvirt.driver [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Deleting instance files /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1_del#033[00m
Nov 29 02:28:40 np0005539504 nova_compute[187152]: 2025-11-29 07:28:40.811 187156 INFO nova.virt.libvirt.driver [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Deletion of /var/lib/nova/instances/e53352bf-9c18-47a3-887b-ec5808266bd1_del complete#033[00m
Nov 29 02:28:40 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21-userdata-shm.mount: Deactivated successfully.
Nov 29 02:28:40 np0005539504 systemd[1]: var-lib-containers-storage-overlay-fb6aef20ac78eb24e89f5a72c7a680c48e0d24aee05ada367e2960e769a20619-merged.mount: Deactivated successfully.
Nov 29 02:28:40 np0005539504 podman[239567]: 2025-11-29 07:28:40.958698549 +0000 UTC m=+0.306025771 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:28:41 np0005539504 podman[239536]: 2025-11-29 07:28:41.186855011 +0000 UTC m=+1.642728773 container cleanup d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:28:41 np0005539504 systemd[1]: libpod-conmon-d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21.scope: Deactivated successfully.
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.446 187156 DEBUG nova.compute.manager [req-8e589058-c1db-4c77-b9ef-2d4612f8c0e4 req-3afd4dfd-e9dc-45dd-83d3-1da9ff6a4631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Received event network-vif-unplugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.447 187156 DEBUG oslo_concurrency.lockutils [req-8e589058-c1db-4c77-b9ef-2d4612f8c0e4 req-3afd4dfd-e9dc-45dd-83d3-1da9ff6a4631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.447 187156 DEBUG oslo_concurrency.lockutils [req-8e589058-c1db-4c77-b9ef-2d4612f8c0e4 req-3afd4dfd-e9dc-45dd-83d3-1da9ff6a4631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.447 187156 DEBUG oslo_concurrency.lockutils [req-8e589058-c1db-4c77-b9ef-2d4612f8c0e4 req-3afd4dfd-e9dc-45dd-83d3-1da9ff6a4631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.448 187156 DEBUG nova.compute.manager [req-8e589058-c1db-4c77-b9ef-2d4612f8c0e4 req-3afd4dfd-e9dc-45dd-83d3-1da9ff6a4631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] No waiting events found dispatching network-vif-unplugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.448 187156 DEBUG nova.compute.manager [req-8e589058-c1db-4c77-b9ef-2d4612f8c0e4 req-3afd4dfd-e9dc-45dd-83d3-1da9ff6a4631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Received event network-vif-unplugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:28:41 np0005539504 podman[239601]: 2025-11-29 07:28:41.472154304 +0000 UTC m=+0.252123736 container remove d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.477 187156 INFO nova.compute.manager [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Took 2.12 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.478 187156 DEBUG oslo.service.loopingcall [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.478 187156 DEBUG nova.compute.manager [-] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.478 187156 DEBUG nova.network.neutron [-] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:28:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:41.478 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7968e49d-4f14-40be-9e80-b528fa2136db]: (4, ('Sat Nov 29 07:28:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 (d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21)\nd22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21\nSat Nov 29 07:28:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 (d22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21)\nd22d0b8a1e543e0fe8ab6ba4f209bd266850803e4b741aefd212e99b63765b21\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:41.482 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a96344e6-4546-497a-bf75-3928ec888609]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:41.484 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28412826-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.552 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:41 np0005539504 kernel: tap28412826-50: left promiscuous mode
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.569 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:41.574 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3288679a-e807-464c-9fba-94dfed1faa7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:41.595 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[98b587b1-6711-4301-83da-361014f5459b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:41.596 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[149a52f1-86d4-4e8e-a5f6-7b795e3f9e83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:41.613 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ad23180d-b25c-4f66-81d1-4b0bdac430e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674889, 'reachable_time': 15703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239617, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:41.616 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:28:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:41.616 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[42bc48e2-aa89-4c30-9611-597061debe7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:41 np0005539504 systemd[1]: run-netns-ovnmeta\x2d28412826\x2d5463\x2d46e4\x2d95cb\x2da7d788b1ab15.mount: Deactivated successfully.
Nov 29 02:28:41 np0005539504 nova_compute[187152]: 2025-11-29 07:28:41.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.852 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:43.854 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:43.857 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.868 187156 DEBUG nova.compute.manager [req-cd3cc453-0dc8-4f64-8712-2a2f24a062eb req-957a8e2c-696c-410e-aeb6-2d6cbf29109d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Received event network-vif-deleted-d83ec027-8cd9-43f6-9b97-5c3794b908e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.869 187156 INFO nova.compute.manager [req-cd3cc453-0dc8-4f64-8712-2a2f24a062eb req-957a8e2c-696c-410e-aeb6-2d6cbf29109d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Neutron deleted interface d83ec027-8cd9-43f6-9b97-5c3794b908e3; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.870 187156 DEBUG nova.network.neutron [req-cd3cc453-0dc8-4f64-8712-2a2f24a062eb req-957a8e2c-696c-410e-aeb6-2d6cbf29109d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.873 187156 DEBUG nova.compute.manager [req-49b75b0b-f4aa-4019-9c03-731d7aee88ef req-75d57596-b34e-4e25-99d5-76b2eb7d7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Received event network-vif-plugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.873 187156 DEBUG oslo_concurrency.lockutils [req-49b75b0b-f4aa-4019-9c03-731d7aee88ef req-75d57596-b34e-4e25-99d5-76b2eb7d7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.874 187156 DEBUG oslo_concurrency.lockutils [req-49b75b0b-f4aa-4019-9c03-731d7aee88ef req-75d57596-b34e-4e25-99d5-76b2eb7d7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.874 187156 DEBUG oslo_concurrency.lockutils [req-49b75b0b-f4aa-4019-9c03-731d7aee88ef req-75d57596-b34e-4e25-99d5-76b2eb7d7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.874 187156 DEBUG nova.compute.manager [req-49b75b0b-f4aa-4019-9c03-731d7aee88ef req-75d57596-b34e-4e25-99d5-76b2eb7d7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] No waiting events found dispatching network-vif-plugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.874 187156 WARNING nova.compute.manager [req-49b75b0b-f4aa-4019-9c03-731d7aee88ef req-75d57596-b34e-4e25-99d5-76b2eb7d7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Received unexpected event network-vif-plugged-d83ec027-8cd9-43f6-9b97-5c3794b908e3 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.887 187156 DEBUG nova.network.neutron [-] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.924 187156 INFO nova.compute.manager [-] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Took 2.45 seconds to deallocate network for instance.#033[00m
Nov 29 02:28:43 np0005539504 nova_compute[187152]: 2025-11-29 07:28:43.930 187156 DEBUG nova.compute.manager [req-cd3cc453-0dc8-4f64-8712-2a2f24a062eb req-957a8e2c-696c-410e-aeb6-2d6cbf29109d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Detach interface failed, port_id=d83ec027-8cd9-43f6-9b97-5c3794b908e3, reason: Instance e53352bf-9c18-47a3-887b-ec5808266bd1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:28:45 np0005539504 nova_compute[187152]: 2025-11-29 07:28:45.803 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:28:45.860 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:46 np0005539504 nova_compute[187152]: 2025-11-29 07:28:46.645 187156 DEBUG oslo_concurrency.lockutils [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:46 np0005539504 nova_compute[187152]: 2025-11-29 07:28:46.646 187156 DEBUG oslo_concurrency.lockutils [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:46 np0005539504 nova_compute[187152]: 2025-11-29 07:28:46.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:46 np0005539504 nova_compute[187152]: 2025-11-29 07:28:46.746 187156 DEBUG nova.compute.provider_tree [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:28:46 np0005539504 nova_compute[187152]: 2025-11-29 07:28:46.821 187156 DEBUG nova.scheduler.client.report [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:28:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:47.980 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000088', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'hostId': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:28:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:47.981 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.004 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/cpu volume: 11260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0388ad2a-c457-40de-8208-5ce7e8a88853', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11260000000, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'timestamp': '2025-11-29T07:28:47.981798', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0b799ae0-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.939171878, 'message_signature': 'd7bf257b7a2b14695ddba81fdc4391924cabf7804870d024649296c1e42c94cf'}]}, 'timestamp': '2025-11-29 07:28:48.006489', '_unique_id': 'd20092f4cb8148959ffa257cbcca5981'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.011 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.011 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-696958466>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-696958466>]
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.011 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.014 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d / tap93283ced-75 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.015 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a03aa899-3ded-4dcf-a061-8b5ab1172a34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.011596', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b7b0bbe-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': '60c7ce01f76d66242b263009d74eda018a1ebbef4fe4ef74955234f082d47c8f'}]}, 'timestamp': '2025-11-29 07:28:48.015471', '_unique_id': 'c4fdd5816d534dc0866a48e8bf1f052f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.016 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.017 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.052 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.read.requests volume: 838 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.053 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.read.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bc0f2f4-3ae2-460e-aca8-11d8b4c115dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 838, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-vda', 'timestamp': '2025-11-29T07:28:48.017242', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b80e264-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': 'ae138b40b542b222a02e2b40c02aaa4e14b4d26d446b6c5d2ffc2c8f019e6959'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-sda', 'timestamp': '2025-11-29T07:28:48.017242', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b80efb6-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': '24cea1c8f3b5d1855cecd1431534b237ca29970a57b7fd6c261a3055e69fd09c'}]}, 'timestamp': '2025-11-29 07:28:48.053975', '_unique_id': 'ef138eeca96f4e538e4736c9d044d4e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.055 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.068 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.068 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '098fdd35-b675-4c09-8e36-58d83a16840f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-vda', 'timestamp': '2025-11-29T07:28:48.055814', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b832fd8-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.990511744, 'message_signature': 'b64773eee011045fb1dd9978b5868a2486f8ffb2326dc08fc5bdae9ebdcbfb7d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-sda', 'timestamp': '2025-11-29T07:28:48.055814', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b833ae6-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.990511744, 'message_signature': '0c9d2a0b0e28d86b2a2f6750dee7ab6a5f807d98adf093f7cc9c9cf055d5343c'}]}, 'timestamp': '2025-11-29 07:28:48.068999', '_unique_id': '9e111f6b7edc4676965419ba7f04181d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.070 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7484feab-763c-44f6-91d3-0dbdb52a7c28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.070799', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b838bd6-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': 'f8ca0b334383dcf9f3a656fe46b5a3e4c2fdae416a5307a65489008f9a87203c'}]}, 'timestamp': '2025-11-29 07:28:48.071102', '_unique_id': '8b738747c74d4be9be5b463f0e409f28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.074 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf99e748-d174-444e-9edb-84abb2dfb9d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.073972', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b840e8a-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': '6349f5362791b3ac77215144320a78f8ed52f8cd0b00e444f9c46f8b7da584ad'}]}, 'timestamp': '2025-11-29 07:28:48.074606', '_unique_id': '2328c5e5a57b4bec9c01769615dbea37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.077 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.077 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-696958466>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-696958466>]
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.078 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.078 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-696958466>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-696958466>]
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.078 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.078 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba720fb0-f5a6-48f4-8d42-cc375e2672fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.078868', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b84cc4e-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': '49a4460a0bd28e41071e68284fa9c90511174cd2a9c46e60b8b25118af3661c6'}]}, 'timestamp': '2025-11-29 07:28:48.079519', '_unique_id': 'cb729c14f90b4affaad95caa3b45f8f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.082 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6273e9e5-a653-4f5e-8ec8-6ed6f9bc4414', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.082470', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b8558b2-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': '90ef413d50d4d8759c70acae495e74e81a271a18fdd063e80ef62942e2b95eab'}]}, 'timestamp': '2025-11-29 07:28:48.083039', '_unique_id': 'e1ee67adf07f429abaef559f1bcc09b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.085 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.085 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e0a4c27-ba84-4185-87c3-e422684b1c44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.085889', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b85dcba-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': '7226e865e414aee4296cba9385c353906c625c5846ea97ae3eecc83c5bba1c00'}]}, 'timestamp': '2025-11-29 07:28:48.086394', '_unique_id': 'fc3eb818a03a4092b89691e720cd8cf9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.087 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8b3b13a-7632-4ac8-bd4b-57aa19e44c44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.087970', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b862ab2-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': '12196968ff85461f7624755e35ed3c9c9bdbe19adc4c45a9afbae9bf35e40610'}]}, 'timestamp': '2025-11-29 07:28:48.088293', '_unique_id': '8cf3cd6dcfa24d2ab604814d2cf3360d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.090 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.write.latency volume: 30394457679 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.090 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c6e34af-e742-4f5e-8908-8b540544a983', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30394457679, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-vda', 'timestamp': '2025-11-29T07:28:48.090103', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b867e22-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': 'ad5b58f812fb51842e56153d144b24ac791c1ce29a2f8ca10d0a3d56ed6c95ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-sda', 'timestamp': '2025-11-29T07:28:48.090103', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b868ac0-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': '24b2ab0ee9a62ea5148a0f24a47a31584c9293fc847dd3740973cfcf524a455a'}]}, 'timestamp': '2025-11-29 07:28:48.090730', '_unique_id': '5a6df553cc8444eaa585f76dc1f6f47f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.092 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.read.latency volume: 176896396 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.092 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.read.latency volume: 5274202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34dafdc4-9d74-4173-b802-90ea22bfd7af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 176896396, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-vda', 'timestamp': '2025-11-29T07:28:48.092316', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b86d5ca-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': 'b1fa1cad0b832faf2edee2dd165981f1f9259ba636a473484f1d8cc6cb215ec5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5274202, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 
'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-sda', 'timestamp': '2025-11-29T07:28:48.092316', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b86e0e2-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': 'f4eba9fd4f1995ed78c2ace11f52d15bee4a3142c6d7bcb014c2ecba00ac1b89'}]}, 'timestamp': '2025-11-29 07:28:48.092932', '_unique_id': '0b658a3c7a53431587642326f49e7390'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.094 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.094 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2052692-0014-4ca3-b868-62c82cb9f732', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-vda', 'timestamp': '2025-11-29T07:28:48.094582', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b872d0e-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.990511744, 'message_signature': '861a54f41c6a01c184b9d2edc321ce27bd2e606c3960316b8ed464c716bd2cd1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 
'1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-sda', 'timestamp': '2025-11-29T07:28:48.094582', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b873812-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.990511744, 'message_signature': 'ca835dc57d62d291154c2ae8de72e97d9845af7a4ee0038828a6938963ac18bd'}]}, 'timestamp': '2025-11-29 07:28:48.095167', '_unique_id': '512c9f5a72ad450e96d66f8aaad05a83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.096 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.096 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-696958466>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-696958466>]
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.097 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8923879f-b36d-4bf1-8435-73e56a7b403f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.097220', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b879550-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': 'bcd8d73f930eeabc543ab81ff458c22ffcc5791b128bf034a52c649a15448301'}]}, 'timestamp': '2025-11-29 07:28:48.097578', '_unique_id': 'd3cfa955da0c486c8f9055196ec63ea4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.099 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.099 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57cdda68-4b2f-4e30-ba5c-b864b30dc225', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'timestamp': '2025-11-29T07:28:48.099123', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0b87dee8-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.939171878, 'message_signature': '6e906d667403990e2fd841fc76697888f7cb04b491966b2e7bd633d1c7b936de'}]}, 'timestamp': '2025-11-29 07:28:48.099487', '_unique_id': 'eb3ed688f19f4cdfac361e270d38cded'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.100 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.101 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.write.bytes volume: 25616384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.101 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a4ca88e-9403-41cc-a1c0-7d3e11e5f3aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25616384, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-vda', 'timestamp': '2025-11-29T07:28:48.101116', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b882d80-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': '55cffe0b7853eed707fefbd045c0c9b0fb551d431f742bd99e7c6e5e66af760c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-sda', 'timestamp': '2025-11-29T07:28:48.101116', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b883e1a-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': '320cf997704e0d32a47e5770b002b6bd6f3400630cc47e4f770116e492e555fb'}]}, 'timestamp': '2025-11-29 07:28:48.101899', '_unique_id': '4c123976e98a4d95b69c414bf1a17ed7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.103 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.103 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '705d57fa-3514-41db-800b-8152936e40bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-vda', 'timestamp': '2025-11-29T07:28:48.103632', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b888f1e-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.990511744, 'message_signature': '1536ca33a1dbc71bdb742141ef77b9243a7b09ddcf58c26e37e6c39108a419cd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-sda', 'timestamp': '2025-11-29T07:28:48.103632', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b889a9a-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.990511744, 'message_signature': '6e857192fdcf589b46632691a0bff8b38bd870ef592bc0416f93818307a12027'}]}, 'timestamp': '2025-11-29 07:28:48.104243', '_unique_id': '47bbf0128bef44f6bc909fe1957794c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.106 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b53112c-de68-4415-a516-dc0b1d28e97f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.105978', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b88eab8-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': 'dfe55e696d6b147364ff032947786b22f57facbdb02ed5402c75ba8ade1f61a0'}]}, 'timestamp': '2025-11-29 07:28:48.106319', '_unique_id': '30409842a07e475294a937b188fe9717'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.107 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.read.bytes volume: 25349632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.108 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.read.bytes volume: 55474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fc50810-419c-4d7c-bbcd-6fbbc4436be3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25349632, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-vda', 'timestamp': '2025-11-29T07:28:48.107952', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b893752-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': '71003564606cf0d83b51d4ce9b93e143fefbe7116791c71673f91e1bf05b9855'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55474, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-sda', 'timestamp': '2025-11-29T07:28:48.107952', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b894436-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': '7ca93c1262705ae964a2047f0a628126dc44a72b5db0f71deafe8d41d84f1449'}]}, 'timestamp': '2025-11-29 07:28:48.108590', '_unique_id': '52eb1e68cbc34d55bfda66707a50c971'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.110 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '892a446d-d146-40c0-a027-383f3b6f4b33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000088-1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-tap93283ced-75', 'timestamp': '2025-11-29T07:28:48.110252', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'tap93283ced-75', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:68:7a:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93283ced-75'}, 'message_id': '0b899256-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.946290419, 'message_signature': '4fb36a5a1e259f5a64cbd5b289c3e3cafff04fe8611e663dee02f2bc3730c92a'}]}, 'timestamp': '2025-11-29 07:28:48.110608', '_unique_id': '44c4dcecf764455f85a8dc7842a629ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.112 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.write.requests volume: 232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.112 12 DEBUG ceilometer.compute.pollsters [-] 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6be5b625-ffdd-4f66-b4b4-59d2d583c8a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 232, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-vda', 'timestamp': '2025-11-29T07:28:48.112414', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b89e616-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': 'ef6105c6eb6bfa169fb89669e39e80eb6e33509bd81153f2d24260278c15b469'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 
'project_name': None, 'resource_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-sda', 'timestamp': '2025-11-29T07:28:48.112414', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-696958466', 'name': 'instance-00000088', 'instance_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'instance_type': 'm1.nano', 'host': '1ccb54d27a90cfabba7749849227ab4a0a7e2ebf49cca99d56f9a5c1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b89f174-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6790.95192231, 'message_signature': '9b4db600599c4a878b9d35ffba2269b4a979d0a6919f2e007431472a9dac12cc'}]}, 'timestamp': '2025-11-29 07:28:48.113022', '_unique_id': 'c8b12a9f3a98430c89b303d0a1c6dba9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:28:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:48Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:68:7a:d3 10.100.0.14
Nov 29 02:28:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:28:48Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:68:7a:d3 10.100.0.14
Nov 29 02:28:50 np0005539504 nova_compute[187152]: 2025-11-29 07:28:50.806 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:51 np0005539504 nova_compute[187152]: 2025-11-29 07:28:51.320 187156 DEBUG oslo_concurrency.lockutils [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 4.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:51 np0005539504 nova_compute[187152]: 2025-11-29 07:28:51.432 187156 INFO nova.scheduler.client.report [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Deleted allocations for instance e53352bf-9c18-47a3-887b-ec5808266bd1#033[00m
Nov 29 02:28:51 np0005539504 nova_compute[187152]: 2025-11-29 07:28:51.658 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:52 np0005539504 podman[239641]: 2025-11-29 07:28:52.723472237 +0000 UTC m=+0.061577701 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:28:52 np0005539504 podman[239643]: 2025-11-29 07:28:52.728350039 +0000 UTC m=+0.065993930 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:28:52 np0005539504 podman[239642]: 2025-11-29 07:28:52.73030562 +0000 UTC m=+0.068801804 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc.)
Nov 29 02:28:53 np0005539504 nova_compute[187152]: 2025-11-29 07:28:53.685 187156 DEBUG oslo_concurrency.lockutils [None req-09cf0226-787a-494a-8434-ba92377b3839 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "e53352bf-9c18-47a3-887b-ec5808266bd1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 15.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:54 np0005539504 nova_compute[187152]: 2025-11-29 07:28:54.634 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401319.6323993, e53352bf-9c18-47a3-887b-ec5808266bd1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:54 np0005539504 nova_compute[187152]: 2025-11-29 07:28:54.635 187156 INFO nova.compute.manager [-] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:28:55 np0005539504 nova_compute[187152]: 2025-11-29 07:28:55.649 187156 DEBUG nova.compute.manager [None req-25637287-26b4-4a1b-8a3e-e720d6f6aedd - - - - - -] [instance: e53352bf-9c18-47a3-887b-ec5808266bd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:55 np0005539504 nova_compute[187152]: 2025-11-29 07:28:55.809 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:56 np0005539504 nova_compute[187152]: 2025-11-29 07:28:56.660 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:56 np0005539504 podman[239700]: 2025-11-29 07:28:56.76100227 +0000 UTC m=+0.086913029 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:28:56 np0005539504 podman[239701]: 2025-11-29 07:28:56.849494071 +0000 UTC m=+0.165829873 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 02:28:58 np0005539504 nova_compute[187152]: 2025-11-29 07:28:58.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:00 np0005539504 nova_compute[187152]: 2025-11-29 07:29:00.811 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539504 nova_compute[187152]: 2025-11-29 07:29:01.663 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:03 np0005539504 podman[239751]: 2025-11-29 07:29:03.758131644 +0000 UTC m=+0.085951444 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:29:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:05Z|00553|binding|INFO|Releasing lport d36011d9-2f3d-4616-b3ba-40f6405df460 from this chassis (sb_readonly=0)
Nov 29 02:29:05 np0005539504 nova_compute[187152]: 2025-11-29 07:29:05.637 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:05 np0005539504 nova_compute[187152]: 2025-11-29 07:29:05.814 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:05 np0005539504 nova_compute[187152]: 2025-11-29 07:29:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:06 np0005539504 nova_compute[187152]: 2025-11-29 07:29:06.474 187156 INFO nova.compute.manager [None req-5d1ba056-5698-4485-b01b-3865df5afdaa 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Pausing#033[00m
Nov 29 02:29:06 np0005539504 nova_compute[187152]: 2025-11-29 07:29:06.475 187156 DEBUG nova.objects.instance [None req-5d1ba056-5698-4485-b01b-3865df5afdaa 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'flavor' on Instance uuid 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:06 np0005539504 nova_compute[187152]: 2025-11-29 07:29:06.531 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401346.531427, 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:06 np0005539504 nova_compute[187152]: 2025-11-29 07:29:06.532 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:29:06 np0005539504 nova_compute[187152]: 2025-11-29 07:29:06.535 187156 DEBUG nova.compute.manager [None req-5d1ba056-5698-4485-b01b-3865df5afdaa 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:06 np0005539504 nova_compute[187152]: 2025-11-29 07:29:06.563 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:06 np0005539504 nova_compute[187152]: 2025-11-29 07:29:06.569 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:06 np0005539504 nova_compute[187152]: 2025-11-29 07:29:06.606 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Nov 29 02:29:06 np0005539504 nova_compute[187152]: 2025-11-29 07:29:06.665 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:07 np0005539504 nova_compute[187152]: 2025-11-29 07:29:07.423 187156 INFO nova.compute.manager [None req-4975be2c-1b31-4258-ab65-0c8076c134d2 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Unpausing#033[00m
Nov 29 02:29:07 np0005539504 nova_compute[187152]: 2025-11-29 07:29:07.424 187156 DEBUG nova.objects.instance [None req-4975be2c-1b31-4258-ab65-0c8076c134d2 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'flavor' on Instance uuid 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:07 np0005539504 nova_compute[187152]: 2025-11-29 07:29:07.473 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401347.4731345, 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:07 np0005539504 nova_compute[187152]: 2025-11-29 07:29:07.474 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:29:07 np0005539504 virtqemud[186569]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:29:07 np0005539504 nova_compute[187152]: 2025-11-29 07:29:07.480 187156 DEBUG nova.virt.libvirt.guest [None req-4975be2c-1b31-4258-ab65-0c8076c134d2 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:29:07 np0005539504 nova_compute[187152]: 2025-11-29 07:29:07.481 187156 DEBUG nova.compute.manager [None req-4975be2c-1b31-4258-ab65-0c8076c134d2 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:07 np0005539504 nova_compute[187152]: 2025-11-29 07:29:07.492 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:07 np0005539504 nova_compute[187152]: 2025-11-29 07:29:07.497 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:07 np0005539504 nova_compute[187152]: 2025-11-29 07:29:07.555 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Nov 29 02:29:10 np0005539504 nova_compute[187152]: 2025-11-29 07:29:10.816 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:11 np0005539504 nova_compute[187152]: 2025-11-29 07:29:11.668 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:11 np0005539504 podman[239772]: 2025-11-29 07:29:11.756788022 +0000 UTC m=+0.091149383 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:29:11 np0005539504 nova_compute[187152]: 2025-11-29 07:29:11.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:12 np0005539504 nova_compute[187152]: 2025-11-29 07:29:12.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:12 np0005539504 nova_compute[187152]: 2025-11-29 07:29:12.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:29:13 np0005539504 nova_compute[187152]: 2025-11-29 07:29:13.760 187156 DEBUG oslo_concurrency.lockutils [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:13 np0005539504 nova_compute[187152]: 2025-11-29 07:29:13.763 187156 DEBUG oslo_concurrency.lockutils [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:13 np0005539504 nova_compute[187152]: 2025-11-29 07:29:13.764 187156 DEBUG oslo_concurrency.lockutils [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:13 np0005539504 nova_compute[187152]: 2025-11-29 07:29:13.765 187156 DEBUG oslo_concurrency.lockutils [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:13 np0005539504 nova_compute[187152]: 2025-11-29 07:29:13.765 187156 DEBUG oslo_concurrency.lockutils [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:13 np0005539504 nova_compute[187152]: 2025-11-29 07:29:13.778 187156 INFO nova.compute.manager [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Terminating instance#033[00m
Nov 29 02:29:13 np0005539504 nova_compute[187152]: 2025-11-29 07:29:13.790 187156 DEBUG nova.compute.manager [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:29:13 np0005539504 kernel: tap93283ced-75 (unregistering): left promiscuous mode
Nov 29 02:29:13 np0005539504 NetworkManager[55210]: <info>  [1764401353.8194] device (tap93283ced-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:29:13 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:13Z|00554|binding|INFO|Releasing lport 93283ced-759a-40ca-bc21-9afde2a8218f from this chassis (sb_readonly=0)
Nov 29 02:29:13 np0005539504 nova_compute[187152]: 2025-11-29 07:29:13.829 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:13 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:13Z|00555|binding|INFO|Setting lport 93283ced-759a-40ca-bc21-9afde2a8218f down in Southbound
Nov 29 02:29:13 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:13Z|00556|binding|INFO|Removing iface tap93283ced-75 ovn-installed in OVS
Nov 29 02:29:13 np0005539504 nova_compute[187152]: 2025-11-29 07:29:13.849 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:13.857 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:7a:d3 10.100.0.14'], port_security=['fa:16:3e:68:7a:d3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1e3a87bf-6c8e-413e-af10-e61e32ad5d7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-008329a1-d4dc-4cfb-be68-95f658d9813d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b04b69be-f431-4979-89c6-4e231888644a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7e71bac-297c-4031-8579-254c834f5859, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=93283ced-759a-40ca-bc21-9afde2a8218f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:13.860 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 93283ced-759a-40ca-bc21-9afde2a8218f in datapath 008329a1-d4dc-4cfb-be68-95f658d9813d unbound from our chassis#033[00m
Nov 29 02:29:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:13.862 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 008329a1-d4dc-4cfb-be68-95f658d9813d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:29:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:13.865 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[addb1e62-d74e-4e81-9309-f53665cb45b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:13.866 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d namespace which is not needed anymore#033[00m
Nov 29 02:29:13 np0005539504 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000088.scope: Deactivated successfully.
Nov 29 02:29:13 np0005539504 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000088.scope: Consumed 15.368s CPU time.
Nov 29 02:29:13 np0005539504 systemd-machined[153423]: Machine qemu-71-instance-00000088 terminated.
Nov 29 02:29:14 np0005539504 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[239499]: [NOTICE]   (239503) : haproxy version is 2.8.14-c23fe91
Nov 29 02:29:14 np0005539504 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[239499]: [NOTICE]   (239503) : path to executable is /usr/sbin/haproxy
Nov 29 02:29:14 np0005539504 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[239499]: [WARNING]  (239503) : Exiting Master process...
Nov 29 02:29:14 np0005539504 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[239499]: [ALERT]    (239503) : Current worker (239505) exited with code 143 (Terminated)
Nov 29 02:29:14 np0005539504 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[239499]: [WARNING]  (239503) : All workers exited. Exiting... (0)
Nov 29 02:29:14 np0005539504 systemd[1]: libpod-068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1.scope: Deactivated successfully.
Nov 29 02:29:14 np0005539504 podman[239820]: 2025-11-29 07:29:14.040280861 +0000 UTC m=+0.058601371 container died 068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.070 187156 INFO nova.virt.libvirt.driver [-] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Instance destroyed successfully.#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.071 187156 DEBUG nova.objects.instance [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'resources' on Instance uuid 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:14 np0005539504 systemd[1]: var-lib-containers-storage-overlay-9d285e81b70b288fccf791228e02e7cd5bebf335101d655b9a1f6d98e0d25770-merged.mount: Deactivated successfully.
Nov 29 02:29:14 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1-userdata-shm.mount: Deactivated successfully.
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.096 187156 DEBUG nova.virt.libvirt.vif [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:28:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-696958466',display_name='tempest-ServerRescueNegativeTestJSON-server-696958466',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-696958466',id=136,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:28:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5d1e4f74add34e9b9a2084bd9586db0c',ramdisk_id='',reservation_id='r-t927j07o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1892401049',owner_user_name='tempest-ServerRescueNegativeTestJSON-1892401049-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:29:07Z,user_data=None,user_id='4863fb992d4c48de9a92f63ffb1174a8',uuid=1e3a87bf-6c8e-413e-af10-e61e32ad5d7d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.097 187156 DEBUG nova.network.os_vif_util [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converting VIF {"id": "93283ced-759a-40ca-bc21-9afde2a8218f", "address": "fa:16:3e:68:7a:d3", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93283ced-75", "ovs_interfaceid": "93283ced-759a-40ca-bc21-9afde2a8218f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.098 187156 DEBUG nova.network.os_vif_util [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:7a:d3,bridge_name='br-int',has_traffic_filtering=True,id=93283ced-759a-40ca-bc21-9afde2a8218f,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93283ced-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.098 187156 DEBUG os_vif [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:7a:d3,bridge_name='br-int',has_traffic_filtering=True,id=93283ced-759a-40ca-bc21-9afde2a8218f,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93283ced-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.101 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.102 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93283ced-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.112 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.116 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:29:14 np0005539504 podman[239820]: 2025-11-29 07:29:14.120934661 +0000 UTC m=+0.139255161 container cleanup 068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.121 187156 INFO os_vif [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:7a:d3,bridge_name='br-int',has_traffic_filtering=True,id=93283ced-759a-40ca-bc21-9afde2a8218f,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93283ced-75')#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.122 187156 INFO nova.virt.libvirt.driver [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Deleting instance files /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d_del#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.123 187156 INFO nova.virt.libvirt.driver [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Deletion of /var/lib/nova/instances/1e3a87bf-6c8e-413e-af10-e61e32ad5d7d_del complete#033[00m
Nov 29 02:29:14 np0005539504 systemd[1]: libpod-conmon-068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1.scope: Deactivated successfully.
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.186 187156 DEBUG nova.compute.manager [req-921f0d79-a91e-4a03-acf1-9a5d7991092e req-60c82bf5-0550-47da-9570-909cb1ee4c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Received event network-vif-unplugged-93283ced-759a-40ca-bc21-9afde2a8218f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.187 187156 DEBUG oslo_concurrency.lockutils [req-921f0d79-a91e-4a03-acf1-9a5d7991092e req-60c82bf5-0550-47da-9570-909cb1ee4c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.187 187156 DEBUG oslo_concurrency.lockutils [req-921f0d79-a91e-4a03-acf1-9a5d7991092e req-60c82bf5-0550-47da-9570-909cb1ee4c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.187 187156 DEBUG oslo_concurrency.lockutils [req-921f0d79-a91e-4a03-acf1-9a5d7991092e req-60c82bf5-0550-47da-9570-909cb1ee4c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.187 187156 DEBUG nova.compute.manager [req-921f0d79-a91e-4a03-acf1-9a5d7991092e req-60c82bf5-0550-47da-9570-909cb1ee4c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] No waiting events found dispatching network-vif-unplugged-93283ced-759a-40ca-bc21-9afde2a8218f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.188 187156 DEBUG nova.compute.manager [req-921f0d79-a91e-4a03-acf1-9a5d7991092e req-60c82bf5-0550-47da-9570-909cb1ee4c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Received event network-vif-unplugged-93283ced-759a-40ca-bc21-9afde2a8218f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:29:14 np0005539504 podman[239865]: 2025-11-29 07:29:14.224879107 +0000 UTC m=+0.080301453 container remove 068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:29:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:14.232 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[34c57306-6de9-4553-89d9-b17e830a18c3]: (4, ('Sat Nov 29 07:29:13 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d (068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1)\n068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1\nSat Nov 29 07:29:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d (068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1)\n068211df916b3e12a8a0009e43d696975a047d316ebaf9810377c8f25a6a33f1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:14.235 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[68914f30-5a2d-420a-81c4-beb25269336f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:14.236 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap008329a1-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.239 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:14 np0005539504 kernel: tap008329a1-d0: left promiscuous mode
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.251 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.252 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:14.257 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ff66e9df-28cd-490c-ae9c-aead6c6a245f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.273 187156 INFO nova.compute.manager [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.274 187156 DEBUG oslo.service.loopingcall [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.274 187156 DEBUG nova.compute.manager [-] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.274 187156 DEBUG nova.network.neutron [-] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:29:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:14.285 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[be04fa9f-a696-46cb-b4c1-3dc5ffe93008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:14.287 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[021e492c-389b-4f94-9c7f-a6552cd749a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:14.306 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e1386a-9a99-4628-bf74-a78c06bac63d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 677520, 'reachable_time': 18516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239880, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:14.309 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:29:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:14.309 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[f827a9a7-602c-4c60-a7cf-8fe07b1355e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:14 np0005539504 systemd[1]: run-netns-ovnmeta\x2d008329a1\x2dd4dc\x2d4cfb\x2dbe68\x2d95f658d9813d.mount: Deactivated successfully.
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.918 187156 DEBUG nova.network.neutron [-] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:14 np0005539504 nova_compute[187152]: 2025-11-29 07:29:14.963 187156 INFO nova.compute.manager [-] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Took 0.69 seconds to deallocate network for instance.#033[00m
Nov 29 02:29:15 np0005539504 nova_compute[187152]: 2025-11-29 07:29:15.070 187156 DEBUG oslo_concurrency.lockutils [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:15 np0005539504 nova_compute[187152]: 2025-11-29 07:29:15.071 187156 DEBUG oslo_concurrency.lockutils [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:15 np0005539504 nova_compute[187152]: 2025-11-29 07:29:15.166 187156 DEBUG nova.compute.provider_tree [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:15 np0005539504 nova_compute[187152]: 2025-11-29 07:29:15.185 187156 DEBUG nova.scheduler.client.report [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:15 np0005539504 nova_compute[187152]: 2025-11-29 07:29:15.243 187156 DEBUG oslo_concurrency.lockutils [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:15 np0005539504 nova_compute[187152]: 2025-11-29 07:29:15.301 187156 INFO nova.scheduler.client.report [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Deleted allocations for instance 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d#033[00m
Nov 29 02:29:15 np0005539504 nova_compute[187152]: 2025-11-29 07:29:15.455 187156 DEBUG oslo_concurrency.lockutils [None req-de78a2ca-48d1-489d-8720-ce865464543d 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:16 np0005539504 nova_compute[187152]: 2025-11-29 07:29:16.477 187156 DEBUG nova.compute.manager [req-21827fde-b863-47bd-8e82-8bcc5b74e1e0 req-3e39c640-62bd-448f-a711-f5986df31c3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Received event network-vif-plugged-93283ced-759a-40ca-bc21-9afde2a8218f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:16 np0005539504 nova_compute[187152]: 2025-11-29 07:29:16.478 187156 DEBUG oslo_concurrency.lockutils [req-21827fde-b863-47bd-8e82-8bcc5b74e1e0 req-3e39c640-62bd-448f-a711-f5986df31c3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:16 np0005539504 nova_compute[187152]: 2025-11-29 07:29:16.479 187156 DEBUG oslo_concurrency.lockutils [req-21827fde-b863-47bd-8e82-8bcc5b74e1e0 req-3e39c640-62bd-448f-a711-f5986df31c3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:16 np0005539504 nova_compute[187152]: 2025-11-29 07:29:16.479 187156 DEBUG oslo_concurrency.lockutils [req-21827fde-b863-47bd-8e82-8bcc5b74e1e0 req-3e39c640-62bd-448f-a711-f5986df31c3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1e3a87bf-6c8e-413e-af10-e61e32ad5d7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:16 np0005539504 nova_compute[187152]: 2025-11-29 07:29:16.479 187156 DEBUG nova.compute.manager [req-21827fde-b863-47bd-8e82-8bcc5b74e1e0 req-3e39c640-62bd-448f-a711-f5986df31c3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] No waiting events found dispatching network-vif-plugged-93283ced-759a-40ca-bc21-9afde2a8218f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:16 np0005539504 nova_compute[187152]: 2025-11-29 07:29:16.479 187156 WARNING nova.compute.manager [req-21827fde-b863-47bd-8e82-8bcc5b74e1e0 req-3e39c640-62bd-448f-a711-f5986df31c3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Received unexpected event network-vif-plugged-93283ced-759a-40ca-bc21-9afde2a8218f for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:29:16 np0005539504 nova_compute[187152]: 2025-11-29 07:29:16.480 187156 DEBUG nova.compute.manager [req-21827fde-b863-47bd-8e82-8bcc5b74e1e0 req-3e39c640-62bd-448f-a711-f5986df31c3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Received event network-vif-deleted-93283ced-759a-40ca-bc21-9afde2a8218f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:16 np0005539504 nova_compute[187152]: 2025-11-29 07:29:16.670 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:16 np0005539504 nova_compute[187152]: 2025-11-29 07:29:16.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:18 np0005539504 nova_compute[187152]: 2025-11-29 07:29:18.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:18 np0005539504 nova_compute[187152]: 2025-11-29 07:29:18.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:18 np0005539504 nova_compute[187152]: 2025-11-29 07:29:18.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:18 np0005539504 nova_compute[187152]: 2025-11-29 07:29:18.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:18 np0005539504 nova_compute[187152]: 2025-11-29 07:29:18.962 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.149 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.191 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.192 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5726MB free_disk=73.07525253295898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.193 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.193 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.375 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.376 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.405 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.434 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.482 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.483 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:19 np0005539504 nova_compute[187152]: 2025-11-29 07:29:19.957 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:21 np0005539504 nova_compute[187152]: 2025-11-29 07:29:21.485 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:21 np0005539504 nova_compute[187152]: 2025-11-29 07:29:21.485 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:29:21 np0005539504 nova_compute[187152]: 2025-11-29 07:29:21.485 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:29:21 np0005539504 nova_compute[187152]: 2025-11-29 07:29:21.749 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:21 np0005539504 nova_compute[187152]: 2025-11-29 07:29:21.915 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:29:21 np0005539504 nova_compute[187152]: 2025-11-29 07:29:21.916 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:22 np0005539504 nova_compute[187152]: 2025-11-29 07:29:22.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:22.973 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:22.974 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:22.974 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:23 np0005539504 podman[239883]: 2025-11-29 07:29:23.731167906 +0000 UTC m=+0.053706760 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Nov 29 02:29:23 np0005539504 podman[239882]: 2025-11-29 07:29:23.733280503 +0000 UTC m=+0.055389135 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:29:23 np0005539504 podman[239884]: 2025-11-29 07:29:23.735662246 +0000 UTC m=+0.054112860 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:29:24 np0005539504 nova_compute[187152]: 2025-11-29 07:29:24.165 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:26 np0005539504 nova_compute[187152]: 2025-11-29 07:29:26.752 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.323 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.323 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.347 187156 DEBUG nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.506 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.506 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.516 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.516 187156 INFO nova.compute.claims [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.653 187156 DEBUG nova.compute.provider_tree [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.669 187156 DEBUG nova.scheduler.client.report [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.694 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.714 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "d04802e6-85a4-4282-8dc8-c13647dbba39" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.715 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "d04802e6-85a4-4282-8dc8-c13647dbba39" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:27 np0005539504 podman[239946]: 2025-11-29 07:29:27.72353913 +0000 UTC m=+0.067530051 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.727 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "d04802e6-85a4-4282-8dc8-c13647dbba39" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.728 187156 DEBUG nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:29:27 np0005539504 podman[239947]: 2025-11-29 07:29:27.76126009 +0000 UTC m=+0.099954279 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.784 187156 DEBUG nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.784 187156 DEBUG nova.network.neutron [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.803 187156 INFO nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.825 187156 DEBUG nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.947 187156 DEBUG nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.948 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.949 187156 INFO nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Creating image(s)#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.950 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "/var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.950 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "/var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.951 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "/var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:27 np0005539504 nova_compute[187152]: 2025-11-29 07:29:27.972 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.036 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.037 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.038 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.051 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.083 187156 DEBUG nova.policy [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6976495921c14f5f82332b6fd3ef8fcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8493b13e5d61430ea59091fbc6b5814f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.118 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.119 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.160 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.161 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.162 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.219 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.220 187156 DEBUG nova.virt.disk.api [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Checking if we can resize image /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.221 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.280 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.282 187156 DEBUG nova.virt.disk.api [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Cannot resize image /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.282 187156 DEBUG nova.objects.instance [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lazy-loading 'migration_context' on Instance uuid 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.301 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.302 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Ensure instance console log exists: /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.302 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.303 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:28 np0005539504 nova_compute[187152]: 2025-11-29 07:29:28.303 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:29 np0005539504 nova_compute[187152]: 2025-11-29 07:29:29.070 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401354.068435, 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:29 np0005539504 nova_compute[187152]: 2025-11-29 07:29:29.071 187156 INFO nova.compute.manager [-] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:29:29 np0005539504 nova_compute[187152]: 2025-11-29 07:29:29.168 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539504 nova_compute[187152]: 2025-11-29 07:29:29.711 187156 DEBUG nova.compute.manager [None req-28da80dc-d1b8-444d-a83e-c6cbe4cf331d - - - - - -] [instance: 1e3a87bf-6c8e-413e-af10-e61e32ad5d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:30 np0005539504 nova_compute[187152]: 2025-11-29 07:29:30.011 187156 DEBUG nova.network.neutron [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Successfully created port: 1781f543-53ed-4a89-9d69-6817988a5a84 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:29:31 np0005539504 nova_compute[187152]: 2025-11-29 07:29:31.754 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:32 np0005539504 nova_compute[187152]: 2025-11-29 07:29:32.819 187156 DEBUG nova.network.neutron [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Successfully updated port: 1781f543-53ed-4a89-9d69-6817988a5a84 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:29:32 np0005539504 nova_compute[187152]: 2025-11-29 07:29:32.849 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "refresh_cache-3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:32 np0005539504 nova_compute[187152]: 2025-11-29 07:29:32.850 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquired lock "refresh_cache-3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:32 np0005539504 nova_compute[187152]: 2025-11-29 07:29:32.850 187156 DEBUG nova.network.neutron [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:29:33 np0005539504 nova_compute[187152]: 2025-11-29 07:29:33.075 187156 DEBUG nova.network.neutron [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.172 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.197 187156 DEBUG nova.network.neutron [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Updating instance_info_cache with network_info: [{"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.218 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Releasing lock "refresh_cache-3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.219 187156 DEBUG nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Instance network_info: |[{"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.223 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Start _get_guest_xml network_info=[{"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.231 187156 WARNING nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.237 187156 DEBUG nova.virt.libvirt.host [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.238 187156 DEBUG nova.virt.libvirt.host [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.242 187156 DEBUG nova.virt.libvirt.host [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.243 187156 DEBUG nova.virt.libvirt.host [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.245 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.245 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.246 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.246 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.247 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.247 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.247 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.247 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.248 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.248 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.248 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.248 187156 DEBUG nova.virt.hardware [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.255 187156 DEBUG nova.virt.libvirt.vif [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:29:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1098463195',display_name='tempest-ServerGroupTestJSON-server-1098463195',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1098463195',id=139,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8493b13e5d61430ea59091fbc6b5814f',ramdisk_id='',reservation_id='r-94w6esky',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-882517753',owner_user_name='tempest-ServerGroupTestJSON-8825177
53-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:29:27Z,user_data=None,user_id='6976495921c14f5f82332b6fd3ef8fcc',uuid=3bcd5107-a6bc-41b5-bea2-c3abdc5507cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.256 187156 DEBUG nova.network.os_vif_util [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Converting VIF {"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.257 187156 DEBUG nova.network.os_vif_util [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:60:5b,bridge_name='br-int',has_traffic_filtering=True,id=1781f543-53ed-4a89-9d69-6817988a5a84,network=Network(d7ae8d35-3748-44bc-8140-523e801435dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1781f543-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.258 187156 DEBUG nova.objects.instance [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lazy-loading 'pci_devices' on Instance uuid 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.273 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <uuid>3bcd5107-a6bc-41b5-bea2-c3abdc5507cb</uuid>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <name>instance-0000008b</name>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerGroupTestJSON-server-1098463195</nova:name>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:29:34</nova:creationTime>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:        <nova:user uuid="6976495921c14f5f82332b6fd3ef8fcc">tempest-ServerGroupTestJSON-882517753-project-member</nova:user>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:        <nova:project uuid="8493b13e5d61430ea59091fbc6b5814f">tempest-ServerGroupTestJSON-882517753</nova:project>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:        <nova:port uuid="1781f543-53ed-4a89-9d69-6817988a5a84">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <entry name="serial">3bcd5107-a6bc-41b5-bea2-c3abdc5507cb</entry>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <entry name="uuid">3bcd5107-a6bc-41b5-bea2-c3abdc5507cb</entry>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk.config"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:da:60:5b"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <target dev="tap1781f543-53"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/console.log" append="off"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:29:34 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:29:34 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:29:34 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:29:34 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.274 187156 DEBUG nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Preparing to wait for external event network-vif-plugged-1781f543-53ed-4a89-9d69-6817988a5a84 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.274 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.275 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.275 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.276 187156 DEBUG nova.virt.libvirt.vif [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:29:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1098463195',display_name='tempest-ServerGroupTestJSON-server-1098463195',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1098463195',id=139,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8493b13e5d61430ea59091fbc6b5814f',ramdisk_id='',reservation_id='r-94w6esky',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-882517753',owner_user_name='tempest-ServerGroupTestJSON-882517753-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:29:27Z,user_data=None,user_id='6976495921c14f5f82332b6fd3ef8fcc',uuid=3bcd5107-a6bc-41b5-bea2-c3abdc5507cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.276 187156 DEBUG nova.network.os_vif_util [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Converting VIF {"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.277 187156 DEBUG nova.network.os_vif_util [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:60:5b,bridge_name='br-int',has_traffic_filtering=True,id=1781f543-53ed-4a89-9d69-6817988a5a84,network=Network(d7ae8d35-3748-44bc-8140-523e801435dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1781f543-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.277 187156 DEBUG os_vif [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:60:5b,bridge_name='br-int',has_traffic_filtering=True,id=1781f543-53ed-4a89-9d69-6817988a5a84,network=Network(d7ae8d35-3748-44bc-8140-523e801435dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1781f543-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.277 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.278 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.278 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.283 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.283 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1781f543-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.284 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1781f543-53, col_values=(('external_ids', {'iface-id': '1781f543-53ed-4a89-9d69-6817988a5a84', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:60:5b', 'vm-uuid': '3bcd5107-a6bc-41b5-bea2-c3abdc5507cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:34 np0005539504 NetworkManager[55210]: <info>  [1764401374.2880] manager: (tap1781f543-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.288 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.293 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.294 187156 INFO os_vif [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:60:5b,bridge_name='br-int',has_traffic_filtering=True,id=1781f543-53ed-4a89-9d69-6817988a5a84,network=Network(d7ae8d35-3748-44bc-8140-523e801435dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1781f543-53')#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.349 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.349 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.349 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] No VIF found with MAC fa:16:3e:da:60:5b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.350 187156 INFO nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Using config drive#033[00m
Nov 29 02:29:34 np0005539504 podman[240012]: 2025-11-29 07:29:34.738643467 +0000 UTC m=+0.078096964 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.759 187156 DEBUG nova.compute.manager [req-8174442b-4c7c-417d-9ab6-7835886045c6 req-b4a8b8b6-9c52-404d-bcb2-0c37c1ff2faa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Received event network-changed-1781f543-53ed-4a89-9d69-6817988a5a84 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.760 187156 DEBUG nova.compute.manager [req-8174442b-4c7c-417d-9ab6-7835886045c6 req-b4a8b8b6-9c52-404d-bcb2-0c37c1ff2faa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Refreshing instance network info cache due to event network-changed-1781f543-53ed-4a89-9d69-6817988a5a84. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.760 187156 DEBUG oslo_concurrency.lockutils [req-8174442b-4c7c-417d-9ab6-7835886045c6 req-b4a8b8b6-9c52-404d-bcb2-0c37c1ff2faa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.761 187156 DEBUG oslo_concurrency.lockutils [req-8174442b-4c7c-417d-9ab6-7835886045c6 req-b4a8b8b6-9c52-404d-bcb2-0c37c1ff2faa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.761 187156 DEBUG nova.network.neutron [req-8174442b-4c7c-417d-9ab6-7835886045c6 req-b4a8b8b6-9c52-404d-bcb2-0c37c1ff2faa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Refreshing network info cache for port 1781f543-53ed-4a89-9d69-6817988a5a84 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.891 187156 INFO nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Creating config drive at /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk.config#033[00m
Nov 29 02:29:34 np0005539504 nova_compute[187152]: 2025-11-29 07:29:34.896 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr7r4bmq1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.023 187156 DEBUG oslo_concurrency.processutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr7r4bmq1" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:35 np0005539504 kernel: tap1781f543-53: entered promiscuous mode
Nov 29 02:29:35 np0005539504 NetworkManager[55210]: <info>  [1764401375.0818] manager: (tap1781f543-53): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.082 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:35Z|00557|binding|INFO|Claiming lport 1781f543-53ed-4a89-9d69-6817988a5a84 for this chassis.
Nov 29 02:29:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:35Z|00558|binding|INFO|1781f543-53ed-4a89-9d69-6817988a5a84: Claiming fa:16:3e:da:60:5b 10.100.0.4
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.101 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:60:5b 10.100.0.4'], port_security=['fa:16:3e:da:60:5b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3bcd5107-a6bc-41b5-bea2-c3abdc5507cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ae8d35-3748-44bc-8140-523e801435dc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8493b13e5d61430ea59091fbc6b5814f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '55ebce39-d278-4997-a90b-965ca656b9fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8ee8c0-a252-44c3-b883-bc22d860b169, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=1781f543-53ed-4a89-9d69-6817988a5a84) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.102 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 1781f543-53ed-4a89-9d69-6817988a5a84 in datapath d7ae8d35-3748-44bc-8140-523e801435dc bound to our chassis#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.104 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ae8d35-3748-44bc-8140-523e801435dc#033[00m
Nov 29 02:29:35 np0005539504 systemd-udevd[240050]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:35 np0005539504 systemd-machined[153423]: New machine qemu-72-instance-0000008b.
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.116 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f07f8f30-87d0-4e82-8581-c4993556c4e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.117 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7ae8d35-31 in ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.119 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7ae8d35-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.119 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d67532ef-a6fb-42fb-846c-cb022e19fbb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.120 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d74c2ea5-7e1a-4b23-a696-fc1effbd6479]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 NetworkManager[55210]: <info>  [1764401375.1264] device (tap1781f543-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:29:35 np0005539504 NetworkManager[55210]: <info>  [1764401375.1274] device (tap1781f543-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.132 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9173ae-1ea8-4c2a-9bac-d2fde0785af9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:35Z|00559|binding|INFO|Setting lport 1781f543-53ed-4a89-9d69-6817988a5a84 ovn-installed in OVS
Nov 29 02:29:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:35Z|00560|binding|INFO|Setting lport 1781f543-53ed-4a89-9d69-6817988a5a84 up in Southbound
Nov 29 02:29:35 np0005539504 systemd[1]: Started Virtual Machine qemu-72-instance-0000008b.
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.206 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.219 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd3b28b-adc7-4fa1-8f04-e68849c6d240]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.254 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5487f650-d2fc-4e51-b9f1-896dc89ff505]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.264 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[54a5f517-8e0b-4eed-8e64-5e7d0b91e9ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 NetworkManager[55210]: <info>  [1764401375.2660] manager: (tapd7ae8d35-30): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Nov 29 02:29:35 np0005539504 systemd-udevd[240053]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.301 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[b5c65338-e013-49b3-8624-90de0613951c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.305 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ab38c1ce-3663-4010-93c6-394c79d01b48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 NetworkManager[55210]: <info>  [1764401375.3297] device (tapd7ae8d35-30): carrier: link connected
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.337 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4add4d52-e35a-4964-acb0-c2c7b7aa7bd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.359 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5b06fbb0-7618-4c57-980d-8ce9c39fcd59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ae8d35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:f6:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 683820, 'reachable_time': 33398, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240082, 'error': None, 'target': 'ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.381 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[040ad831-a388-4732-976c-a61f3d68eb4c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:f623'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 683820, 'tstamp': 683820}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240083, 'error': None, 'target': 'ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.403 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[664e500e-c0a8-4cad-a8c8-1066fda45c3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ae8d35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:f6:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 683820, 'reachable_time': 33398, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240084, 'error': None, 'target': 'ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.439 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a25ac785-0659-4da7-9850-d1b61108f66e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.511 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[aec5764d-dd29-4941-8d97-6d10d78ec9d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.513 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ae8d35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.513 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.514 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ae8d35-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:35 np0005539504 kernel: tapd7ae8d35-30: entered promiscuous mode
Nov 29 02:29:35 np0005539504 NetworkManager[55210]: <info>  [1764401375.5166] manager: (tapd7ae8d35-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.516 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.519 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.521 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ae8d35-30, col_values=(('external_ids', {'iface-id': '89567229-7de0-427f-8e87-de439621a70c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.522 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:35 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:35Z|00561|binding|INFO|Releasing lport 89567229-7de0-427f-8e87-de439621a70c from this chassis (sb_readonly=0)
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.550 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.552 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7ae8d35-3748-44bc-8140-523e801435dc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7ae8d35-3748-44bc-8140-523e801435dc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.553 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2872082f-b2db-421f-9b98-6bdcc1aa9508]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.555 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-d7ae8d35-3748-44bc-8140-523e801435dc
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/d7ae8d35-3748-44bc-8140-523e801435dc.pid.haproxy
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID d7ae8d35-3748-44bc-8140-523e801435dc
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:29:35 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:35.556 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc', 'env', 'PROCESS_TAG=haproxy-d7ae8d35-3748-44bc-8140-523e801435dc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7ae8d35-3748-44bc-8140-523e801435dc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.769 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401375.7687478, 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.770 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] VM Started (Lifecycle Event)#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.791 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.797 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401375.7702084, 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.798 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.816 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.825 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:35 np0005539504 nova_compute[187152]: 2025-11-29 07:29:35.862 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:29:35 np0005539504 podman[240123]: 2025-11-29 07:29:35.967560552 +0000 UTC m=+0.063651017 container create 2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:29:36 np0005539504 systemd[1]: Started libpod-conmon-2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1.scope.
Nov 29 02:29:36 np0005539504 podman[240123]: 2025-11-29 07:29:35.932416431 +0000 UTC m=+0.028506926 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:29:36 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:29:36 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c1f5e3fcd34744b75588674f5683fb06f85cd563f7448cb1527a1e2c9c6610b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:29:36 np0005539504 podman[240123]: 2025-11-29 07:29:36.070656724 +0000 UTC m=+0.166747219 container init 2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:29:36 np0005539504 podman[240123]: 2025-11-29 07:29:36.081626518 +0000 UTC m=+0.177716983 container start 2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:29:36 np0005539504 neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc[240138]: [NOTICE]   (240142) : New worker (240144) forked
Nov 29 02:29:36 np0005539504 neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc[240138]: [NOTICE]   (240142) : Loading success.
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.629 187156 DEBUG nova.network.neutron [req-8174442b-4c7c-417d-9ab6-7835886045c6 req-b4a8b8b6-9c52-404d-bcb2-0c37c1ff2faa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Updated VIF entry in instance network info cache for port 1781f543-53ed-4a89-9d69-6817988a5a84. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.630 187156 DEBUG nova.network.neutron [req-8174442b-4c7c-417d-9ab6-7835886045c6 req-b4a8b8b6-9c52-404d-bcb2-0c37c1ff2faa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Updating instance_info_cache with network_info: [{"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.648 187156 DEBUG oslo_concurrency.lockutils [req-8174442b-4c7c-417d-9ab6-7835886045c6 req-b4a8b8b6-9c52-404d-bcb2-0c37c1ff2faa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.757 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.868 187156 DEBUG nova.compute.manager [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Received event network-vif-plugged-1781f543-53ed-4a89-9d69-6817988a5a84 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.869 187156 DEBUG oslo_concurrency.lockutils [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.869 187156 DEBUG oslo_concurrency.lockutils [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.869 187156 DEBUG oslo_concurrency.lockutils [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.870 187156 DEBUG nova.compute.manager [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Processing event network-vif-plugged-1781f543-53ed-4a89-9d69-6817988a5a84 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.870 187156 DEBUG nova.compute.manager [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Received event network-vif-plugged-1781f543-53ed-4a89-9d69-6817988a5a84 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.870 187156 DEBUG oslo_concurrency.lockutils [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.870 187156 DEBUG oslo_concurrency.lockutils [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.871 187156 DEBUG oslo_concurrency.lockutils [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.871 187156 DEBUG nova.compute.manager [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] No waiting events found dispatching network-vif-plugged-1781f543-53ed-4a89-9d69-6817988a5a84 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.871 187156 WARNING nova.compute.manager [req-77a66f56-0c9b-4848-9734-4f2796450e81 req-f8844fdc-f26b-4b5f-b562-a9feca92eee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Received unexpected event network-vif-plugged-1781f543-53ed-4a89-9d69-6817988a5a84 for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.872 187156 DEBUG nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.876 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401376.875575, 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.877 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.880 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.888 187156 INFO nova.virt.libvirt.driver [-] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Instance spawned successfully.#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.890 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.907 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.913 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.932 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.933 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.934 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.935 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.935 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.936 187156 DEBUG nova.virt.libvirt.driver [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:36 np0005539504 nova_compute[187152]: 2025-11-29 07:29:36.941 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:29:37 np0005539504 nova_compute[187152]: 2025-11-29 07:29:37.012 187156 INFO nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Took 9.06 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:29:37 np0005539504 nova_compute[187152]: 2025-11-29 07:29:37.013 187156 DEBUG nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:37 np0005539504 nova_compute[187152]: 2025-11-29 07:29:37.117 187156 INFO nova.compute.manager [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Took 9.68 seconds to build instance.#033[00m
Nov 29 02:29:37 np0005539504 nova_compute[187152]: 2025-11-29 07:29:37.134 187156 DEBUG oslo_concurrency.lockutils [None req-27fcb952-c178-4cc9-9363-6d95b74fe98a 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:39 np0005539504 nova_compute[187152]: 2025-11-29 07:29:39.288 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.167 187156 DEBUG oslo_concurrency.lockutils [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.168 187156 DEBUG oslo_concurrency.lockutils [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.168 187156 DEBUG oslo_concurrency.lockutils [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.168 187156 DEBUG oslo_concurrency.lockutils [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.169 187156 DEBUG oslo_concurrency.lockutils [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.180 187156 INFO nova.compute.manager [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Terminating instance#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.191 187156 DEBUG nova.compute.manager [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:29:41 np0005539504 kernel: tap1781f543-53 (unregistering): left promiscuous mode
Nov 29 02:29:41 np0005539504 NetworkManager[55210]: <info>  [1764401381.2156] device (tap1781f543-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.222 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:41Z|00562|binding|INFO|Releasing lport 1781f543-53ed-4a89-9d69-6817988a5a84 from this chassis (sb_readonly=0)
Nov 29 02:29:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:41Z|00563|binding|INFO|Setting lport 1781f543-53ed-4a89-9d69-6817988a5a84 down in Southbound
Nov 29 02:29:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:41Z|00564|binding|INFO|Removing iface tap1781f543-53 ovn-installed in OVS
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.231 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:60:5b 10.100.0.4'], port_security=['fa:16:3e:da:60:5b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3bcd5107-a6bc-41b5-bea2-c3abdc5507cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ae8d35-3748-44bc-8140-523e801435dc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8493b13e5d61430ea59091fbc6b5814f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '55ebce39-d278-4997-a90b-965ca656b9fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8ee8c0-a252-44c3-b883-bc22d860b169, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=1781f543-53ed-4a89-9d69-6817988a5a84) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.234 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 1781f543-53ed-4a89-9d69-6817988a5a84 in datapath d7ae8d35-3748-44bc-8140-523e801435dc unbound from our chassis#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.236 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7ae8d35-3748-44bc-8140-523e801435dc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.237 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c75d41d0-f68e-45b8-8363-0f1d247c8f39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.238 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc namespace which is not needed anymore#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.244 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Nov 29 02:29:41 np0005539504 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000008b.scope: Consumed 4.985s CPU time.
Nov 29 02:29:41 np0005539504 systemd-machined[153423]: Machine qemu-72-instance-0000008b terminated.
Nov 29 02:29:41 np0005539504 neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc[240138]: [NOTICE]   (240142) : haproxy version is 2.8.14-c23fe91
Nov 29 02:29:41 np0005539504 neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc[240138]: [NOTICE]   (240142) : path to executable is /usr/sbin/haproxy
Nov 29 02:29:41 np0005539504 neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc[240138]: [WARNING]  (240142) : Exiting Master process...
Nov 29 02:29:41 np0005539504 neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc[240138]: [ALERT]    (240142) : Current worker (240144) exited with code 143 (Terminated)
Nov 29 02:29:41 np0005539504 neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc[240138]: [WARNING]  (240142) : All workers exited. Exiting... (0)
Nov 29 02:29:41 np0005539504 systemd[1]: libpod-2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1.scope: Deactivated successfully.
Nov 29 02:29:41 np0005539504 podman[240180]: 2025-11-29 07:29:41.379475237 +0000 UTC m=+0.048152932 container died 2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:29:41 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1-userdata-shm.mount: Deactivated successfully.
Nov 29 02:29:41 np0005539504 systemd[1]: var-lib-containers-storage-overlay-6c1f5e3fcd34744b75588674f5683fb06f85cd563f7448cb1527a1e2c9c6610b-merged.mount: Deactivated successfully.
Nov 29 02:29:41 np0005539504 podman[240180]: 2025-11-29 07:29:41.417817294 +0000 UTC m=+0.086495009 container cleanup 2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.427 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 systemd[1]: libpod-conmon-2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1.scope: Deactivated successfully.
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.433 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.476 187156 INFO nova.virt.libvirt.driver [-] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Instance destroyed successfully.#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.479 187156 DEBUG nova.objects.instance [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lazy-loading 'resources' on Instance uuid 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.487 187156 DEBUG nova.compute.manager [req-f2390b7a-b2d8-43ad-b9cf-2f26ffa774eb req-b1cdf552-708b-4352-8120-27e54b2913e7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Received event network-vif-unplugged-1781f543-53ed-4a89-9d69-6817988a5a84 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.488 187156 DEBUG oslo_concurrency.lockutils [req-f2390b7a-b2d8-43ad-b9cf-2f26ffa774eb req-b1cdf552-708b-4352-8120-27e54b2913e7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.488 187156 DEBUG oslo_concurrency.lockutils [req-f2390b7a-b2d8-43ad-b9cf-2f26ffa774eb req-b1cdf552-708b-4352-8120-27e54b2913e7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.488 187156 DEBUG oslo_concurrency.lockutils [req-f2390b7a-b2d8-43ad-b9cf-2f26ffa774eb req-b1cdf552-708b-4352-8120-27e54b2913e7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.488 187156 DEBUG nova.compute.manager [req-f2390b7a-b2d8-43ad-b9cf-2f26ffa774eb req-b1cdf552-708b-4352-8120-27e54b2913e7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] No waiting events found dispatching network-vif-unplugged-1781f543-53ed-4a89-9d69-6817988a5a84 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.489 187156 DEBUG nova.compute.manager [req-f2390b7a-b2d8-43ad-b9cf-2f26ffa774eb req-b1cdf552-708b-4352-8120-27e54b2913e7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Received event network-vif-unplugged-1781f543-53ed-4a89-9d69-6817988a5a84 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:29:41 np0005539504 podman[240216]: 2025-11-29 07:29:41.505700518 +0000 UTC m=+0.054902771 container remove 2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.507 187156 DEBUG nova.virt.libvirt.vif [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:29:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-1098463195',display_name='tempest-ServerGroupTestJSON-server-1098463195',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-1098463195',id=139,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:29:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8493b13e5d61430ea59091fbc6b5814f',ramdisk_id='',reservation_id='r-94w6esky',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-882517753',owner_user_name='tempest-ServerGroupTestJSON-882517753-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:29:37Z,user_data=None,user_id='6976495921c14f5f82332b6fd3ef8fcc',uuid=3bcd5107-a6bc-41b5-bea2-c3abdc5507cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.508 187156 DEBUG nova.network.os_vif_util [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Converting VIF {"id": "1781f543-53ed-4a89-9d69-6817988a5a84", "address": "fa:16:3e:da:60:5b", "network": {"id": "d7ae8d35-3748-44bc-8140-523e801435dc", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1412053296-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8493b13e5d61430ea59091fbc6b5814f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1781f543-53", "ovs_interfaceid": "1781f543-53ed-4a89-9d69-6817988a5a84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.509 187156 DEBUG nova.network.os_vif_util [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:60:5b,bridge_name='br-int',has_traffic_filtering=True,id=1781f543-53ed-4a89-9d69-6817988a5a84,network=Network(d7ae8d35-3748-44bc-8140-523e801435dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1781f543-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.509 187156 DEBUG os_vif [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:60:5b,bridge_name='br-int',has_traffic_filtering=True,id=1781f543-53ed-4a89-9d69-6817988a5a84,network=Network(d7ae8d35-3748-44bc-8140-523e801435dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1781f543-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.512 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.512 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1781f543-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.512 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7d88dcf0-12ee-47a3-9787-490b58fcf543]: (4, ('Sat Nov 29 07:29:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc (2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1)\n2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1\nSat Nov 29 07:29:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc (2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1)\n2c7a96993e49affc78764eea6e73804d23979b902127cc25bbad423b799740e1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.515 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.516 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6ddc7389-af85-459c-ab42-985115e5f101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.518 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ae8d35-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.519 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 kernel: tapd7ae8d35-30: left promiscuous mode
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.524 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d83a56f5-55a7-4fbc-81f2-506ec7e49ad6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.527 187156 INFO os_vif [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:60:5b,bridge_name='br-int',has_traffic_filtering=True,id=1781f543-53ed-4a89-9d69-6817988a5a84,network=Network(d7ae8d35-3748-44bc-8140-523e801435dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1781f543-53')#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.528 187156 INFO nova.virt.libvirt.driver [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Deleting instance files /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb_del#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.530 187156 INFO nova.virt.libvirt.driver [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Deletion of /var/lib/nova/instances/3bcd5107-a6bc-41b5-bea2-c3abdc5507cb_del complete#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.536 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.548 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e473a13e-3c28-41bd-872e-cdb1b94f7f97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.549 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[001a77e6-3728-4a35-ac71-cbef5b5173ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.564 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e0f917-caaf-453a-81cc-a3b12cf0597b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 683812, 'reachable_time': 21479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240242, 'error': None, 'target': 'ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:41 np0005539504 systemd[1]: run-netns-ovnmeta\x2dd7ae8d35\x2d3748\x2d44bc\x2d8140\x2d523e801435dc.mount: Deactivated successfully.
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.569 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7ae8d35-3748-44bc-8140-523e801435dc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:29:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:41.570 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[0497eae5-6d70-48e1-bb8a-fc678cf90e43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.634 187156 INFO nova.compute.manager [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.635 187156 DEBUG oslo.service.loopingcall [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.636 187156 DEBUG nova.compute.manager [-] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.636 187156 DEBUG nova.network.neutron [-] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.760 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.970 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "3835b666-929d-40c4-a556-3249ddef8b41" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.970 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:41 np0005539504 nova_compute[187152]: 2025-11-29 07:29:41.993 187156 DEBUG nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.111 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.112 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.121 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.122 187156 INFO nova.compute.claims [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.298 187156 DEBUG nova.compute.provider_tree [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.316 187156 DEBUG nova.scheduler.client.report [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.338 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.339 187156 DEBUG nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.461 187156 DEBUG nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.461 187156 DEBUG nova.network.neutron [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.479 187156 INFO nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.498 187156 DEBUG nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:29:42 np0005539504 podman[240243]: 2025-11-29 07:29:42.61726779 +0000 UTC m=+0.101792459 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.643 187156 DEBUG nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.646 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.647 187156 INFO nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Creating image(s)#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.648 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.648 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.649 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.663 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.736 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.738 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.739 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.751 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.816 187156 DEBUG nova.network.neutron [-] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.820 187156 DEBUG nova.policy [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.846 187156 INFO nova.compute.manager [-] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Took 1.21 seconds to deallocate network for instance.#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.851 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.852 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.901 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.903 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.903 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.932 187156 DEBUG nova.compute.manager [req-563481bf-29d7-456f-b3b9-e2fc06998343 req-61c1ade4-52a9-4843-ab8e-d7549907fe03 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Received event network-vif-deleted-1781f543-53ed-4a89-9d69-6817988a5a84 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.978 187156 DEBUG oslo_concurrency.lockutils [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.979 187156 DEBUG oslo_concurrency.lockutils [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.983 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.984 187156 DEBUG nova.virt.disk.api [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:29:42 np0005539504 nova_compute[187152]: 2025-11-29 07:29:42.984 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.044 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.045 187156 DEBUG nova.virt.disk.api [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.045 187156 DEBUG nova.objects.instance [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 3835b666-929d-40c4-a556-3249ddef8b41 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.065 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.065 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Ensure instance console log exists: /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.066 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.066 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.067 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.085 187156 DEBUG nova.compute.provider_tree [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.107 187156 DEBUG nova.scheduler.client.report [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.141 187156 DEBUG oslo_concurrency.lockutils [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.212 187156 INFO nova.scheduler.client.report [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Deleted allocations for instance 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.307 187156 DEBUG oslo_concurrency.lockutils [None req-4a9621d4-e358-46d0-b1c9-870936be0d65 6976495921c14f5f82332b6fd3ef8fcc 8493b13e5d61430ea59091fbc6b5814f - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.575 187156 DEBUG nova.compute.manager [req-7e422e7d-ef6d-475b-855f-bedfa80f9d38 req-cc4987f6-edce-4b64-a932-85c83c2433c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Received event network-vif-plugged-1781f543-53ed-4a89-9d69-6817988a5a84 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.576 187156 DEBUG oslo_concurrency.lockutils [req-7e422e7d-ef6d-475b-855f-bedfa80f9d38 req-cc4987f6-edce-4b64-a932-85c83c2433c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.576 187156 DEBUG oslo_concurrency.lockutils [req-7e422e7d-ef6d-475b-855f-bedfa80f9d38 req-cc4987f6-edce-4b64-a932-85c83c2433c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.577 187156 DEBUG oslo_concurrency.lockutils [req-7e422e7d-ef6d-475b-855f-bedfa80f9d38 req-cc4987f6-edce-4b64-a932-85c83c2433c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3bcd5107-a6bc-41b5-bea2-c3abdc5507cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.577 187156 DEBUG nova.compute.manager [req-7e422e7d-ef6d-475b-855f-bedfa80f9d38 req-cc4987f6-edce-4b64-a932-85c83c2433c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] No waiting events found dispatching network-vif-plugged-1781f543-53ed-4a89-9d69-6817988a5a84 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.577 187156 WARNING nova.compute.manager [req-7e422e7d-ef6d-475b-855f-bedfa80f9d38 req-cc4987f6-edce-4b64-a932-85c83c2433c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Received unexpected event network-vif-plugged-1781f543-53ed-4a89-9d69-6817988a5a84 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:29:43 np0005539504 nova_compute[187152]: 2025-11-29 07:29:43.643 187156 DEBUG nova.network.neutron [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Successfully created port: b95ddcc9-0165-4e0c-aa88-981010149da0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:29:45 np0005539504 nova_compute[187152]: 2025-11-29 07:29:45.080 187156 DEBUG nova.network.neutron [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Successfully updated port: b95ddcc9-0165-4e0c-aa88-981010149da0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:29:45 np0005539504 nova_compute[187152]: 2025-11-29 07:29:45.103 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:45 np0005539504 nova_compute[187152]: 2025-11-29 07:29:45.103 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:45 np0005539504 nova_compute[187152]: 2025-11-29 07:29:45.104 187156 DEBUG nova.network.neutron [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:29:45 np0005539504 nova_compute[187152]: 2025-11-29 07:29:45.184 187156 DEBUG nova.compute.manager [req-7af99bf6-0f31-4cb1-bf1b-91a009013d91 req-dabb2092-e236-4a02-9e5b-fa6afba1c814 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received event network-changed-b95ddcc9-0165-4e0c-aa88-981010149da0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:45 np0005539504 nova_compute[187152]: 2025-11-29 07:29:45.184 187156 DEBUG nova.compute.manager [req-7af99bf6-0f31-4cb1-bf1b-91a009013d91 req-dabb2092-e236-4a02-9e5b-fa6afba1c814 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Refreshing instance network info cache due to event network-changed-b95ddcc9-0165-4e0c-aa88-981010149da0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:29:45 np0005539504 nova_compute[187152]: 2025-11-29 07:29:45.185 187156 DEBUG oslo_concurrency.lockutils [req-7af99bf6-0f31-4cb1-bf1b-91a009013d91 req-dabb2092-e236-4a02-9e5b-fa6afba1c814 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:45 np0005539504 nova_compute[187152]: 2025-11-29 07:29:45.297 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:45.297 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:45.299 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:29:45 np0005539504 nova_compute[187152]: 2025-11-29 07:29:45.316 187156 DEBUG nova.network.neutron [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.098 187156 DEBUG nova.network.neutron [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updating instance_info_cache with network_info: [{"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.127 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.127 187156 DEBUG nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Instance network_info: |[{"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.128 187156 DEBUG oslo_concurrency.lockutils [req-7af99bf6-0f31-4cb1-bf1b-91a009013d91 req-dabb2092-e236-4a02-9e5b-fa6afba1c814 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.128 187156 DEBUG nova.network.neutron [req-7af99bf6-0f31-4cb1-bf1b-91a009013d91 req-dabb2092-e236-4a02-9e5b-fa6afba1c814 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Refreshing network info cache for port b95ddcc9-0165-4e0c-aa88-981010149da0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.131 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Start _get_guest_xml network_info=[{"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.136 187156 WARNING nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.142 187156 DEBUG nova.virt.libvirt.host [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.143 187156 DEBUG nova.virt.libvirt.host [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.148 187156 DEBUG nova.virt.libvirt.host [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.149 187156 DEBUG nova.virt.libvirt.host [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.150 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.150 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.151 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.151 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.151 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.151 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.152 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.152 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.152 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.152 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.152 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.153 187156 DEBUG nova.virt.hardware [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.156 187156 DEBUG nova.virt.libvirt.vif [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:29:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-586434876',display_name='tempest-TestNetworkBasicOps-server-586434876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-586434876',id=141,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGhKjzGKzbNLkKn3Qe4bhA/1onvLYvMUS8bVLCIcvhhnAzCyX0uhE0akbpZ/Bj7R/OWR0vQKuaY/lmcBYYwUBbB1+I8iLsoBy9IQ2OcenTKB8q8Qhex8xJkGRj6S++f6sg==',key_name='tempest-TestNetworkBasicOps-2072959861',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-t688yedl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:29:42Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=3835b666-929d-40c4-a556-3249ddef8b41,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.157 187156 DEBUG nova.network.os_vif_util [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.157 187156 DEBUG nova.network.os_vif_util [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:6a:49,bridge_name='br-int',has_traffic_filtering=True,id=b95ddcc9-0165-4e0c-aa88-981010149da0,network=Network(b998d842-14b5-466c-99db-e8ccb7fefb1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb95ddcc9-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.158 187156 DEBUG nova.objects.instance [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3835b666-929d-40c4-a556-3249ddef8b41 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.178 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <uuid>3835b666-929d-40c4-a556-3249ddef8b41</uuid>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <name>instance-0000008d</name>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkBasicOps-server-586434876</nova:name>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:29:46</nova:creationTime>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:        <nova:port uuid="b95ddcc9-0165-4e0c-aa88-981010149da0">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <entry name="serial">3835b666-929d-40c4-a556-3249ddef8b41</entry>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <entry name="uuid">3835b666-929d-40c4-a556-3249ddef8b41</entry>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk.config"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:0a:6a:49"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <target dev="tapb95ddcc9-01"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/console.log" append="off"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:29:46 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:29:46 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:29:46 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:29:46 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.180 187156 DEBUG nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Preparing to wait for external event network-vif-plugged-b95ddcc9-0165-4e0c-aa88-981010149da0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.181 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "3835b666-929d-40c4-a556-3249ddef8b41-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.181 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.181 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.182 187156 DEBUG nova.virt.libvirt.vif [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:29:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-586434876',display_name='tempest-TestNetworkBasicOps-server-586434876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-586434876',id=141,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGhKjzGKzbNLkKn3Qe4bhA/1onvLYvMUS8bVLCIcvhhnAzCyX0uhE0akbpZ/Bj7R/OWR0vQKuaY/lmcBYYwUBbB1+I8iLsoBy9IQ2OcenTKB8q8Qhex8xJkGRj6S++f6sg==',key_name='tempest-TestNetworkBasicOps-2072959861',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-t688yedl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:29:42Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=3835b666-929d-40c4-a556-3249ddef8b41,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.183 187156 DEBUG nova.network.os_vif_util [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.183 187156 DEBUG nova.network.os_vif_util [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:6a:49,bridge_name='br-int',has_traffic_filtering=True,id=b95ddcc9-0165-4e0c-aa88-981010149da0,network=Network(b998d842-14b5-466c-99db-e8ccb7fefb1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb95ddcc9-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.184 187156 DEBUG os_vif [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:6a:49,bridge_name='br-int',has_traffic_filtering=True,id=b95ddcc9-0165-4e0c-aa88-981010149da0,network=Network(b998d842-14b5-466c-99db-e8ccb7fefb1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb95ddcc9-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.185 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.185 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.186 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.189 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.189 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb95ddcc9-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.190 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb95ddcc9-01, col_values=(('external_ids', {'iface-id': 'b95ddcc9-0165-4e0c-aa88-981010149da0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:6a:49', 'vm-uuid': '3835b666-929d-40c4-a556-3249ddef8b41'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.192 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:46 np0005539504 NetworkManager[55210]: <info>  [1764401386.1936] manager: (tapb95ddcc9-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.194 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.202 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.203 187156 INFO os_vif [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:6a:49,bridge_name='br-int',has_traffic_filtering=True,id=b95ddcc9-0165-4e0c-aa88-981010149da0,network=Network(b998d842-14b5-466c-99db-e8ccb7fefb1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb95ddcc9-01')#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.268 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.268 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.268 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:0a:6a:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.269 187156 INFO nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Using config drive#033[00m
Nov 29 02:29:46 np0005539504 nova_compute[187152]: 2025-11-29 07:29:46.762 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.201 187156 INFO nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Creating config drive at /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk.config#033[00m
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.208 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpglv2lmfk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.302 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.345 187156 DEBUG oslo_concurrency.processutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpglv2lmfk" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:47 np0005539504 kernel: tapb95ddcc9-01: entered promiscuous mode
Nov 29 02:29:47 np0005539504 NetworkManager[55210]: <info>  [1764401387.4372] manager: (tapb95ddcc9-01): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Nov 29 02:29:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:47Z|00565|binding|INFO|Claiming lport b95ddcc9-0165-4e0c-aa88-981010149da0 for this chassis.
Nov 29 02:29:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:47Z|00566|binding|INFO|b95ddcc9-0165-4e0c-aa88-981010149da0: Claiming fa:16:3e:0a:6a:49 10.100.0.11
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.440 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.445 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.448 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:47 np0005539504 systemd-udevd[240303]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.469 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:6a:49 10.100.0.11'], port_security=['fa:16:3e:0a:6a:49 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b998d842-14b5-466c-99db-e8ccb7fefb1d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '459d88e1-2dc5-49aa-b8cd-08ecb466d94a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=404f48ed-a51b-481c-9673-9d093f66b931, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b95ddcc9-0165-4e0c-aa88-981010149da0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.470 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b95ddcc9-0165-4e0c-aa88-981010149da0 in datapath b998d842-14b5-466c-99db-e8ccb7fefb1d bound to our chassis#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.472 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b998d842-14b5-466c-99db-e8ccb7fefb1d#033[00m
Nov 29 02:29:47 np0005539504 NetworkManager[55210]: <info>  [1764401387.4831] device (tapb95ddcc9-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.481 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac44d80-c4f8-487c-8e9d-0ee8c01142ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.482 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb998d842-11 in ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:29:47 np0005539504 NetworkManager[55210]: <info>  [1764401387.4844] device (tapb95ddcc9-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.485 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb998d842-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.485 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[096b1984-7828-4556-b02d-7be5542bff97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.487 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eda0d603-f65b-414c-9bce-c5524acc3a1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 systemd-machined[153423]: New machine qemu-73-instance-0000008d.
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.499 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[659b68a6-7174-46e0-9a37-fc1e5be759c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 systemd[1]: Started Virtual Machine qemu-73-instance-0000008d.
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.533 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.535 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4b25bba5-b10e-46c8-9830-06b720259be9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:47Z|00567|binding|INFO|Setting lport b95ddcc9-0165-4e0c-aa88-981010149da0 ovn-installed in OVS
Nov 29 02:29:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:47Z|00568|binding|INFO|Setting lport b95ddcc9-0165-4e0c-aa88-981010149da0 up in Southbound
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.540 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.578 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[0783de1e-b09c-40c4-a601-57199af958e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 NetworkManager[55210]: <info>  [1764401387.5861] manager: (tapb998d842-10): new Veth device (/org/freedesktop/NetworkManager/Devices/254)
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.585 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[83a59067-252a-4f95-8ab0-65f9f2c10ea8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 systemd-udevd[240307]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.620 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2086abe0-1659-4666-8dcf-f03750275be7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.624 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce2f849-57fc-433d-84fb-1a094b599e96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 NetworkManager[55210]: <info>  [1764401387.6495] device (tapb998d842-10): carrier: link connected
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.653 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[27f7d63c-6d30-47ec-909a-ceaf8c647416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.677 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50d00391-00c1-4a64-9b57-aec06c7758e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb998d842-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:d4:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685052, 'reachable_time': 22612, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240337, 'error': None, 'target': 'ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.692 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5f91ad5e-6603-4689-81e3-bd63ef770973]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feab:d43c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 685052, 'tstamp': 685052}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240338, 'error': None, 'target': 'ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.711 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[53352632-40c3-4f5b-a7e9-e1af168cd5b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb998d842-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ab:d4:3c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685052, 'reachable_time': 22612, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240339, 'error': None, 'target': 'ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.746 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[554adfc2-3bb0-46c3-9234-e17b738f2662]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.826 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a22aa4de-e10d-462d-a0ea-166aa0e233d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.829 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb998d842-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.829 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.830 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb998d842-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:29:47 np0005539504 kernel: tapb998d842-10: entered promiscuous mode
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.873 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:29:47 np0005539504 NetworkManager[55210]: <info>  [1764401387.8742] manager: (tapb998d842-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.876 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb998d842-10, col_values=(('external_ids', {'iface-id': 'a02c4bd5-fc81-485d-9566-811823118801'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:29:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:47Z|00569|binding|INFO|Releasing lport a02c4bd5-fc81-485d-9566-811823118801 from this chassis (sb_readonly=0)
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.917 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.918 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b998d842-14b5-466c-99db-e8ccb7fefb1d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b998d842-14b5-466c-99db-e8ccb7fefb1d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.919 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.919 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[aa4ece60-b78a-4a08-a0f7-3d9c040dab89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.920 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-b998d842-14b5-466c-99db-e8ccb7fefb1d
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/b998d842-14b5-466c-99db-e8ccb7fefb1d.pid.haproxy
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID b998d842-14b5-466c-99db-e8ccb7fefb1d
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 02:29:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:29:47.921 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d', 'env', 'PROCESS_TAG=haproxy-b998d842-14b5-466c-99db-e8ccb7fefb1d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b998d842-14b5-466c-99db-e8ccb7fefb1d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.928 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.937 187156 DEBUG nova.compute.manager [req-09eabe16-3e8c-4608-9611-da444635d6b1 req-978331dd-a667-4059-88da-cf8a03e96cf5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received event network-vif-plugged-b95ddcc9-0165-4e0c-aa88-981010149da0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.938 187156 DEBUG oslo_concurrency.lockutils [req-09eabe16-3e8c-4608-9611-da444635d6b1 req-978331dd-a667-4059-88da-cf8a03e96cf5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3835b666-929d-40c4-a556-3249ddef8b41-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.938 187156 DEBUG oslo_concurrency.lockutils [req-09eabe16-3e8c-4608-9611-da444635d6b1 req-978331dd-a667-4059-88da-cf8a03e96cf5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.938 187156 DEBUG oslo_concurrency.lockutils [req-09eabe16-3e8c-4608-9611-da444635d6b1 req-978331dd-a667-4059-88da-cf8a03e96cf5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.939 187156 DEBUG nova.compute.manager [req-09eabe16-3e8c-4608-9611-da444635d6b1 req-978331dd-a667-4059-88da-cf8a03e96cf5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Processing event network-vif-plugged-b95ddcc9-0165-4e0c-aa88-981010149da0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.994 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401387.993309, 3835b666-929d-40c4-a556-3249ddef8b41 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.995 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] VM Started (Lifecycle Event)
Nov 29 02:29:47 np0005539504 nova_compute[187152]: 2025-11-29 07:29:47.999 187156 DEBUG nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.013 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.019 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.020 187156 INFO nova.virt.libvirt.driver [-] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Instance spawned successfully.
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.021 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.029 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.056 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.056 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.057 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.057 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.058 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.059 187156 DEBUG nova.virt.libvirt.driver [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.063 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.063 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401387.9937387, 3835b666-929d-40c4-a556-3249ddef8b41 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.063 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] VM Paused (Lifecycle Event)
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.100 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.106 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401388.0128698, 3835b666-929d-40c4-a556-3249ddef8b41 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.106 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] VM Resumed (Lifecycle Event)
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.131 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.135 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.152 187156 INFO nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Took 5.51 seconds to spawn the instance on the hypervisor.
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.152 187156 DEBUG nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.161 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.269 187156 INFO nova.compute.manager [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Took 6.20 seconds to build instance.
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.290 187156 DEBUG oslo_concurrency.lockutils [None req-48a9494f-0513-4f04-9e37-2b6c9cd87c7b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.372 187156 DEBUG nova.network.neutron [req-7af99bf6-0f31-4cb1-bf1b-91a009013d91 req-dabb2092-e236-4a02-9e5b-fa6afba1c814 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updated VIF entry in instance network info cache for port b95ddcc9-0165-4e0c-aa88-981010149da0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.372 187156 DEBUG nova.network.neutron [req-7af99bf6-0f31-4cb1-bf1b-91a009013d91 req-dabb2092-e236-4a02-9e5b-fa6afba1c814 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updating instance_info_cache with network_info: [{"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:48 np0005539504 podman[240378]: 2025-11-29 07:29:48.373571922 +0000 UTC m=+0.074543458 container create 7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:29:48 np0005539504 nova_compute[187152]: 2025-11-29 07:29:48.388 187156 DEBUG oslo_concurrency.lockutils [req-7af99bf6-0f31-4cb1-bf1b-91a009013d91 req-dabb2092-e236-4a02-9e5b-fa6afba1c814 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:29:48 np0005539504 systemd[1]: Started libpod-conmon-7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3.scope.
Nov 29 02:29:48 np0005539504 podman[240378]: 2025-11-29 07:29:48.329289656 +0000 UTC m=+0.030261202 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:29:48 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:29:48 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed0a4709eb9caeb0a1ff342196593f7903c6c860abd842c4ed59ffa5aa21cd52/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:29:48 np0005539504 podman[240378]: 2025-11-29 07:29:48.522729269 +0000 UTC m=+0.223700835 container init 7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:29:48 np0005539504 podman[240378]: 2025-11-29 07:29:48.530020254 +0000 UTC m=+0.230991830 container start 7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:29:48 np0005539504 neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d[240393]: [NOTICE]   (240397) : New worker (240399) forked
Nov 29 02:29:48 np0005539504 neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d[240393]: [NOTICE]   (240397) : Loading success.
Nov 29 02:29:50 np0005539504 nova_compute[187152]: 2025-11-29 07:29:50.099 187156 DEBUG nova.compute.manager [req-1acc833f-add6-43b1-9409-d17bc3dd3621 req-6acd8ce4-f110-4fc7-b2b0-531db0ed14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received event network-vif-plugged-b95ddcc9-0165-4e0c-aa88-981010149da0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:50 np0005539504 nova_compute[187152]: 2025-11-29 07:29:50.100 187156 DEBUG oslo_concurrency.lockutils [req-1acc833f-add6-43b1-9409-d17bc3dd3621 req-6acd8ce4-f110-4fc7-b2b0-531db0ed14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3835b666-929d-40c4-a556-3249ddef8b41-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:50 np0005539504 nova_compute[187152]: 2025-11-29 07:29:50.100 187156 DEBUG oslo_concurrency.lockutils [req-1acc833f-add6-43b1-9409-d17bc3dd3621 req-6acd8ce4-f110-4fc7-b2b0-531db0ed14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:50 np0005539504 nova_compute[187152]: 2025-11-29 07:29:50.100 187156 DEBUG oslo_concurrency.lockutils [req-1acc833f-add6-43b1-9409-d17bc3dd3621 req-6acd8ce4-f110-4fc7-b2b0-531db0ed14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:50 np0005539504 nova_compute[187152]: 2025-11-29 07:29:50.101 187156 DEBUG nova.compute.manager [req-1acc833f-add6-43b1-9409-d17bc3dd3621 req-6acd8ce4-f110-4fc7-b2b0-531db0ed14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] No waiting events found dispatching network-vif-plugged-b95ddcc9-0165-4e0c-aa88-981010149da0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:50 np0005539504 nova_compute[187152]: 2025-11-29 07:29:50.101 187156 WARNING nova.compute.manager [req-1acc833f-add6-43b1-9409-d17bc3dd3621 req-6acd8ce4-f110-4fc7-b2b0-531db0ed14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received unexpected event network-vif-plugged-b95ddcc9-0165-4e0c-aa88-981010149da0 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:29:51 np0005539504 NetworkManager[55210]: <info>  [1764401391.1123] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Nov 29 02:29:51 np0005539504 NetworkManager[55210]: <info>  [1764401391.1137] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Nov 29 02:29:51 np0005539504 nova_compute[187152]: 2025-11-29 07:29:51.111 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:51 np0005539504 nova_compute[187152]: 2025-11-29 07:29:51.193 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:51 np0005539504 nova_compute[187152]: 2025-11-29 07:29:51.225 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:29:51Z|00570|binding|INFO|Releasing lport a02c4bd5-fc81-485d-9566-811823118801 from this chassis (sb_readonly=0)
Nov 29 02:29:51 np0005539504 nova_compute[187152]: 2025-11-29 07:29:51.254 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:51 np0005539504 nova_compute[187152]: 2025-11-29 07:29:51.763 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:52 np0005539504 nova_compute[187152]: 2025-11-29 07:29:52.196 187156 DEBUG nova.compute.manager [req-c5dc90ef-5e46-4d59-9a0e-6de293cc9a01 req-33344a7a-40f6-498b-8fec-52592382cee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received event network-changed-b95ddcc9-0165-4e0c-aa88-981010149da0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:52 np0005539504 nova_compute[187152]: 2025-11-29 07:29:52.196 187156 DEBUG nova.compute.manager [req-c5dc90ef-5e46-4d59-9a0e-6de293cc9a01 req-33344a7a-40f6-498b-8fec-52592382cee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Refreshing instance network info cache due to event network-changed-b95ddcc9-0165-4e0c-aa88-981010149da0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:29:52 np0005539504 nova_compute[187152]: 2025-11-29 07:29:52.196 187156 DEBUG oslo_concurrency.lockutils [req-c5dc90ef-5e46-4d59-9a0e-6de293cc9a01 req-33344a7a-40f6-498b-8fec-52592382cee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:52 np0005539504 nova_compute[187152]: 2025-11-29 07:29:52.196 187156 DEBUG oslo_concurrency.lockutils [req-c5dc90ef-5e46-4d59-9a0e-6de293cc9a01 req-33344a7a-40f6-498b-8fec-52592382cee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:52 np0005539504 nova_compute[187152]: 2025-11-29 07:29:52.196 187156 DEBUG nova.network.neutron [req-c5dc90ef-5e46-4d59-9a0e-6de293cc9a01 req-33344a7a-40f6-498b-8fec-52592382cee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Refreshing network info cache for port b95ddcc9-0165-4e0c-aa88-981010149da0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:29:53 np0005539504 nova_compute[187152]: 2025-11-29 07:29:53.299 187156 DEBUG nova.network.neutron [req-c5dc90ef-5e46-4d59-9a0e-6de293cc9a01 req-33344a7a-40f6-498b-8fec-52592382cee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updated VIF entry in instance network info cache for port b95ddcc9-0165-4e0c-aa88-981010149da0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:29:53 np0005539504 nova_compute[187152]: 2025-11-29 07:29:53.301 187156 DEBUG nova.network.neutron [req-c5dc90ef-5e46-4d59-9a0e-6de293cc9a01 req-33344a7a-40f6-498b-8fec-52592382cee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updating instance_info_cache with network_info: [{"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:53 np0005539504 nova_compute[187152]: 2025-11-29 07:29:53.321 187156 DEBUG oslo_concurrency.lockutils [req-c5dc90ef-5e46-4d59-9a0e-6de293cc9a01 req-33344a7a-40f6-498b-8fec-52592382cee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:54 np0005539504 podman[240411]: 2025-11-29 07:29:54.743840622 +0000 UTC m=+0.064240772 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 02:29:54 np0005539504 podman[240410]: 2025-11-29 07:29:54.751773445 +0000 UTC m=+0.074884397 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:29:54 np0005539504 podman[240409]: 2025-11-29 07:29:54.77360596 +0000 UTC m=+0.100059912 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:29:55 np0005539504 nova_compute[187152]: 2025-11-29 07:29:55.289 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:56 np0005539504 nova_compute[187152]: 2025-11-29 07:29:56.196 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:56 np0005539504 nova_compute[187152]: 2025-11-29 07:29:56.475 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401381.4727244, 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:56 np0005539504 nova_compute[187152]: 2025-11-29 07:29:56.475 187156 INFO nova.compute.manager [-] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:29:56 np0005539504 nova_compute[187152]: 2025-11-29 07:29:56.503 187156 DEBUG nova.compute.manager [None req-e8bde25f-3be0-45b0-8b8a-0e1f59a4073b - - - - - -] [instance: 3bcd5107-a6bc-41b5-bea2-c3abdc5507cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:56 np0005539504 nova_compute[187152]: 2025-11-29 07:29:56.768 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:58 np0005539504 nova_compute[187152]: 2025-11-29 07:29:58.304 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:58 np0005539504 podman[240474]: 2025-11-29 07:29:58.73934283 +0000 UTC m=+0.062975799 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:29:58 np0005539504 podman[240475]: 2025-11-29 07:29:58.782338131 +0000 UTC m=+0.108911379 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:30:00 np0005539504 nova_compute[187152]: 2025-11-29 07:30:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:01 np0005539504 nova_compute[187152]: 2025-11-29 07:30:01.199 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:01 np0005539504 nova_compute[187152]: 2025-11-29 07:30:01.770 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:30:02Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:6a:49 10.100.0.11
Nov 29 02:30:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:30:02Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:6a:49 10.100.0.11
Nov 29 02:30:05 np0005539504 podman[240540]: 2025-11-29 07:30:05.767865887 +0000 UTC m=+0.096697042 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm)
Nov 29 02:30:05 np0005539504 nova_compute[187152]: 2025-11-29 07:30:05.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:06 np0005539504 nova_compute[187152]: 2025-11-29 07:30:06.202 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:06 np0005539504 nova_compute[187152]: 2025-11-29 07:30:06.653 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:06 np0005539504 nova_compute[187152]: 2025-11-29 07:30:06.820 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:09 np0005539504 nova_compute[187152]: 2025-11-29 07:30:09.235 187156 INFO nova.compute.manager [None req-5c76bad7-7185-41cb-8813-1029adc305c6 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Get console output#033[00m
Nov 29 02:30:09 np0005539504 nova_compute[187152]: 2025-11-29 07:30:09.244 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:30:11 np0005539504 nova_compute[187152]: 2025-11-29 07:30:11.205 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:11 np0005539504 nova_compute[187152]: 2025-11-29 07:30:11.822 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:11 np0005539504 nova_compute[187152]: 2025-11-29 07:30:11.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:12 np0005539504 podman[240561]: 2025-11-29 07:30:12.760967446 +0000 UTC m=+0.082310577 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 02:30:12 np0005539504 nova_compute[187152]: 2025-11-29 07:30:12.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:12 np0005539504 nova_compute[187152]: 2025-11-29 07:30:12.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:30:13 np0005539504 ovn_controller[95182]: 2025-11-29T07:30:13Z|00571|binding|INFO|Releasing lport a02c4bd5-fc81-485d-9566-811823118801 from this chassis (sb_readonly=0)
Nov 29 02:30:13 np0005539504 nova_compute[187152]: 2025-11-29 07:30:13.750 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:15 np0005539504 nova_compute[187152]: 2025-11-29 07:30:15.358 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:16 np0005539504 nova_compute[187152]: 2025-11-29 07:30:16.208 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:16 np0005539504 nova_compute[187152]: 2025-11-29 07:30:16.824 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:16 np0005539504 nova_compute[187152]: 2025-11-29 07:30:16.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:18 np0005539504 nova_compute[187152]: 2025-11-29 07:30:18.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.073 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.074 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.075 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.075 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.254 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.315 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.316 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.372 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.547 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.548 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5545MB free_disk=73.0461196899414GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.549 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.549 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.798 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 3835b666-929d-40c4-a556-3249ddef8b41 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.798 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.798 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.817 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.837 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.838 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.859 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.890 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.923 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:30:19 np0005539504 nova_compute[187152]: 2025-11-29 07:30:19.976 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:30:20 np0005539504 nova_compute[187152]: 2025-11-29 07:30:20.087 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:30:20 np0005539504 nova_compute[187152]: 2025-11-29 07:30:20.087 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:21 np0005539504 nova_compute[187152]: 2025-11-29 07:30:21.211 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:21 np0005539504 nova_compute[187152]: 2025-11-29 07:30:21.867 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:22 np0005539504 nova_compute[187152]: 2025-11-29 07:30:22.086 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:22 np0005539504 nova_compute[187152]: 2025-11-29 07:30:22.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:22 np0005539504 nova_compute[187152]: 2025-11-29 07:30:22.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:30:22 np0005539504 nova_compute[187152]: 2025-11-29 07:30:22.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:30:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:30:22.974 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:30:22.975 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:30:22.976 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:23 np0005539504 nova_compute[187152]: 2025-11-29 07:30:23.099 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:23 np0005539504 nova_compute[187152]: 2025-11-29 07:30:23.736 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:30:23 np0005539504 nova_compute[187152]: 2025-11-29 07:30:23.736 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:30:23 np0005539504 nova_compute[187152]: 2025-11-29 07:30:23.736 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:30:23 np0005539504 nova_compute[187152]: 2025-11-29 07:30:23.737 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3835b666-929d-40c4-a556-3249ddef8b41 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:30:25 np0005539504 podman[240596]: 2025-11-29 07:30:25.578736635 +0000 UTC m=+0.090076384 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:30:25 np0005539504 podman[240594]: 2025-11-29 07:30:25.59272235 +0000 UTC m=+0.103821643 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:30:25 np0005539504 podman[240595]: 2025-11-29 07:30:25.605226734 +0000 UTC m=+0.111760154 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git)
Nov 29 02:30:26 np0005539504 nova_compute[187152]: 2025-11-29 07:30:26.214 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:26 np0005539504 nova_compute[187152]: 2025-11-29 07:30:26.871 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:27 np0005539504 ovn_controller[95182]: 2025-11-29T07:30:27Z|00572|binding|INFO|Releasing lport a02c4bd5-fc81-485d-9566-811823118801 from this chassis (sb_readonly=0)
Nov 29 02:30:27 np0005539504 nova_compute[187152]: 2025-11-29 07:30:27.379 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:27 np0005539504 nova_compute[187152]: 2025-11-29 07:30:27.807 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updating instance_info_cache with network_info: [{"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:30:27 np0005539504 nova_compute[187152]: 2025-11-29 07:30:27.860 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:30:27 np0005539504 nova_compute[187152]: 2025-11-29 07:30:27.861 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:30:27 np0005539504 nova_compute[187152]: 2025-11-29 07:30:27.862 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:29 np0005539504 nova_compute[187152]: 2025-11-29 07:30:29.081 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:29 np0005539504 podman[240655]: 2025-11-29 07:30:29.709906068 +0000 UTC m=+0.052000784 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:30:29 np0005539504 podman[240656]: 2025-11-29 07:30:29.75177501 +0000 UTC m=+0.088431080 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:30:30 np0005539504 nova_compute[187152]: 2025-11-29 07:30:30.856 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:30:31 np0005539504 nova_compute[187152]: 2025-11-29 07:30:31.217 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:31 np0005539504 nova_compute[187152]: 2025-11-29 07:30:31.874 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:36 np0005539504 nova_compute[187152]: 2025-11-29 07:30:36.270 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:36 np0005539504 podman[240701]: 2025-11-29 07:30:36.735453536 +0000 UTC m=+0.073933712 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:30:36 np0005539504 nova_compute[187152]: 2025-11-29 07:30:36.877 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:41 np0005539504 nova_compute[187152]: 2025-11-29 07:30:41.254 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:41 np0005539504 nova_compute[187152]: 2025-11-29 07:30:41.272 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:41 np0005539504 nova_compute[187152]: 2025-11-29 07:30:41.878 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:43 np0005539504 podman[240722]: 2025-11-29 07:30:43.724973086 +0000 UTC m=+0.072150665 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:30:46 np0005539504 nova_compute[187152]: 2025-11-29 07:30:46.275 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:46 np0005539504 nova_compute[187152]: 2025-11-29 07:30:46.881 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.980 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3835b666-929d-40c4-a556-3249ddef8b41', 'name': 'tempest-TestNetworkBasicOps-server-586434876', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000008d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ec8b80be17a14d1caf666636283749d0', 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'hostId': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.981 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.985 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3835b666-929d-40c4-a556-3249ddef8b41 / tapb95ddcc9-01 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.985 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8edae2a6-f98c-4958-9268-d2dee4653aa0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:47.981463', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '52fd1f54-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': '663ea4232aea737e0f216953a7f3753bdfa09a33fa62661c60b954a8ffc2db36'}]}, 'timestamp': '2025-11-29 07:30:47.986497', '_unique_id': '44236951de0349ef8c7ed6ad0a77c917'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.988 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:47.989 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.005 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.005 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efd0ff2e-1678-4e41-a749-5c718e236b1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-vda', 'timestamp': '2025-11-29T07:30:47.989812', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '530026d6-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.924494141, 'message_signature': '780502794bc6de9f3383527a2c90b0cdacc20df921b1c44758c2880911baee39'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-sda', 'timestamp': '2025-11-29T07:30:47.989812', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53003252-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.924494141, 'message_signature': 'f5f074fd15df0469dcd45c130ca4168d300cbcf91e6d36a2a8b700a5812e601a'}]}, 'timestamp': '2025-11-29 07:30:48.006247', '_unique_id': 'fb9fa058b5ce4a1682d8e50ed664c5a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.020 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "7e03c289-84af-4001-90ae-bb6067d68199" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.020 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "7e03c289-84af-4001-90ae-bb6067d68199" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.046 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.write.bytes volume: 73003008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.047 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78994cdb-69af-42eb-aba2-995dd8631b08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73003008, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-vda', 'timestamp': '2025-11-29T07:30:48.007804', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53067040-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': '2a4026aaf15efe12abb23e3805540613d887efcbb4a9644402f3acebabf622b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-sda', 'timestamp': '2025-11-29T07:30:48.007804', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '530695f2-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': '734164fd803e06896b47f3016c3ecaa1daf6f1542df433b767d61c6c23a0edd3'}]}, 'timestamp': '2025-11-29 07:30:48.048224', '_unique_id': 'f1cf029658104aaa93676ad5d8cd463f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.051 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.052 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a40b9ff-2712-4a1a-9010-7b4aa359a396', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-vda', 'timestamp': '2025-11-29T07:30:48.051911', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53073f16-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.924494141, 'message_signature': 'a7320822f2e0748960ac194e6af2af306f9d5bf512133007079a0f42cd3d6200'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-sda', 'timestamp': '2025-11-29T07:30:48.051911', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '530753de-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.924494141, 'message_signature': '2ecb82b4896bb5dae4ff0f70f6df45a65556a5db51d77be05ca050c4e4e5df64'}]}, 'timestamp': '2025-11-29 07:30:48.053056', '_unique_id': '3eb3436eb7554c16a05342cbbdb73ae5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.056 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.056 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba0cf185-d5ae-4259-8206-4ae71859797e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:48.056303', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '5307ec5e-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': '132d72114f49cd42c835729313631a5e573a5189c24be11c500626585e9b99b1'}]}, 'timestamp': '2025-11-29 07:30:48.056991', '_unique_id': '2626c5b419d0415caf63c206bff73ce2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.059 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.060 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.read.latency volume: 222030278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.060 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.read.latency volume: 33301410 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d2fd657-c71a-46d9-88b9-050743988e86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 222030278, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-vda', 'timestamp': '2025-11-29T07:30:48.059993', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '530875e8-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': '26e8d04038c65bd12386c65ee04e324f9d52ecaeddb7f4a9f953d5b66839d25f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33301410, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-sda', 'timestamp': '2025-11-29T07:30:48.059993', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53088fd8-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': '06cc28f213adef90173740c585c9c571f6b81cba1d1e82ad4fb6b2a3faf83005'}]}, 'timestamp': '2025-11-29 07:30:48.061146', '_unique_id': 'ef6f27d6b3bd40969a0596bfbe039559'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.062 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.064 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.read.bytes volume: 31148544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.064 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7a1fd62-9d11-49d7-80b8-49f5e7bc3fec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31148544, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-vda', 'timestamp': '2025-11-29T07:30:48.064200', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53091d4a-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': 'ad46b0d2e742ade50754f6c5c5fc6e67ae45816b1734b1e01e4106d07188b517'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-sda', 'timestamp': '2025-11-29T07:30:48.064200', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53093582-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': 'b87f5c04939979129e3f9dc0917410401d4aa435668ce7a570f45a0aa2234b6e'}]}, 'timestamp': '2025-11-29 07:30:48.065437', '_unique_id': 'a72753cfc2454ad3bc4dc59375749d6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.066 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.068 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.097 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/memory.usage volume: 46.51171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e207414-12bc-4b22-9b22-f819578905b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.51171875, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'timestamp': '2025-11-29T07:30:48.068600', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '530e2830-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6911.031685382, 'message_signature': 'ff349cca58aa35090ab3ba66b589c2a70ed0fd9e2779fbe69b24476d8bf86040'}]}, 'timestamp': '2025-11-29 07:30:48.097874', '_unique_id': '72d01161303144ea9d84df590bc5b782'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.100 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.outgoing.packets volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87d6ddff-6fe5-46e2-aa10-f8c6c948302b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 110, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:48.100840', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '530eb322-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': 'bfa2dad0979ac7e643b029ad938eeaef1320f18ddab74c01b82c455b7bb5f85e'}]}, 'timestamp': '2025-11-29 07:30:48.101445', '_unique_id': '8b07f096bdd349beb25f7a84e6252f13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.104 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.write.requests volume: 324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.104 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd858615c-8a77-4f1c-943d-3ac8096cad58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 324, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-vda', 'timestamp': '2025-11-29T07:30:48.104216', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '530f3798-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': '9b4f6ebbf4a54e81e5e496933c0044280bd96d1388cbace2f74483c2cdb26e8a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-sda', 'timestamp': '2025-11-29T07:30:48.104216', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '530f4b66-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': 'ea346f61a7ccddaf218edaeb168ee209456fc027b758d933f3aff95540c334b6'}]}, 'timestamp': '2025-11-29 07:30:48.105348', '_unique_id': '134ec2af0bc14a7db086d613cac900f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.108 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.write.latency volume: 6618287183 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.109 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81c0a333-46ef-4bb6-9db3-a7d85e9219b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6618287183, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-vda', 'timestamp': '2025-11-29T07:30:48.108723', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '530fe5bc-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': '9c65e1d4ec19495ea15563e2963850e3d42ed89cd8db00b6ebfe4903f8ec89d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-sda', 'timestamp': '2025-11-29T07:30:48.108723', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '530ffa2a-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': '28706ece04767223294147aa08ba6a8a0cab911ade81bb92bcee5b855ad7b2d2'}]}, 'timestamp': '2025-11-29 07:30:48.109744', '_unique_id': '668e9dbc02644351bcffea89d5812b21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.112 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.read.requests volume: 1112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.113 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9de1b5d1-2801-4a0d-9847-78135ff13666', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1112, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-vda', 'timestamp': '2025-11-29T07:30:48.112635', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53107df6-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': '86f08222dc88896eb4b8bf16b7c8594f2cc8dd179acdbcc9ffcdbba55e7059e7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-sda', 'timestamp': '2025-11-29T07:30:48.112635', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53108fd0-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.942518904, 'message_signature': '37dff2861e191807d5f0d8bee0af7754bf409ee590ff7ead6047377083075f69'}]}, 'timestamp': '2025-11-29 07:30:48.113601', '_unique_id': '3972bc741e01409f9a31387aac9f82b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.116 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.116 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71e199a4-a99e-41b1-bbcb-b78050e80409', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-vda', 'timestamp': '2025-11-29T07:30:48.116319', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53110f96-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.924494141, 'message_signature': '5d019016ea8a7a602d966e2015bf3464ccbe4229da6000db662377dd492dfd87'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41-sda', 'timestamp': '2025-11-29T07:30:48.116319', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53111c8e-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.924494141, 'message_signature': 'af9f7c95df8d56cae1a571addb2522c59e13fedf821b7159e15ef718684027e7'}]}, 'timestamp': '2025-11-29 07:30:48.117086', '_unique_id': 'c879d1e5d6f6424b98840fa3dea78abf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.118 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20a23aa9-1b67-41a8-8ea8-5bf847594af6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:48.118826', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '53116c20-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': '75c7388936a800fd4f8de463ddd426eff971208c7d5f4e98c8865216c7ed0090'}]}, 'timestamp': '2025-11-29 07:30:48.119142', '_unique_id': '0524d276e84e42b1a088f398b9b20698'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.120 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf4f9b1e-a99b-4ea4-ae77-2ac5c47872a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:48.120705', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '5311b55e-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': '47fedb904843562e260680090c4eff91e5694e42bede5ab21e1ead761bca1a51'}]}, 'timestamp': '2025-11-29 07:30:48.121019', '_unique_id': '2e516d9cacbc4dcba62bc640cac0e328'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.122 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.incoming.bytes volume: 19386 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43a716f8-9541-4dc0-8280-6e8bc7c5b6b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19386, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:48.122580', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '5311feba-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': 'fa0cb04f88f3d2786f0bfdace2e9eb0a3702b7bd229f11b9c39dabc8e1ff977e'}]}, 'timestamp': '2025-11-29 07:30:48.122902', '_unique_id': '6fe903893d324483821cb2fc49afac35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.124 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.124 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-586434876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-586434876>]
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.125 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.125 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-586434876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-586434876>]
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.125 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.125 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-586434876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-586434876>]
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.125 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.outgoing.bytes volume: 16100 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c855977b-30ce-48df-a05a-ff54d5d1ed06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16100, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:48.125799', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '53127c96-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': '99e84ba8b6e5d54bcdbf31afb9ad3ebd25f47d7d31ca0a5a5e43781e1689938f'}]}, 'timestamp': '2025-11-29 07:30:48.126124', '_unique_id': 'c0e3de333f934b5597aadda201f42c2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.127 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/cpu volume: 12570000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '899a0af4-5b5a-447d-907d-53a0469977c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12570000000, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'timestamp': '2025-11-29T07:30:48.127651', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'instance-0000008d', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5312c480-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6911.031685382, 'message_signature': '301a0d4eacf980759aaddc1d641aa02c44df41e4f3ac2181e080c926a7c6c884'}]}, 'timestamp': '2025-11-29 07:30:48.127982', '_unique_id': '4efd6719d2b84dc981d29b186408cbad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.129 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ab0a53e-e2f8-4aab-9ee5-6d74d8857149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:48.129518', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '53130df0-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': '79be43882dbf1f7183da877bccd8f3132ee15981bc716066acc3fc87c28a140d'}]}, 'timestamp': '2025-11-29 07:30:48.129841', '_unique_id': '0120b68dc3eb451797d8b79454d1bcc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.131 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.incoming.packets volume: 107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '429bd9dc-8b4c-4539-9fe8-187405c6babd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 107, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:48.131334', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '531355c6-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': 'a2ba66f48fe88d80d85ec3ad73aa7ce84590870e22b6fd6bbd9935216e5da189'}]}, 'timestamp': '2025-11-29 07:30:48.131680', '_unique_id': 'ab6ea506cd8040d4823cf3ef25399fe4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.133 12 DEBUG ceilometer.compute.pollsters [-] 3835b666-929d-40c4-a556-3249ddef8b41/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38b117a4-22a1-40c9-9ce1-e1d613ebf618', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008d-3835b666-929d-40c4-a556-3249ddef8b41-tapb95ddcc9-01', 'timestamp': '2025-11-29T07:30:48.133514', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-586434876', 'name': 'tapb95ddcc9-01', 'instance_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:6a:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb95ddcc9-01'}, 'message_id': '5313aa1c-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 6910.916144737, 'message_signature': 'fac0ef23b6948c5463368014be8ee2f889f1a3145d5de67e1523d2502260fa2b'}]}, 'timestamp': '2025-11-29 07:30:48.133842', '_unique_id': 'd528ebed225a44ca84f2e74c0bc1f571'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.135 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:30:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:30:48.135 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-586434876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-586434876>]
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.192 187156 DEBUG nova.compute.manager [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.687 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "e3d9dd73-abec-4339-9a8e-2781397f0e22" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.687 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "e3d9dd73-abec-4339-9a8e-2781397f0e22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.714 187156 DEBUG nova.compute.manager [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.730 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.730 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.737 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:30:48 np0005539504 nova_compute[187152]: 2025-11-29 07:30:48.737 187156 INFO nova.compute.claims [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:30:49 np0005539504 nova_compute[187152]: 2025-11-29 07:30:49.023 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:49 np0005539504 nova_compute[187152]: 2025-11-29 07:30:49.449 187156 DEBUG nova.compute.provider_tree [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:30:49 np0005539504 nova_compute[187152]: 2025-11-29 07:30:49.988 187156 DEBUG nova.scheduler.client.report [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:30:51 np0005539504 nova_compute[187152]: 2025-11-29 07:30:51.322 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:51 np0005539504 nova_compute[187152]: 2025-11-29 07:30:51.883 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:52 np0005539504 nova_compute[187152]: 2025-11-29 07:30:52.135 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:52 np0005539504 nova_compute[187152]: 2025-11-29 07:30:52.136 187156 DEBUG nova.compute.manager [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:30:52 np0005539504 nova_compute[187152]: 2025-11-29 07:30:52.139 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 3.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:52 np0005539504 nova_compute[187152]: 2025-11-29 07:30:52.148 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:30:52 np0005539504 nova_compute[187152]: 2025-11-29 07:30:52.149 187156 INFO nova.compute.claims [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:30:55 np0005539504 podman[240743]: 2025-11-29 07:30:55.716593935 +0000 UTC m=+0.061951751 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:30:55 np0005539504 podman[240745]: 2025-11-29 07:30:55.720850539 +0000 UTC m=+0.058447606 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 29 02:30:55 np0005539504 podman[240744]: 2025-11-29 07:30:55.746576219 +0000 UTC m=+0.086423117 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Nov 29 02:30:56 np0005539504 nova_compute[187152]: 2025-11-29 07:30:56.327 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:30:56.828 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:30:56 np0005539504 nova_compute[187152]: 2025-11-29 07:30:56.828 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:30:56.829 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:30:56 np0005539504 nova_compute[187152]: 2025-11-29 07:30:56.886 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:57 np0005539504 nova_compute[187152]: 2025-11-29 07:30:57.568 187156 DEBUG nova.compute.manager [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Nov 29 02:31:00 np0005539504 podman[240809]: 2025-11-29 07:31:00.761654871 +0000 UTC m=+0.087890926 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:31:00 np0005539504 podman[240810]: 2025-11-29 07:31:00.799328959 +0000 UTC m=+0.119883972 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:31:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:31:00.831 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:31:01 np0005539504 nova_compute[187152]: 2025-11-29 07:31:01.329 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:01 np0005539504 nova_compute[187152]: 2025-11-29 07:31:01.888 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:01 np0005539504 nova_compute[187152]: 2025-11-29 07:31:01.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:06 np0005539504 nova_compute[187152]: 2025-11-29 07:31:06.332 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:06 np0005539504 nova_compute[187152]: 2025-11-29 07:31:06.892 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:06 np0005539504 nova_compute[187152]: 2025-11-29 07:31:06.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:07 np0005539504 podman[240867]: 2025-11-29 07:31:07.746549879 +0000 UTC m=+0.085427600 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:31:11 np0005539504 nova_compute[187152]: 2025-11-29 07:31:11.374 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:11 np0005539504 nova_compute[187152]: 2025-11-29 07:31:11.393 187156 INFO nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:31:11 np0005539504 nova_compute[187152]: 2025-11-29 07:31:11.895 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:12 np0005539504 nova_compute[187152]: 2025-11-29 07:31:12.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:13 np0005539504 nova_compute[187152]: 2025-11-29 07:31:13.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:13 np0005539504 nova_compute[187152]: 2025-11-29 07:31:13.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:31:14 np0005539504 nova_compute[187152]: 2025-11-29 07:31:14.127 187156 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 0.01 sec#033[00m
Nov 29 02:31:14 np0005539504 nova_compute[187152]: 2025-11-29 07:31:14.220 187156 DEBUG nova.compute.manager [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:31:14 np0005539504 podman[240889]: 2025-11-29 07:31:14.732644009 +0000 UTC m=+0.071974290 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:31:14 np0005539504 nova_compute[187152]: 2025-11-29 07:31:14.789 187156 DEBUG nova.compute.provider_tree [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:31:16 np0005539504 nova_compute[187152]: 2025-11-29 07:31:16.193 187156 DEBUG nova.scheduler.client.report [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:31:16 np0005539504 nova_compute[187152]: 2025-11-29 07:31:16.377 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:16 np0005539504 nova_compute[187152]: 2025-11-29 07:31:16.897 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:16 np0005539504 nova_compute[187152]: 2025-11-29 07:31:16.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:19 np0005539504 ovn_controller[95182]: 2025-11-29T07:31:19Z|00573|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 02:31:19 np0005539504 nova_compute[187152]: 2025-11-29 07:31:19.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:20 np0005539504 nova_compute[187152]: 2025-11-29 07:31:20.222 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 28.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:31:20 np0005539504 nova_compute[187152]: 2025-11-29 07:31:20.223 187156 DEBUG nova.compute.manager [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:31:21 np0005539504 nova_compute[187152]: 2025-11-29 07:31:21.380 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:21 np0005539504 nova_compute[187152]: 2025-11-29 07:31:21.560 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:31:21 np0005539504 nova_compute[187152]: 2025-11-29 07:31:21.561 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:31:21 np0005539504 nova_compute[187152]: 2025-11-29 07:31:21.561 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:31:21 np0005539504 nova_compute[187152]: 2025-11-29 07:31:21.562 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:31:21 np0005539504 nova_compute[187152]: 2025-11-29 07:31:21.900 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:31:22.976 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:31:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:31:22.977 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:31:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:31:22.977 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:31:24 np0005539504 nova_compute[187152]: 2025-11-29 07:31:24.814 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:26 np0005539504 nova_compute[187152]: 2025-11-29 07:31:26.383 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:26 np0005539504 podman[240912]: 2025-11-29 07:31:26.762123989 +0000 UTC m=+0.079341766 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:31:26 np0005539504 podman[240914]: 2025-11-29 07:31:26.774410438 +0000 UTC m=+0.083237281 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:31:26 np0005539504 podman[240913]: 2025-11-29 07:31:26.781064976 +0000 UTC m=+0.101805058 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:31:26 np0005539504 nova_compute[187152]: 2025-11-29 07:31:26.902 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:28 np0005539504 nova_compute[187152]: 2025-11-29 07:31:28.678 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:28 np0005539504 nova_compute[187152]: 2025-11-29 07:31:28.768 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:28 np0005539504 nova_compute[187152]: 2025-11-29 07:31:28.770 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:28 np0005539504 nova_compute[187152]: 2025-11-29 07:31:28.836 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:29 np0005539504 nova_compute[187152]: 2025-11-29 07:31:29.049 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:31:29 np0005539504 nova_compute[187152]: 2025-11-29 07:31:29.050 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5547MB free_disk=73.04607772827148GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:31:29 np0005539504 nova_compute[187152]: 2025-11-29 07:31:29.051 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:31:29 np0005539504 nova_compute[187152]: 2025-11-29 07:31:29.051 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:31:31 np0005539504 nova_compute[187152]: 2025-11-29 07:31:31.386 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:31 np0005539504 podman[240984]: 2025-11-29 07:31:31.720127464 +0000 UTC m=+0.058639542 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:31:31 np0005539504 podman[240985]: 2025-11-29 07:31:31.767325828 +0000 UTC m=+0.093020632 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:31:31 np0005539504 nova_compute[187152]: 2025-11-29 07:31:31.903 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:36 np0005539504 nova_compute[187152]: 2025-11-29 07:31:36.388 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:36 np0005539504 nova_compute[187152]: 2025-11-29 07:31:36.906 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:37 np0005539504 nova_compute[187152]: 2025-11-29 07:31:37.720 187156 DEBUG nova.compute.manager [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Nov 29 02:31:38 np0005539504 podman[241036]: 2025-11-29 07:31:38.701185299 +0000 UTC m=+0.049470504 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125)
Nov 29 02:31:41 np0005539504 nova_compute[187152]: 2025-11-29 07:31:41.391 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:41 np0005539504 nova_compute[187152]: 2025-11-29 07:31:41.908 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:45 np0005539504 podman[241056]: 2025-11-29 07:31:45.788288042 +0000 UTC m=+0.117448200 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:31:46 np0005539504 nova_compute[187152]: 2025-11-29 07:31:46.393 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:46 np0005539504 nova_compute[187152]: 2025-11-29 07:31:46.909 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:51 np0005539504 nova_compute[187152]: 2025-11-29 07:31:51.394 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:51 np0005539504 nova_compute[187152]: 2025-11-29 07:31:51.911 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:31:54Z|00574|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Nov 29 02:31:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:31:54.836 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.836 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:31:54.837 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.878 187156 DEBUG nova.compute.manager [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.882 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.882 187156 INFO nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Creating image(s)#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.884 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "/var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.885 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "/var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.886 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "/var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.912 187156 INFO nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.916 187156 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 10.79 sec#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.925 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.990 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 3835b666-929d-40c4-a556-3249ddef8b41 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.991 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 7e03c289-84af-4001-90ae-bb6067d68199 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.991 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance e3d9dd73-abec-4339-9a8e-2781397f0e22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.992 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:31:54 np0005539504 nova_compute[187152]: 2025-11-29 07:31:54.992 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.004 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.005 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.006 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.021 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.090 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.091 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.201 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.473 187156 DEBUG nova.compute.manager [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.543 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk 1073741824" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.544 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.545 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.571 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.626 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.627 187156 DEBUG nova.virt.disk.api [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Checking if we can resize image /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.628 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.686 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.687 187156 DEBUG nova.virt.disk.api [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Cannot resize image /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:31:55 np0005539504 nova_compute[187152]: 2025-11-29 07:31:55.688 187156 DEBUG nova.objects.instance [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'migration_context' on Instance uuid 7e03c289-84af-4001-90ae-bb6067d68199 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.397 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.822 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.822 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Ensure instance console log exists: /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.823 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.823 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.824 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.825 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.829 187156 WARNING nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.835 187156 DEBUG nova.virt.libvirt.host [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.835 187156 DEBUG nova.virt.libvirt.host [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.839 187156 DEBUG nova.virt.libvirt.host [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.839 187156 DEBUG nova.virt.libvirt.host [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.841 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.841 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.841 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.841 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.841 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.842 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.842 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.842 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.842 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.842 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.842 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.843 187156 DEBUG nova.virt.hardware [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.847 187156 DEBUG nova.objects.instance [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e03c289-84af-4001-90ae-bb6067d68199 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:31:56 np0005539504 nova_compute[187152]: 2025-11-29 07:31:56.913 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:57 np0005539504 nova_compute[187152]: 2025-11-29 07:31:57.426 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:31:57 np0005539504 nova_compute[187152]: 2025-11-29 07:31:57.427 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 28.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:31:57 np0005539504 nova_compute[187152]: 2025-11-29 07:31:57.583 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <uuid>7e03c289-84af-4001-90ae-bb6067d68199</uuid>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <name>instance-00000091</name>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerShowV247Test-server-888677430</nova:name>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:31:56</nova:creationTime>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:        <nova:user uuid="77f8dd72b85a42b2bd8a8cd644af5147">tempest-ServerShowV247Test-2069346219-project-member</nova:user>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:        <nova:project uuid="3a49efbb1d9f4c5abd9b9816d2c63823">tempest-ServerShowV247Test-2069346219</nova:project>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <entry name="serial">7e03c289-84af-4001-90ae-bb6067d68199</entry>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <entry name="uuid">7e03c289-84af-4001-90ae-bb6067d68199</entry>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk.config"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/console.log" append="off"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:31:57 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:31:57 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:31:57 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:31:57 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:31:57 np0005539504 podman[241097]: 2025-11-29 07:31:57.74654813 +0000 UTC m=+0.075847391 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:31:57 np0005539504 podman[241098]: 2025-11-29 07:31:57.746429227 +0000 UTC m=+0.075882042 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git)
Nov 29 02:31:57 np0005539504 podman[241099]: 2025-11-29 07:31:57.747947788 +0000 UTC m=+0.070779043 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.104 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.107 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.107 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.108 187156 INFO nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Using config drive#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.403 187156 DEBUG nova.compute.manager [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.407 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.409 187156 INFO nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Creating image(s)#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.410 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.410 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.412 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.454 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.550 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.553 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.554 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.579 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.647 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.649 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.715 187156 INFO nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Creating config drive at /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk.config#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.724 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbuzzwl5o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:31:59 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:31:59.840 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:31:59 np0005539504 nova_compute[187152]: 2025-11-29 07:31:59.852 187156 DEBUG oslo_concurrency.processutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbuzzwl5o" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:31:59 np0005539504 systemd-machined[153423]: New machine qemu-74-instance-00000091.
Nov 29 02:31:59 np0005539504 systemd[1]: Started Virtual Machine qemu-74-instance-00000091.
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.428 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.430 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.431 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.471 187156 DEBUG nova.compute.manager [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.473 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.474 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401520.4709861, 7e03c289-84af-4001-90ae-bb6067d68199 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.474 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.481 187156 INFO nova.virt.libvirt.driver [-] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Instance spawned successfully.#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.482 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.743 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk 1073741824" returned: 0 in 1.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.745 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.745 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.824 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.825 187156 DEBUG nova.virt.disk.api [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Checking if we can resize image /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.826 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.882 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.883 187156 DEBUG nova.virt.disk.api [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Cannot resize image /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:32:00 np0005539504 nova_compute[187152]: 2025-11-29 07:32:00.884 187156 DEBUG nova.objects.instance [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'migration_context' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.112 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.114 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.122 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.128 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.128 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.129 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.129 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.130 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.130 187156 DEBUG nova.virt.libvirt.driver [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.135 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.400 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.772 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.773 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Ensure instance console log exists: /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.774 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.775 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.776 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.778 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.786 187156 WARNING nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.809 187156 DEBUG nova.virt.libvirt.host [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.810 187156 DEBUG nova.virt.libvirt.host [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.814 187156 DEBUG nova.virt.libvirt.host [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.815 187156 DEBUG nova.virt.libvirt.host [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.816 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.817 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.817 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.818 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.818 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.818 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.819 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.819 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.819 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.820 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.820 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.821 187156 DEBUG nova.virt.hardware [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.825 187156 DEBUG nova.objects.instance [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'pci_devices' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:01 np0005539504 nova_compute[187152]: 2025-11-29 07:32:01.948 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:02 np0005539504 podman[241203]: 2025-11-29 07:32:02.743740209 +0000 UTC m=+0.079720235 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:32:02 np0005539504 podman[241204]: 2025-11-29 07:32:02.794686523 +0000 UTC m=+0.118060707 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 02:32:03 np0005539504 nova_compute[187152]: 2025-11-29 07:32:03.353 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:32:03 np0005539504 nova_compute[187152]: 2025-11-29 07:32:03.354 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401520.4728296, 7e03c289-84af-4001-90ae-bb6067d68199 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:32:03 np0005539504 nova_compute[187152]: 2025-11-29 07:32:03.355 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] VM Started (Lifecycle Event)#033[00m
Nov 29 02:32:03 np0005539504 nova_compute[187152]: 2025-11-29 07:32:03.358 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:32:03 np0005539504 nova_compute[187152]: 2025-11-29 07:32:03.358 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:32:03 np0005539504 nova_compute[187152]: 2025-11-29 07:32:03.358 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:32:03 np0005539504 nova_compute[187152]: 2025-11-29 07:32:03.358 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3835b666-929d-40c4-a556-3249ddef8b41 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:04 np0005539504 nova_compute[187152]: 2025-11-29 07:32:04.405 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <uuid>e3d9dd73-abec-4339-9a8e-2781397f0e22</uuid>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <name>instance-00000092</name>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerShowV247Test-server-1254377581</nova:name>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:32:01</nova:creationTime>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:        <nova:user uuid="77f8dd72b85a42b2bd8a8cd644af5147">tempest-ServerShowV247Test-2069346219-project-member</nova:user>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:        <nova:project uuid="3a49efbb1d9f4c5abd9b9816d2c63823">tempest-ServerShowV247Test-2069346219</nova:project>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <entry name="serial">e3d9dd73-abec-4339-9a8e-2781397f0e22</entry>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <entry name="uuid">e3d9dd73-abec-4339-9a8e-2781397f0e22</entry>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.config"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/console.log" append="off"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:32:04 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:32:04 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:32:04 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:32:04 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:32:04 np0005539504 nova_compute[187152]: 2025-11-29 07:32:04.609 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:04 np0005539504 nova_compute[187152]: 2025-11-29 07:32:04.618 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:32:04 np0005539504 nova_compute[187152]: 2025-11-29 07:32:04.793 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:32:05 np0005539504 nova_compute[187152]: 2025-11-29 07:32:05.097 187156 INFO nova.compute.manager [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Took 10.22 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:32:05 np0005539504 nova_compute[187152]: 2025-11-29 07:32:05.098 187156 DEBUG nova.compute.manager [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:05 np0005539504 nova_compute[187152]: 2025-11-29 07:32:05.215 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:32:05 np0005539504 nova_compute[187152]: 2025-11-29 07:32:05.216 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:32:05 np0005539504 nova_compute[187152]: 2025-11-29 07:32:05.217 187156 INFO nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Using config drive#033[00m
Nov 29 02:32:06 np0005539504 nova_compute[187152]: 2025-11-29 07:32:06.184 187156 INFO nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Creating config drive at /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.config#033[00m
Nov 29 02:32:06 np0005539504 nova_compute[187152]: 2025-11-29 07:32:06.191 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiys13usf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:06 np0005539504 nova_compute[187152]: 2025-11-29 07:32:06.317 187156 DEBUG oslo_concurrency.processutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiys13usf" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:06 np0005539504 nova_compute[187152]: 2025-11-29 07:32:06.402 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:06 np0005539504 systemd-machined[153423]: New machine qemu-75-instance-00000092.
Nov 29 02:32:06 np0005539504 systemd[1]: Started Virtual Machine qemu-75-instance-00000092.
Nov 29 02:32:06 np0005539504 nova_compute[187152]: 2025-11-29 07:32:06.949 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.244 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401527.244327, e3d9dd73-abec-4339-9a8e-2781397f0e22 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.246 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.250 187156 DEBUG nova.compute.manager [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.251 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.256 187156 INFO nova.virt.libvirt.driver [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance spawned successfully.#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.257 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.300 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updating instance_info_cache with network_info: [{"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.596 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.601 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.602 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.602 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.603 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.603 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.604 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.605 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.615 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.616 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.617 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.617 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.618 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:07 np0005539504 nova_compute[187152]: 2025-11-29 07:32:07.618 187156 DEBUG nova.virt.libvirt.driver [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:08 np0005539504 nova_compute[187152]: 2025-11-29 07:32:08.412 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:32:08 np0005539504 nova_compute[187152]: 2025-11-29 07:32:08.414 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401527.2468958, e3d9dd73-abec-4339-9a8e-2781397f0e22 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:32:08 np0005539504 nova_compute[187152]: 2025-11-29 07:32:08.414 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] VM Started (Lifecycle Event)#033[00m
Nov 29 02:32:08 np0005539504 nova_compute[187152]: 2025-11-29 07:32:08.519 187156 INFO nova.compute.manager [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Took 80.00 seconds to build instance.#033[00m
Nov 29 02:32:08 np0005539504 nova_compute[187152]: 2025-11-29 07:32:08.625 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:08 np0005539504 nova_compute[187152]: 2025-11-29 07:32:08.631 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:32:09 np0005539504 nova_compute[187152]: 2025-11-29 07:32:09.334 187156 DEBUG oslo_concurrency.lockutils [None req-09ea1e30-b9bd-412d-93ba-af2045278803 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "7e03c289-84af-4001-90ae-bb6067d68199" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 81.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:09 np0005539504 nova_compute[187152]: 2025-11-29 07:32:09.336 187156 INFO nova.compute.manager [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Took 9.93 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:32:09 np0005539504 nova_compute[187152]: 2025-11-29 07:32:09.337 187156 DEBUG nova.compute.manager [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:09 np0005539504 nova_compute[187152]: 2025-11-29 07:32:09.345 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:32:09 np0005539504 podman[241281]: 2025-11-29 07:32:09.759096303 +0000 UTC m=+0.085013929 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm)
Nov 29 02:32:09 np0005539504 nova_compute[187152]: 2025-11-29 07:32:09.778 187156 INFO nova.compute.manager [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Took 80.80 seconds to build instance.#033[00m
Nov 29 02:32:09 np0005539504 nova_compute[187152]: 2025-11-29 07:32:09.914 187156 DEBUG oslo_concurrency.lockutils [None req-5209d756-1841-413b-a833-333cd6990060 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "e3d9dd73-abec-4339-9a8e-2781397f0e22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 81.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:09.958 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:32:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:09.959 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:32:09 np0005539504 nova_compute[187152]: 2025-11-29 07:32:09.959 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:11 np0005539504 nova_compute[187152]: 2025-11-29 07:32:11.311 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:11 np0005539504 nova_compute[187152]: 2025-11-29 07:32:11.404 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:11 np0005539504 nova_compute[187152]: 2025-11-29 07:32:11.952 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:14 np0005539504 nova_compute[187152]: 2025-11-29 07:32:14.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:14 np0005539504 nova_compute[187152]: 2025-11-29 07:32:14.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:14 np0005539504 nova_compute[187152]: 2025-11-29 07:32:14.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:32:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:14.962 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:32:16 np0005539504 nova_compute[187152]: 2025-11-29 07:32:16.406 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:16 np0005539504 podman[241303]: 2025-11-29 07:32:16.77089224 +0000 UTC m=+0.104664092 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:32:16 np0005539504 nova_compute[187152]: 2025-11-29 07:32:16.965 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:17 np0005539504 nova_compute[187152]: 2025-11-29 07:32:17.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:21 np0005539504 nova_compute[187152]: 2025-11-29 07:32:21.408 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:21 np0005539504 nova_compute[187152]: 2025-11-29 07:32:21.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:21 np0005539504 nova_compute[187152]: 2025-11-29 07:32:21.967 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:22 np0005539504 nova_compute[187152]: 2025-11-29 07:32:22.539 187156 INFO nova.compute.manager [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Rebuilding instance#033[00m
Nov 29 02:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:22.978 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:22.979 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:22.981 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:23 np0005539504 nova_compute[187152]: 2025-11-29 07:32:23.137 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:23 np0005539504 nova_compute[187152]: 2025-11-29 07:32:23.138 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:23 np0005539504 nova_compute[187152]: 2025-11-29 07:32:23.138 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:23 np0005539504 nova_compute[187152]: 2025-11-29 07:32:23.139 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:32:26 np0005539504 nova_compute[187152]: 2025-11-29 07:32:26.410 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:26 np0005539504 nova_compute[187152]: 2025-11-29 07:32:26.742 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:26 np0005539504 nova_compute[187152]: 2025-11-29 07:32:26.771 187156 DEBUG nova.compute.manager [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:26 np0005539504 nova_compute[187152]: 2025-11-29 07:32:26.816 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:26 np0005539504 nova_compute[187152]: 2025-11-29 07:32:26.817 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:26 np0005539504 nova_compute[187152]: 2025-11-29 07:32:26.888 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:26 np0005539504 nova_compute[187152]: 2025-11-29 07:32:26.895 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:26 np0005539504 nova_compute[187152]: 2025-11-29 07:32:26.970 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:27 np0005539504 nova_compute[187152]: 2025-11-29 07:32:27.744 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk --force-share --output=json" returned: 0 in 0.849s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:27 np0005539504 nova_compute[187152]: 2025-11-29 07:32:27.746 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:27 np0005539504 nova_compute[187152]: 2025-11-29 07:32:27.844 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:27 np0005539504 nova_compute[187152]: 2025-11-29 07:32:27.856 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:27 np0005539504 nova_compute[187152]: 2025-11-29 07:32:27.932 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:27 np0005539504 nova_compute[187152]: 2025-11-29 07:32:27.934 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:27 np0005539504 nova_compute[187152]: 2025-11-29 07:32:27.994 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.163 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.164 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5254MB free_disk=72.99215316772461GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.165 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.165 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.437 187156 DEBUG nova.objects.instance [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'pci_requests' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.597 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 3835b666-929d-40c4-a556-3249ddef8b41 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.598 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 7e03c289-84af-4001-90ae-bb6067d68199 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.598 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance e3d9dd73-abec-4339-9a8e-2781397f0e22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.599 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.599 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.606 187156 DEBUG nova.objects.instance [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'pci_devices' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:28.659 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:f1:bd 2001:db8:0:1:f816:3eff:fe4a:f1bd 2001:db8::f816:3eff:fe4a:f1bd'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe4a:f1bd/64 2001:db8::f816:3eff:fe4a:f1bd/64', 'neutron:device_id': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c069d1db-d7e5-4641-988e-cd6e75103caa, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b0b0536c-6e35-42c5-8936-a1236a4f216e) old=Port_Binding(mac=['fa:16:3e:4a:f1:bd 2001:db8::f816:3eff:fe4a:f1bd'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4a:f1bd/64', 'neutron:device_id': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:32:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:28.661 104164 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b0b0536c-6e35-42c5-8936-a1236a4f216e in datapath 716ed53e-cc56-4286-b418-2f5e02d33124 updated#033[00m
Nov 29 02:32:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:28.663 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 716ed53e-cc56-4286-b418-2f5e02d33124, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:32:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:28.667 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b36b3978-d4ba-49fd-8e3a-34fa9cc23178]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:28 np0005539504 podman[241366]: 2025-11-29 07:32:28.709265687 +0000 UTC m=+0.056523076 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:32:28 np0005539504 podman[241367]: 2025-11-29 07:32:28.718555269 +0000 UTC m=+0.058295583 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public)
Nov 29 02:32:28 np0005539504 podman[241368]: 2025-11-29 07:32:28.724075849 +0000 UTC m=+0.053401201 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent)
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.760 187156 DEBUG nova.objects.instance [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'resources' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:28 np0005539504 nova_compute[187152]: 2025-11-29 07:32:28.888 187156 DEBUG nova.objects.instance [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'migration_context' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:29 np0005539504 nova_compute[187152]: 2025-11-29 07:32:29.007 187156 DEBUG nova.objects.instance [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:32:29 np0005539504 nova_compute[187152]: 2025-11-29 07:32:29.012 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:32:29 np0005539504 nova_compute[187152]: 2025-11-29 07:32:29.543 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:32:29 np0005539504 nova_compute[187152]: 2025-11-29 07:32:29.565 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:32:29 np0005539504 nova_compute[187152]: 2025-11-29 07:32:29.600 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:32:29 np0005539504 nova_compute[187152]: 2025-11-29 07:32:29.601 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:29 np0005539504 nova_compute[187152]: 2025-11-29 07:32:29.601 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:29 np0005539504 nova_compute[187152]: 2025-11-29 07:32:29.602 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:32:29 np0005539504 nova_compute[187152]: 2025-11-29 07:32:29.618 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:30 np0005539504 nova_compute[187152]: 2025-11-29 07:32:30.632 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.030 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.030 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.031 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.329 187156 DEBUG nova.compute.manager [req-e6a7fb89-54df-420c-ad51-bcbbc141a25f req-4a32ca7d-de7a-4e5f-8365-d93a6003a60e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received event network-changed-b95ddcc9-0165-4e0c-aa88-981010149da0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.331 187156 DEBUG nova.compute.manager [req-e6a7fb89-54df-420c-ad51-bcbbc141a25f req-4a32ca7d-de7a-4e5f-8365-d93a6003a60e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Refreshing instance network info cache due to event network-changed-b95ddcc9-0165-4e0c-aa88-981010149da0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.332 187156 DEBUG oslo_concurrency.lockutils [req-e6a7fb89-54df-420c-ad51-bcbbc141a25f req-4a32ca7d-de7a-4e5f-8365-d93a6003a60e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.332 187156 DEBUG oslo_concurrency.lockutils [req-e6a7fb89-54df-420c-ad51-bcbbc141a25f req-4a32ca7d-de7a-4e5f-8365-d93a6003a60e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.332 187156 DEBUG nova.network.neutron [req-e6a7fb89-54df-420c-ad51-bcbbc141a25f req-4a32ca7d-de7a-4e5f-8365-d93a6003a60e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Refreshing network info cache for port b95ddcc9-0165-4e0c-aa88-981010149da0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.406 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.413 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.730 187156 DEBUG oslo_concurrency.lockutils [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "3835b666-929d-40c4-a556-3249ddef8b41" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.731 187156 DEBUG oslo_concurrency.lockutils [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.732 187156 DEBUG oslo_concurrency.lockutils [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "3835b666-929d-40c4-a556-3249ddef8b41-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.733 187156 DEBUG oslo_concurrency.lockutils [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.733 187156 DEBUG oslo_concurrency.lockutils [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.817 187156 INFO nova.compute.manager [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Terminating instance#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.831 187156 DEBUG nova.compute.manager [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:32:31 np0005539504 kernel: tapb95ddcc9-01 (unregistering): left promiscuous mode
Nov 29 02:32:31 np0005539504 NetworkManager[55210]: <info>  [1764401551.8649] device (tapb95ddcc9-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:32:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:32:31Z|00575|binding|INFO|Releasing lport b95ddcc9-0165-4e0c-aa88-981010149da0 from this chassis (sb_readonly=0)
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.877 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:32:31Z|00576|binding|INFO|Setting lport b95ddcc9-0165-4e0c-aa88-981010149da0 down in Southbound
Nov 29 02:32:31 np0005539504 ovn_controller[95182]: 2025-11-29T07:32:31Z|00577|binding|INFO|Removing iface tapb95ddcc9-01 ovn-installed in OVS
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.883 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.897 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:31 np0005539504 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000008d.scope: Deactivated successfully.
Nov 29 02:32:31 np0005539504 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000008d.scope: Consumed 20.873s CPU time.
Nov 29 02:32:31 np0005539504 systemd-machined[153423]: Machine qemu-73-instance-0000008d terminated.
Nov 29 02:32:31 np0005539504 nova_compute[187152]: 2025-11-29 07:32:31.970 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:32.083 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:6a:49 10.100.0.11'], port_security=['fa:16:3e:0a:6a:49 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3835b666-929d-40c4-a556-3249ddef8b41', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b998d842-14b5-466c-99db-e8ccb7fefb1d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '459d88e1-2dc5-49aa-b8cd-08ecb466d94a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=404f48ed-a51b-481c-9673-9d093f66b931, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=b95ddcc9-0165-4e0c-aa88-981010149da0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:32:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:32.085 104164 INFO neutron.agent.ovn.metadata.agent [-] Port b95ddcc9-0165-4e0c-aa88-981010149da0 in datapath b998d842-14b5-466c-99db-e8ccb7fefb1d unbound from our chassis#033[00m
Nov 29 02:32:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:32.086 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b998d842-14b5-466c-99db-e8ccb7fefb1d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:32:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:32.088 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[89a1c029-67d3-4e09-9b01-392b8460bd80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:32 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:32.088 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d namespace which is not needed anymore#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.127 187156 INFO nova.virt.libvirt.driver [-] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Instance destroyed successfully.#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.128 187156 DEBUG nova.objects.instance [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 3835b666-929d-40c4-a556-3249ddef8b41 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.162 187156 DEBUG nova.virt.libvirt.vif [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:29:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-586434876',display_name='tempest-TestNetworkBasicOps-server-586434876',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-586434876',id=141,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGhKjzGKzbNLkKn3Qe4bhA/1onvLYvMUS8bVLCIcvhhnAzCyX0uhE0akbpZ/Bj7R/OWR0vQKuaY/lmcBYYwUBbB1+I8iLsoBy9IQ2OcenTKB8q8Qhex8xJkGRj6S++f6sg==',key_name='tempest-TestNetworkBasicOps-2072959861',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:29:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-t688yedl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:29:48Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=3835b666-929d-40c4-a556-3249ddef8b41,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.162 187156 DEBUG nova.network.os_vif_util [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.163 187156 DEBUG nova.network.os_vif_util [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:6a:49,bridge_name='br-int',has_traffic_filtering=True,id=b95ddcc9-0165-4e0c-aa88-981010149da0,network=Network(b998d842-14b5-466c-99db-e8ccb7fefb1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb95ddcc9-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.164 187156 DEBUG os_vif [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:6a:49,bridge_name='br-int',has_traffic_filtering=True,id=b95ddcc9-0165-4e0c-aa88-981010149da0,network=Network(b998d842-14b5-466c-99db-e8ccb7fefb1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb95ddcc9-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.168 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.169 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb95ddcc9-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.197 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.199 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.208 187156 INFO os_vif [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:6a:49,bridge_name='br-int',has_traffic_filtering=True,id=b95ddcc9-0165-4e0c-aa88-981010149da0,network=Network(b998d842-14b5-466c-99db-e8ccb7fefb1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb95ddcc9-01')#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.208 187156 INFO nova.virt.libvirt.driver [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Deleting instance files /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41_del#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.209 187156 INFO nova.virt.libvirt.driver [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Deletion of /var/lib/nova/instances/3835b666-929d-40c4-a556-3249ddef8b41_del complete#033[00m
Nov 29 02:32:32 np0005539504 neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d[240393]: [NOTICE]   (240397) : haproxy version is 2.8.14-c23fe91
Nov 29 02:32:32 np0005539504 neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d[240393]: [NOTICE]   (240397) : path to executable is /usr/sbin/haproxy
Nov 29 02:32:32 np0005539504 neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d[240393]: [WARNING]  (240397) : Exiting Master process...
Nov 29 02:32:32 np0005539504 neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d[240393]: [WARNING]  (240397) : Exiting Master process...
Nov 29 02:32:32 np0005539504 neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d[240393]: [ALERT]    (240397) : Current worker (240399) exited with code 143 (Terminated)
Nov 29 02:32:32 np0005539504 neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d[240393]: [WARNING]  (240397) : All workers exited. Exiting... (0)
Nov 29 02:32:32 np0005539504 systemd[1]: libpod-7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3.scope: Deactivated successfully.
Nov 29 02:32:32 np0005539504 podman[241474]: 2025-11-29 07:32:32.289188221 +0000 UTC m=+0.055314253 container died 7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:32:32 np0005539504 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000092.scope: Deactivated successfully.
Nov 29 02:32:32 np0005539504 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000092.scope: Consumed 14.362s CPU time.
Nov 29 02:32:32 np0005539504 systemd-machined[153423]: Machine qemu-75-instance-00000092 terminated.
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.765 187156 DEBUG nova.compute.manager [req-b854c0b8-ba8a-4f7a-8eae-7bd610134062 req-e956fa3f-e3b8-4562-9d29-2eebcafbfaa1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received event network-vif-unplugged-b95ddcc9-0165-4e0c-aa88-981010149da0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.766 187156 DEBUG oslo_concurrency.lockutils [req-b854c0b8-ba8a-4f7a-8eae-7bd610134062 req-e956fa3f-e3b8-4562-9d29-2eebcafbfaa1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3835b666-929d-40c4-a556-3249ddef8b41-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.766 187156 DEBUG oslo_concurrency.lockutils [req-b854c0b8-ba8a-4f7a-8eae-7bd610134062 req-e956fa3f-e3b8-4562-9d29-2eebcafbfaa1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.766 187156 DEBUG oslo_concurrency.lockutils [req-b854c0b8-ba8a-4f7a-8eae-7bd610134062 req-e956fa3f-e3b8-4562-9d29-2eebcafbfaa1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.767 187156 DEBUG nova.compute.manager [req-b854c0b8-ba8a-4f7a-8eae-7bd610134062 req-e956fa3f-e3b8-4562-9d29-2eebcafbfaa1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] No waiting events found dispatching network-vif-unplugged-b95ddcc9-0165-4e0c-aa88-981010149da0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.767 187156 DEBUG nova.compute.manager [req-b854c0b8-ba8a-4f7a-8eae-7bd610134062 req-e956fa3f-e3b8-4562-9d29-2eebcafbfaa1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received event network-vif-unplugged-b95ddcc9-0165-4e0c-aa88-981010149da0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.773 187156 INFO nova.compute.manager [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Took 0.94 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.773 187156 DEBUG oslo.service.loopingcall [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.774 187156 DEBUG nova.compute.manager [-] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:32:32 np0005539504 nova_compute[187152]: 2025-11-29 07:32:32.774 187156 DEBUG nova.network.neutron [-] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:32:32 np0005539504 systemd[1]: var-lib-containers-storage-overlay-ed0a4709eb9caeb0a1ff342196593f7903c6c860abd842c4ed59ffa5aa21cd52-merged.mount: Deactivated successfully.
Nov 29 02:32:32 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3-userdata-shm.mount: Deactivated successfully.
Nov 29 02:32:32 np0005539504 podman[241474]: 2025-11-29 07:32:32.815272907 +0000 UTC m=+0.581398939 container cleanup 7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:32:32 np0005539504 systemd[1]: libpod-conmon-7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3.scope: Deactivated successfully.
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.046 187156 INFO nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance shutdown successfully after 4 seconds.#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.053 187156 INFO nova.virt.libvirt.driver [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance destroyed successfully.#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.057 187156 INFO nova.virt.libvirt.driver [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance destroyed successfully.#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.057 187156 INFO nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Deleting instance files /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22_del#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.058 187156 INFO nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Deletion of /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22_del complete#033[00m
Nov 29 02:32:33 np0005539504 podman[241512]: 2025-11-29 07:32:33.107837661 +0000 UTC m=+0.278128273 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 02:32:33 np0005539504 podman[241510]: 2025-11-29 07:32:33.135800141 +0000 UTC m=+0.312768124 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:32:33 np0005539504 podman[241519]: 2025-11-29 07:32:33.174593464 +0000 UTC m=+0.334830093 container remove 7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:32:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:33.181 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[76dc6208-0535-4355-a3ca-37ebe523e360]: (4, ('Sat Nov 29 07:32:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d (7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3)\n7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3\nSat Nov 29 07:32:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d (7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3)\n7ff48f0364ad85fda318aed62a156d99e29156b9e2e75c275fddc856c6ef50d3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:33.183 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b00ccbeb-1a86-4ec1-96a7-261f8b79a85a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:33.184 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb998d842-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.186 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:33 np0005539504 kernel: tapb998d842-10: left promiscuous mode
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.200 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:33.206 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[46c62796-b45a-4d97-9110-9757645e7095]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:33.225 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a2d67c-aa53-4484-ae0d-b8bc32acc2f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:33.226 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[56e1d748-9d0c-404e-bf98-2c39972b7edd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:33.242 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7da33e29-44de-41ea-a967-eb40023bc8a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685045, 'reachable_time': 30434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241573, 'error': None, 'target': 'ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:33 np0005539504 systemd[1]: run-netns-ovnmeta\x2db998d842\x2d14b5\x2d466c\x2d99db\x2de8ccb7fefb1d.mount: Deactivated successfully.
Nov 29 02:32:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:33.247 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b998d842-14b5-466c-99db-e8ccb7fefb1d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:32:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:32:33.247 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[7db96392-15c8-471a-a3a3-fb407343fe05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.956 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.956 187156 INFO nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Creating image(s)#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.957 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.957 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.958 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:33 np0005539504 nova_compute[187152]: 2025-11-29 07:32:33.974 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.036 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.037 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.038 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.051 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.116 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.117 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.364 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk 1073741824" returned: 0 in 0.247s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.365 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.365 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.440 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.442 187156 DEBUG nova.virt.disk.api [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Checking if we can resize image /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.443 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.513 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.514 187156 DEBUG nova.virt.disk.api [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Cannot resize image /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.515 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.515 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Ensure instance console log exists: /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.516 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.516 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.516 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.517 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.523 187156 WARNING nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.529 187156 DEBUG nova.virt.libvirt.host [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.530 187156 DEBUG nova.virt.libvirt.host [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.534 187156 DEBUG nova.virt.libvirt.host [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.534 187156 DEBUG nova.virt.libvirt.host [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.536 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.536 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.536 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.537 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.537 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.537 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.537 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.537 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.537 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.537 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.538 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.538 187156 DEBUG nova.virt.hardware [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.538 187156 DEBUG nova.objects.instance [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'vcpu_model' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.606 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <uuid>e3d9dd73-abec-4339-9a8e-2781397f0e22</uuid>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <name>instance-00000092</name>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <nova:name>tempest-ServerShowV247Test-server-1254377581</nova:name>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:32:34</nova:creationTime>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:        <nova:user uuid="77f8dd72b85a42b2bd8a8cd644af5147">tempest-ServerShowV247Test-2069346219-project-member</nova:user>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:        <nova:project uuid="3a49efbb1d9f4c5abd9b9816d2c63823">tempest-ServerShowV247Test-2069346219</nova:project>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <nova:ports/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <entry name="serial">e3d9dd73-abec-4339-9a8e-2781397f0e22</entry>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <entry name="uuid">e3d9dd73-abec-4339-9a8e-2781397f0e22</entry>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.config"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/console.log" append="off"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:32:34 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:32:34 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:32:34 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:32:34 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.659 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.660 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.660 187156 INFO nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Using config drive#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.858 187156 DEBUG nova.objects.instance [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'ec2_ids' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.874 187156 DEBUG nova.network.neutron [req-e6a7fb89-54df-420c-ad51-bcbbc141a25f req-4a32ca7d-de7a-4e5f-8365-d93a6003a60e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updated VIF entry in instance network info cache for port b95ddcc9-0165-4e0c-aa88-981010149da0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:32:34 np0005539504 nova_compute[187152]: 2025-11-29 07:32:34.874 187156 DEBUG nova.network.neutron [req-e6a7fb89-54df-420c-ad51-bcbbc141a25f req-4a32ca7d-de7a-4e5f-8365-d93a6003a60e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updating instance_info_cache with network_info: [{"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.399 187156 DEBUG nova.compute.manager [req-97cb5117-0ac3-4173-82f6-f1380422d5dc req-5e00dca4-5a9f-4d95-b72c-08ad3c470d0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received event network-vif-plugged-b95ddcc9-0165-4e0c-aa88-981010149da0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.400 187156 DEBUG oslo_concurrency.lockutils [req-97cb5117-0ac3-4173-82f6-f1380422d5dc req-5e00dca4-5a9f-4d95-b72c-08ad3c470d0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3835b666-929d-40c4-a556-3249ddef8b41-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.400 187156 DEBUG oslo_concurrency.lockutils [req-97cb5117-0ac3-4173-82f6-f1380422d5dc req-5e00dca4-5a9f-4d95-b72c-08ad3c470d0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.400 187156 DEBUG oslo_concurrency.lockutils [req-97cb5117-0ac3-4173-82f6-f1380422d5dc req-5e00dca4-5a9f-4d95-b72c-08ad3c470d0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.401 187156 DEBUG nova.compute.manager [req-97cb5117-0ac3-4173-82f6-f1380422d5dc req-5e00dca4-5a9f-4d95-b72c-08ad3c470d0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] No waiting events found dispatching network-vif-plugged-b95ddcc9-0165-4e0c-aa88-981010149da0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.401 187156 WARNING nova.compute.manager [req-97cb5117-0ac3-4173-82f6-f1380422d5dc req-5e00dca4-5a9f-4d95-b72c-08ad3c470d0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received unexpected event network-vif-plugged-b95ddcc9-0165-4e0c-aa88-981010149da0 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.446 187156 DEBUG nova.objects.instance [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'keypairs' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.448 187156 DEBUG oslo_concurrency.lockutils [req-e6a7fb89-54df-420c-ad51-bcbbc141a25f req-4a32ca7d-de7a-4e5f-8365-d93a6003a60e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.449 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.449 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.449 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3835b666-929d-40c4-a556-3249ddef8b41 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.846 187156 INFO nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Creating config drive at /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.config#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.852 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvwdfm5r0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:35 np0005539504 nova_compute[187152]: 2025-11-29 07:32:35.978 187156 DEBUG oslo_concurrency.processutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvwdfm5r0" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:36 np0005539504 systemd-machined[153423]: New machine qemu-76-instance-00000092.
Nov 29 02:32:36 np0005539504 systemd[1]: Started Virtual Machine qemu-76-instance-00000092.
Nov 29 02:32:36 np0005539504 nova_compute[187152]: 2025-11-29 07:32:36.974 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.244 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.328 187156 DEBUG nova.network.neutron [-] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.365 187156 INFO nova.compute.manager [-] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Took 4.59 seconds to deallocate network for instance.#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.466 187156 DEBUG oslo_concurrency.lockutils [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.467 187156 DEBUG oslo_concurrency.lockutils [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.495 187156 DEBUG nova.compute.manager [req-e61d754e-b9ce-4486-bb03-8c24951d554d req-ef74da24-08b7-4672-baf2-2291c26f03a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Received event network-vif-deleted-b95ddcc9-0165-4e0c-aa88-981010149da0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.571 187156 DEBUG nova.compute.provider_tree [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.582 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for e3d9dd73-abec-4339-9a8e-2781397f0e22 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.584 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401557.5819461, e3d9dd73-abec-4339-9a8e-2781397f0e22 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.584 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.587 187156 DEBUG nova.compute.manager [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.588 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.589 187156 DEBUG nova.scheduler.client.report [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.599 187156 INFO nova.virt.libvirt.driver [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance spawned successfully.#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.600 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.615 187156 DEBUG oslo_concurrency.lockutils [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.619 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.626 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.632 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.632 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.633 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.633 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.633 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.634 187156 DEBUG nova.virt.libvirt.driver [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.645 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.646 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401557.583668, e3d9dd73-abec-4339-9a8e-2781397f0e22 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.646 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] VM Started (Lifecycle Event)#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.649 187156 INFO nova.scheduler.client.report [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 3835b666-929d-40c4-a556-3249ddef8b41#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.695 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:37 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.715 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.746 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.749 187156 DEBUG nova.compute.manager [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.780 187156 DEBUG oslo_concurrency.lockutils [None req-1afef1bb-37a6-42b8-9ee2-1ea14cb92829 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "3835b666-929d-40c4-a556-3249ddef8b41" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.856 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.856 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.857 187156 DEBUG nova.objects.instance [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:32:37 np0005539504 nova_compute[187152]: 2025-11-29 07:32:37.940 187156 DEBUG oslo_concurrency.lockutils [None req-da429e63-e19f-43a6-b036-7c3f4b548cb5 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:38 np0005539504 nova_compute[187152]: 2025-11-29 07:32:38.903 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updating instance_info_cache with network_info: [{"id": "b95ddcc9-0165-4e0c-aa88-981010149da0", "address": "fa:16:3e:0a:6a:49", "network": {"id": "b998d842-14b5-466c-99db-e8ccb7fefb1d", "bridge": "br-int", "label": "tempest-network-smoke--686425119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb95ddcc9-01", "ovs_interfaceid": "b95ddcc9-0165-4e0c-aa88-981010149da0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:38 np0005539504 nova_compute[187152]: 2025-11-29 07:32:38.925 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-3835b666-929d-40c4-a556-3249ddef8b41" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:32:38 np0005539504 nova_compute[187152]: 2025-11-29 07:32:38.926 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:32:38 np0005539504 nova_compute[187152]: 2025-11-29 07:32:38.927 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:38 np0005539504 nova_compute[187152]: 2025-11-29 07:32:38.927 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:39 np0005539504 nova_compute[187152]: 2025-11-29 07:32:39.985 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "e3d9dd73-abec-4339-9a8e-2781397f0e22" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:39 np0005539504 nova_compute[187152]: 2025-11-29 07:32:39.986 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "e3d9dd73-abec-4339-9a8e-2781397f0e22" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:39 np0005539504 nova_compute[187152]: 2025-11-29 07:32:39.986 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "e3d9dd73-abec-4339-9a8e-2781397f0e22-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:39 np0005539504 nova_compute[187152]: 2025-11-29 07:32:39.986 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "e3d9dd73-abec-4339-9a8e-2781397f0e22-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:39 np0005539504 nova_compute[187152]: 2025-11-29 07:32:39.987 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "e3d9dd73-abec-4339-9a8e-2781397f0e22-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:39 np0005539504 nova_compute[187152]: 2025-11-29 07:32:39.997 187156 INFO nova.compute.manager [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Terminating instance#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.008 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "refresh_cache-e3d9dd73-abec-4339-9a8e-2781397f0e22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.009 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquired lock "refresh_cache-e3d9dd73-abec-4339-9a8e-2781397f0e22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.009 187156 DEBUG nova.network.neutron [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.219 187156 DEBUG nova.network.neutron [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.532 187156 DEBUG nova.network.neutron [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.548 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Releasing lock "refresh_cache-e3d9dd73-abec-4339-9a8e-2781397f0e22" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.549 187156 DEBUG nova.compute.manager [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:32:40 np0005539504 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000092.scope: Deactivated successfully.
Nov 29 02:32:40 np0005539504 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000092.scope: Consumed 4.501s CPU time.
Nov 29 02:32:40 np0005539504 systemd-machined[153423]: Machine qemu-76-instance-00000092 terminated.
Nov 29 02:32:40 np0005539504 podman[241618]: 2025-11-29 07:32:40.681375982 +0000 UTC m=+0.090063736 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.795 187156 INFO nova.virt.libvirt.driver [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance destroyed successfully.#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.796 187156 DEBUG nova.objects.instance [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'resources' on Instance uuid e3d9dd73-abec-4339-9a8e-2781397f0e22 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.810 187156 INFO nova.virt.libvirt.driver [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Deleting instance files /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22_del#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.810 187156 INFO nova.virt.libvirt.driver [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Deletion of /var/lib/nova/instances/e3d9dd73-abec-4339-9a8e-2781397f0e22_del complete#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.888 187156 INFO nova.compute.manager [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.889 187156 DEBUG oslo.service.loopingcall [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.889 187156 DEBUG nova.compute.manager [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:32:40 np0005539504 nova_compute[187152]: 2025-11-29 07:32:40.889 187156 DEBUG nova.network.neutron [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:32:41 np0005539504 nova_compute[187152]: 2025-11-29 07:32:41.839 187156 DEBUG nova.network.neutron [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:32:41 np0005539504 nova_compute[187152]: 2025-11-29 07:32:41.859 187156 DEBUG nova.network.neutron [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:41 np0005539504 nova_compute[187152]: 2025-11-29 07:32:41.878 187156 INFO nova.compute.manager [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Took 0.99 seconds to deallocate network for instance.#033[00m
Nov 29 02:32:41 np0005539504 nova_compute[187152]: 2025-11-29 07:32:41.976 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:41 np0005539504 nova_compute[187152]: 2025-11-29 07:32:41.980 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:41 np0005539504 nova_compute[187152]: 2025-11-29 07:32:41.980 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:42 np0005539504 nova_compute[187152]: 2025-11-29 07:32:42.080 187156 DEBUG nova.compute.provider_tree [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:32:42 np0005539504 nova_compute[187152]: 2025-11-29 07:32:42.246 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:42 np0005539504 nova_compute[187152]: 2025-11-29 07:32:42.338 187156 DEBUG nova.scheduler.client.report [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:32:42 np0005539504 nova_compute[187152]: 2025-11-29 07:32:42.380 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:42 np0005539504 nova_compute[187152]: 2025-11-29 07:32:42.413 187156 INFO nova.scheduler.client.report [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Deleted allocations for instance e3d9dd73-abec-4339-9a8e-2781397f0e22#033[00m
Nov 29 02:32:42 np0005539504 nova_compute[187152]: 2025-11-29 07:32:42.494 187156 DEBUG oslo_concurrency.lockutils [None req-c6571c80-ca4d-46ff-867f-73be0374fd57 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "e3d9dd73-abec-4339-9a8e-2781397f0e22" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.183 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "7e03c289-84af-4001-90ae-bb6067d68199" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.184 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "7e03c289-84af-4001-90ae-bb6067d68199" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.184 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "7e03c289-84af-4001-90ae-bb6067d68199-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.185 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "7e03c289-84af-4001-90ae-bb6067d68199-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.185 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "7e03c289-84af-4001-90ae-bb6067d68199-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.198 187156 INFO nova.compute.manager [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Terminating instance#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.213 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "refresh_cache-7e03c289-84af-4001-90ae-bb6067d68199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.214 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquired lock "refresh_cache-7e03c289-84af-4001-90ae-bb6067d68199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.214 187156 DEBUG nova.network.neutron [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:32:43 np0005539504 nova_compute[187152]: 2025-11-29 07:32:43.891 187156 DEBUG nova.network.neutron [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:32:46 np0005539504 nova_compute[187152]: 2025-11-29 07:32:46.978 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.076 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.124 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401552.1228268, 3835b666-929d-40c4-a556-3249ddef8b41 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.125 187156 INFO nova.compute.manager [-] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.187 187156 DEBUG nova.network.neutron [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.249 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.292 187156 DEBUG nova.compute.manager [None req-45dd3a4f-4f66-4154-844b-78590f638dbc - - - - - -] [instance: 3835b666-929d-40c4-a556-3249ddef8b41] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.293 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Releasing lock "refresh_cache-7e03c289-84af-4001-90ae-bb6067d68199" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.293 187156 DEBUG nova.compute.manager [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.350 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:47 np0005539504 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000091.scope: Deactivated successfully.
Nov 29 02:32:47 np0005539504 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000091.scope: Consumed 14.846s CPU time.
Nov 29 02:32:47 np0005539504 systemd-machined[153423]: Machine qemu-74-instance-00000091 terminated.
Nov 29 02:32:47 np0005539504 podman[241654]: 2025-11-29 07:32:47.408686284 +0000 UTC m=+0.055021575 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.561 187156 INFO nova.virt.libvirt.driver [-] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Instance destroyed successfully.#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.562 187156 DEBUG nova.objects.instance [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lazy-loading 'resources' on Instance uuid 7e03c289-84af-4001-90ae-bb6067d68199 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.895 187156 INFO nova.virt.libvirt.driver [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Deleting instance files /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199_del#033[00m
Nov 29 02:32:47 np0005539504 nova_compute[187152]: 2025-11-29 07:32:47.896 187156 INFO nova.virt.libvirt.driver [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Deletion of /var/lib/nova/instances/7e03c289-84af-4001-90ae-bb6067d68199_del complete#033[00m
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.980 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7e03c289-84af-4001-90ae-bb6067d68199', 'name': 'tempest-ServerShowV247Test-server-888677430', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000091', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '3a49efbb1d9f4c5abd9b9816d2c63823', 'user_id': '77f8dd72b85a42b2bd8a8cd644af5147', 'hostId': '7ec8b218a4eba998d2b64474ebf3f4dd0670fd62fb2a4495cfcd47bd', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.982 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.983 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.984 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.984 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.985 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.985 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.985 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.986 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.986 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.987 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.987 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.988 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.988 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-888677430>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-888677430>]
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.989 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.989 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.990 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.991 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.991 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.992 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.993 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.993 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.994 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.994 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.994 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.994 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-888677430>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-888677430>]
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.995 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.995 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.997 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.997 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.997 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.997 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-888677430>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-888677430>]
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.997 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.998 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.998 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.999 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:47.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:32:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:48.000 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:48.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:32:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:48.000 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:32:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:48.000 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerShowV247Test-server-888677430>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerShowV247Test-server-888677430>]
Nov 29 02:32:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:48.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:32:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:48.001 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:48.001 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:32:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:32:48.002 12 DEBUG ceilometer.compute.pollsters [-] Instance 7e03c289-84af-4001-90ae-bb6067d68199 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000091, id=7e03c289-84af-4001-90ae-bb6067d68199>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:32:48 np0005539504 nova_compute[187152]: 2025-11-29 07:32:48.424 187156 INFO nova.compute.manager [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Took 1.13 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:32:48 np0005539504 nova_compute[187152]: 2025-11-29 07:32:48.424 187156 DEBUG oslo.service.loopingcall [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:32:48 np0005539504 nova_compute[187152]: 2025-11-29 07:32:48.425 187156 DEBUG nova.compute.manager [-] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:32:48 np0005539504 nova_compute[187152]: 2025-11-29 07:32:48.425 187156 DEBUG nova.network.neutron [-] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:32:48 np0005539504 nova_compute[187152]: 2025-11-29 07:32:48.619 187156 DEBUG nova.network.neutron [-] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:32:48 np0005539504 nova_compute[187152]: 2025-11-29 07:32:48.751 187156 DEBUG nova.network.neutron [-] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:48 np0005539504 nova_compute[187152]: 2025-11-29 07:32:48.850 187156 INFO nova.compute.manager [-] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Took 0.42 seconds to deallocate network for instance.#033[00m
Nov 29 02:32:51 np0005539504 nova_compute[187152]: 2025-11-29 07:32:51.980 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:52 np0005539504 nova_compute[187152]: 2025-11-29 07:32:52.252 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:55 np0005539504 nova_compute[187152]: 2025-11-29 07:32:55.794 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401560.792956, e3d9dd73-abec-4339-9a8e-2781397f0e22 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:32:55 np0005539504 nova_compute[187152]: 2025-11-29 07:32:55.794 187156 INFO nova.compute.manager [-] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:32:55 np0005539504 nova_compute[187152]: 2025-11-29 07:32:55.969 187156 DEBUG nova.compute.manager [None req-df8ed162-ec7e-4af1-9a87-c88627b8f1c5 - - - - - -] [instance: e3d9dd73-abec-4339-9a8e-2781397f0e22] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:56 np0005539504 nova_compute[187152]: 2025-11-29 07:32:56.051 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:56 np0005539504 nova_compute[187152]: 2025-11-29 07:32:56.052 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:56 np0005539504 nova_compute[187152]: 2025-11-29 07:32:56.137 187156 DEBUG nova.compute.provider_tree [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:32:56 np0005539504 nova_compute[187152]: 2025-11-29 07:32:56.158 187156 DEBUG nova.scheduler.client.report [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:32:56 np0005539504 nova_compute[187152]: 2025-11-29 07:32:56.981 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:57 np0005539504 nova_compute[187152]: 2025-11-29 07:32:57.025 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.973s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:57 np0005539504 nova_compute[187152]: 2025-11-29 07:32:57.057 187156 INFO nova.scheduler.client.report [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Deleted allocations for instance 7e03c289-84af-4001-90ae-bb6067d68199#033[00m
Nov 29 02:32:57 np0005539504 nova_compute[187152]: 2025-11-29 07:32:57.164 187156 DEBUG oslo_concurrency.lockutils [None req-6945ea08-9ea1-4ab1-89a2-f9622360fc58 77f8dd72b85a42b2bd8a8cd644af5147 3a49efbb1d9f4c5abd9b9816d2c63823 - - default default] Lock "7e03c289-84af-4001-90ae-bb6067d68199" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 13.980s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:57 np0005539504 nova_compute[187152]: 2025-11-29 07:32:57.293 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:58 np0005539504 nova_compute[187152]: 2025-11-29 07:32:58.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:58 np0005539504 nova_compute[187152]: 2025-11-29 07:32:58.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:32:59 np0005539504 nova_compute[187152]: 2025-11-29 07:32:59.292 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:32:59 np0005539504 podman[241686]: 2025-11-29 07:32:59.738500784 +0000 UTC m=+0.073623469 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:32:59 np0005539504 podman[241687]: 2025-11-29 07:32:59.748685811 +0000 UTC m=+0.084601348 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Nov 29 02:32:59 np0005539504 podman[241688]: 2025-11-29 07:32:59.76928131 +0000 UTC m=+0.104189670 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:33:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:00.010 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:33:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:00.012 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:33:00 np0005539504 nova_compute[187152]: 2025-11-29 07:33:00.012 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:01 np0005539504 nova_compute[187152]: 2025-11-29 07:33:01.983 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:02 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:02.014 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:02 np0005539504 nova_compute[187152]: 2025-11-29 07:33:02.294 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:02 np0005539504 nova_compute[187152]: 2025-11-29 07:33:02.559 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401567.5576708, 7e03c289-84af-4001-90ae-bb6067d68199 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:33:02 np0005539504 nova_compute[187152]: 2025-11-29 07:33:02.560 187156 INFO nova.compute.manager [-] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:33:02 np0005539504 nova_compute[187152]: 2025-11-29 07:33:02.591 187156 DEBUG nova.compute.manager [None req-ed757e6e-1983-4b1c-a2f6-5711b810b818 - - - - - -] [instance: 7e03c289-84af-4001-90ae-bb6067d68199] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:33:03 np0005539504 podman[241746]: 2025-11-29 07:33:03.721810692 +0000 UTC m=+0.061393328 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:33:03 np0005539504 podman[241747]: 2025-11-29 07:33:03.761748277 +0000 UTC m=+0.093211753 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Nov 29 02:33:04 np0005539504 nova_compute[187152]: 2025-11-29 07:33:04.292 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:06 np0005539504 nova_compute[187152]: 2025-11-29 07:33:06.985 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:07 np0005539504 nova_compute[187152]: 2025-11-29 07:33:07.336 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:07 np0005539504 nova_compute[187152]: 2025-11-29 07:33:07.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:11 np0005539504 podman[241800]: 2025-11-29 07:33:11.72099057 +0000 UTC m=+0.058081798 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:33:11 np0005539504 nova_compute[187152]: 2025-11-29 07:33:11.988 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539504 nova_compute[187152]: 2025-11-29 07:33:12.338 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:13 np0005539504 nova_compute[187152]: 2025-11-29 07:33:13.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:15 np0005539504 nova_compute[187152]: 2025-11-29 07:33:15.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:15 np0005539504 nova_compute[187152]: 2025-11-29 07:33:15.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:33:16 np0005539504 nova_compute[187152]: 2025-11-29 07:33:16.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:17 np0005539504 nova_compute[187152]: 2025-11-29 07:33:17.341 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:17 np0005539504 podman[241820]: 2025-11-29 07:33:17.720993392 +0000 UTC m=+0.062922229 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:33:19 np0005539504 nova_compute[187152]: 2025-11-29 07:33:19.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:21 np0005539504 systemd[1]: Starting dnf makecache...
Nov 29 02:33:21 np0005539504 dnf[241841]: Metadata cache refreshed recently.
Nov 29 02:33:21 np0005539504 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 02:33:21 np0005539504 systemd[1]: Finished dnf makecache.
Nov 29 02:33:22 np0005539504 nova_compute[187152]: 2025-11-29 07:33:22.343 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:33:22 np0005539504 nova_compute[187152]: 2025-11-29 07:33:22.345 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:22 np0005539504 nova_compute[187152]: 2025-11-29 07:33:22.345 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 29 02:33:22 np0005539504 nova_compute[187152]: 2025-11-29 07:33:22.345 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:33:22 np0005539504 nova_compute[187152]: 2025-11-29 07:33:22.346 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:33:22 np0005539504 nova_compute[187152]: 2025-11-29 07:33:22.346 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:22 np0005539504 nova_compute[187152]: 2025-11-29 07:33:22.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:22.978 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:22.978 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:22.978 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:23 np0005539504 nova_compute[187152]: 2025-11-29 07:33:23.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:23 np0005539504 nova_compute[187152]: 2025-11-29 07:33:23.964 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:23 np0005539504 nova_compute[187152]: 2025-11-29 07:33:23.964 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:23 np0005539504 nova_compute[187152]: 2025-11-29 07:33:23.965 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:23 np0005539504 nova_compute[187152]: 2025-11-29 07:33:23.965 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.193 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.194 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5698MB free_disk=73.07469177246094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.195 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.195 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.274 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.274 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.296 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.309 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.380 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:33:24 np0005539504 nova_compute[187152]: 2025-11-29 07:33:24.381 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:25 np0005539504 nova_compute[187152]: 2025-11-29 07:33:25.380 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:25 np0005539504 nova_compute[187152]: 2025-11-29 07:33:25.381 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:33:25 np0005539504 nova_compute[187152]: 2025-11-29 07:33:25.436 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:33:26 np0005539504 nova_compute[187152]: 2025-11-29 07:33:26.993 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.351 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.360 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.361 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.390 187156 DEBUG nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.626 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.626 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.632 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.632 187156 INFO nova.compute.claims [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.758 187156 DEBUG nova.compute.provider_tree [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.770 187156 DEBUG nova.scheduler.client.report [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.791 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.792 187156 DEBUG nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.863 187156 DEBUG nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.863 187156 DEBUG nova.network.neutron [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.883 187156 INFO nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:33:27 np0005539504 nova_compute[187152]: 2025-11-29 07:33:27.905 187156 DEBUG nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.051 187156 DEBUG nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.052 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.053 187156 INFO nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Creating image(s)#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.053 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.053 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.054 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.067 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.098 187156 DEBUG nova.policy [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.171 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.172 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.173 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.186 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.257 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.258 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.294 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.295 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.296 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.372 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.373 187156 DEBUG nova.virt.disk.api [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.374 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.435 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.437 187156 DEBUG nova.virt.disk.api [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.438 187156 DEBUG nova.objects.instance [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.457 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.458 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Ensure instance console log exists: /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.459 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.460 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.460 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:28 np0005539504 nova_compute[187152]: 2025-11-29 07:33:28.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:30 np0005539504 podman[241860]: 2025-11-29 07:33:30.713349812 +0000 UTC m=+0.052250820 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:33:30 np0005539504 podman[241858]: 2025-11-29 07:33:30.716911198 +0000 UTC m=+0.061616304 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:33:30 np0005539504 podman[241859]: 2025-11-29 07:33:30.725131311 +0000 UTC m=+0.063304599 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, architecture=x86_64, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Nov 29 02:33:31 np0005539504 nova_compute[187152]: 2025-11-29 07:33:31.126 187156 DEBUG nova.network.neutron [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Successfully created port: c31397b6-7050-41a8-be27-e6dce253a00f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:33:31 np0005539504 nova_compute[187152]: 2025-11-29 07:33:31.937 187156 DEBUG nova.network.neutron [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Successfully updated port: c31397b6-7050-41a8-be27-e6dce253a00f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:33:31 np0005539504 nova_compute[187152]: 2025-11-29 07:33:31.960 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:33:31 np0005539504 nova_compute[187152]: 2025-11-29 07:33:31.960 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:33:31 np0005539504 nova_compute[187152]: 2025-11-29 07:33:31.960 187156 DEBUG nova.network.neutron [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:33:31 np0005539504 nova_compute[187152]: 2025-11-29 07:33:31.994 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:32 np0005539504 nova_compute[187152]: 2025-11-29 07:33:32.057 187156 DEBUG nova.compute.manager [req-36eb3591-5b15-4211-a962-17148268a38d req-7a5eb0af-4dae-4562-99b1-290f4fe6646d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-changed-c31397b6-7050-41a8-be27-e6dce253a00f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:33:32 np0005539504 nova_compute[187152]: 2025-11-29 07:33:32.057 187156 DEBUG nova.compute.manager [req-36eb3591-5b15-4211-a962-17148268a38d req-7a5eb0af-4dae-4562-99b1-290f4fe6646d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Refreshing instance network info cache due to event network-changed-c31397b6-7050-41a8-be27-e6dce253a00f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:33:32 np0005539504 nova_compute[187152]: 2025-11-29 07:33:32.058 187156 DEBUG oslo_concurrency.lockutils [req-36eb3591-5b15-4211-a962-17148268a38d req-7a5eb0af-4dae-4562-99b1-290f4fe6646d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:33:32 np0005539504 nova_compute[187152]: 2025-11-29 07:33:32.223 187156 DEBUG nova.network.neutron [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:33:32 np0005539504 nova_compute[187152]: 2025-11-29 07:33:32.354 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:34 np0005539504 podman[241918]: 2025-11-29 07:33:34.752443965 +0000 UTC m=+0.081719831 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:33:34 np0005539504 podman[241919]: 2025-11-29 07:33:34.773502377 +0000 UTC m=+0.097828228 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.689 187156 DEBUG nova.network.neutron [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.709 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.710 187156 DEBUG nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Instance network_info: |[{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.710 187156 DEBUG oslo_concurrency.lockutils [req-36eb3591-5b15-4211-a962-17148268a38d req-7a5eb0af-4dae-4562-99b1-290f4fe6646d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.711 187156 DEBUG nova.network.neutron [req-36eb3591-5b15-4211-a962-17148268a38d req-7a5eb0af-4dae-4562-99b1-290f4fe6646d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Refreshing network info cache for port c31397b6-7050-41a8-be27-e6dce253a00f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.714 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Start _get_guest_xml network_info=[{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.720 187156 WARNING nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.725 187156 DEBUG nova.virt.libvirt.host [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.726 187156 DEBUG nova.virt.libvirt.host [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.734 187156 DEBUG nova.virt.libvirt.host [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.735 187156 DEBUG nova.virt.libvirt.host [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.737 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.737 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.738 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.738 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.738 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.739 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.739 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.739 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.740 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.740 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.740 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.741 187156 DEBUG nova.virt.hardware [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.745 187156 DEBUG nova.virt.libvirt.vif [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1269463687',display_name='tempest-TestNetworkBasicOps-server-1269463687',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1269463687',id=148,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOb15Ob4ul8MpodhGZarbCweKGgvYK2lstTZD8GYJJKTsyQpwGF/vTqiZC3chWrEJoPIe/KWCY11saH4Ylt12BrIZXSM2HMbp8f9mTimcCH5bVdp5+9Dw9WAoetWuMTBEg==',key_name='tempest-TestNetworkBasicOps-1618817236',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-989w739i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:33:27Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=949386c1-7dd6-4ddb-89b1-4762db3984dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.745 187156 DEBUG nova.network.os_vif_util [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.746 187156 DEBUG nova.network.os_vif_util [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:7b:df,bridge_name='br-int',has_traffic_filtering=True,id=c31397b6-7050-41a8-be27-e6dce253a00f,network=Network(fb20b368-92db-4c1f-add7-febda0cc12c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc31397b6-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.747 187156 DEBUG nova.objects.instance [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.761 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <uuid>949386c1-7dd6-4ddb-89b1-4762db3984dd</uuid>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <name>instance-00000094</name>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkBasicOps-server-1269463687</nova:name>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:33:35</nova:creationTime>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:        <nova:port uuid="c31397b6-7050-41a8-be27-e6dce253a00f">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <entry name="serial">949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <entry name="uuid">949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.config"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:37:7b:df"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <target dev="tapc31397b6-70"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log" append="off"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:33:35 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:33:35 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:33:35 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:33:35 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.763 187156 DEBUG nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Preparing to wait for external event network-vif-plugged-c31397b6-7050-41a8-be27-e6dce253a00f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.763 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.763 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.764 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.764 187156 DEBUG nova.virt.libvirt.vif [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1269463687',display_name='tempest-TestNetworkBasicOps-server-1269463687',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1269463687',id=148,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOb15Ob4ul8MpodhGZarbCweKGgvYK2lstTZD8GYJJKTsyQpwGF/vTqiZC3chWrEJoPIe/KWCY11saH4Ylt12BrIZXSM2HMbp8f9mTimcCH5bVdp5+9Dw9WAoetWuMTBEg==',key_name='tempest-TestNetworkBasicOps-1618817236',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-989w739i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:33:27Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=949386c1-7dd6-4ddb-89b1-4762db3984dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.765 187156 DEBUG nova.network.os_vif_util [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.765 187156 DEBUG nova.network.os_vif_util [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:7b:df,bridge_name='br-int',has_traffic_filtering=True,id=c31397b6-7050-41a8-be27-e6dce253a00f,network=Network(fb20b368-92db-4c1f-add7-febda0cc12c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc31397b6-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.765 187156 DEBUG os_vif [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:7b:df,bridge_name='br-int',has_traffic_filtering=True,id=c31397b6-7050-41a8-be27-e6dce253a00f,network=Network(fb20b368-92db-4c1f-add7-febda0cc12c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc31397b6-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.766 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.766 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.767 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.771 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.771 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc31397b6-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.772 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc31397b6-70, col_values=(('external_ids', {'iface-id': 'c31397b6-7050-41a8-be27-e6dce253a00f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:7b:df', 'vm-uuid': '949386c1-7dd6-4ddb-89b1-4762db3984dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.774 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:35 np0005539504 NetworkManager[55210]: <info>  [1764401615.7751] manager: (tapc31397b6-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.775 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.786 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.788 187156 INFO os_vif [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:7b:df,bridge_name='br-int',has_traffic_filtering=True,id=c31397b6-7050-41a8-be27-e6dce253a00f,network=Network(fb20b368-92db-4c1f-add7-febda0cc12c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc31397b6-70')#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.854 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.855 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.855 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:37:7b:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:33:35 np0005539504 nova_compute[187152]: 2025-11-29 07:33:35.856 187156 INFO nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Using config drive#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.188 187156 INFO nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Creating config drive at /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.config#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.193 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6l4b3gqn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.318 187156 DEBUG oslo_concurrency.processutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6l4b3gqn" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:33:36 np0005539504 kernel: tapc31397b6-70: entered promiscuous mode
Nov 29 02:33:36 np0005539504 NetworkManager[55210]: <info>  [1764401616.3732] manager: (tapc31397b6-70): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Nov 29 02:33:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:36Z|00578|binding|INFO|Claiming lport c31397b6-7050-41a8-be27-e6dce253a00f for this chassis.
Nov 29 02:33:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:36Z|00579|binding|INFO|c31397b6-7050-41a8-be27-e6dce253a00f: Claiming fa:16:3e:37:7b:df 10.100.0.3
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.374 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.382 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.396 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:7b:df 10.100.0.3'], port_security=['fa:16:3e:37:7b:df 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb20b368-92db-4c1f-add7-febda0cc12c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6630ec7d-a4a7-4251-aa89-7d94b5d92822', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d948d8aa-7d6c-4897-8da2-928d861f5745, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=c31397b6-7050-41a8-be27-e6dce253a00f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.397 104164 INFO neutron.agent.ovn.metadata.agent [-] Port c31397b6-7050-41a8-be27-e6dce253a00f in datapath fb20b368-92db-4c1f-add7-febda0cc12c8 bound to our chassis#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.399 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fb20b368-92db-4c1f-add7-febda0cc12c8#033[00m
Nov 29 02:33:36 np0005539504 systemd-machined[153423]: New machine qemu-77-instance-00000094.
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.409 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc0f62d-d834-4b63-95c1-512dec692ea8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.410 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfb20b368-91 in ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:33:36 np0005539504 systemd-udevd[241987]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.412 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfb20b368-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.412 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed06ad7-9cf7-4f78-88f7-12647f5af2fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.412 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[61117ff0-780f-4b00-b1bf-54f644bf05a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 NetworkManager[55210]: <info>  [1764401616.4229] device (tapc31397b6-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:33:36 np0005539504 NetworkManager[55210]: <info>  [1764401616.4238] device (tapc31397b6-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.425 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[4bb53070-52f6-4506-9cc4-8510e9e3d327]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.433 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:36 np0005539504 systemd[1]: Started Virtual Machine qemu-77-instance-00000094.
Nov 29 02:33:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:36Z|00580|binding|INFO|Setting lport c31397b6-7050-41a8-be27-e6dce253a00f ovn-installed in OVS
Nov 29 02:33:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:36Z|00581|binding|INFO|Setting lport c31397b6-7050-41a8-be27-e6dce253a00f up in Southbound
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.438 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.441 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f00392f2-5e29-4c25-92c5-4da0ac47ce46]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.470 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f72b71c3-ee43-4980-ae4a-c20109723650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 systemd-udevd[241990]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:33:36 np0005539504 NetworkManager[55210]: <info>  [1764401616.4765] manager: (tapfb20b368-90): new Veth device (/org/freedesktop/NetworkManager/Devices/260)
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.476 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f6d8c6-9738-46ab-b297-4a8517d1ff61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.504 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef75535-9cea-4b80-8475-0f0d4d265457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.512 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[58ac80ee-54d2-496f-8658-9403931afeed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 NetworkManager[55210]: <info>  [1764401616.5392] device (tapfb20b368-90): carrier: link connected
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.543 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[72e35f20-aea1-4b6c-955b-441a4ebeb28b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.567 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4920ccd6-5f3d-45a7-8fdf-2ba2fdc1a333]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb20b368-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:28:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 707941, 'reachable_time': 33081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242019, 'error': None, 'target': 'ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.586 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3d1e516e-8bb9-49b2-b327-409cab74a1af]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:2837'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 707941, 'tstamp': 707941}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242020, 'error': None, 'target': 'ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.605 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[61fd3bd9-8df3-479e-a08c-cbe3838fe8b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb20b368-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:28:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 707941, 'reachable_time': 33081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242021, 'error': None, 'target': 'ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.640 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c5495e83-c390-4671-917a-41d2b27d3af8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.693 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1aa7d4fa-32d4-48fd-9999-fbc4b3acedc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.695 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb20b368-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.696 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.696 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb20b368-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.699 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:36 np0005539504 kernel: tapfb20b368-90: entered promiscuous mode
Nov 29 02:33:36 np0005539504 NetworkManager[55210]: <info>  [1764401616.6998] manager: (tapfb20b368-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.703 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfb20b368-90, col_values=(('external_ids', {'iface-id': '7ec9f3ca-4f03-4ca2-a4be-c55f9239510e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.704 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:36Z|00582|binding|INFO|Releasing lport 7ec9f3ca-4f03-4ca2-a4be-c55f9239510e from this chassis (sb_readonly=0)
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.723 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.724 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fb20b368-92db-4c1f-add7-febda0cc12c8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fb20b368-92db-4c1f-add7-febda0cc12c8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.725 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7cea6b84-fd52-483f-9c8a-2585dd23e36d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.726 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-fb20b368-92db-4c1f-add7-febda0cc12c8
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/fb20b368-92db-4c1f-add7-febda0cc12c8.pid.haproxy
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID fb20b368-92db-4c1f-add7-febda0cc12c8
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:33:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:36.727 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8', 'env', 'PROCESS_TAG=haproxy-fb20b368-92db-4c1f-add7-febda0cc12c8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fb20b368-92db-4c1f-add7-febda0cc12c8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.755 187156 DEBUG nova.compute.manager [req-21926994-5acc-4907-9a87-287a7d4d1cdd req-e8ec5393-4ad2-4af4-bace-f5d9a86df572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-plugged-c31397b6-7050-41a8-be27-e6dce253a00f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.756 187156 DEBUG oslo_concurrency.lockutils [req-21926994-5acc-4907-9a87-287a7d4d1cdd req-e8ec5393-4ad2-4af4-bace-f5d9a86df572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.758 187156 DEBUG oslo_concurrency.lockutils [req-21926994-5acc-4907-9a87-287a7d4d1cdd req-e8ec5393-4ad2-4af4-bace-f5d9a86df572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.758 187156 DEBUG oslo_concurrency.lockutils [req-21926994-5acc-4907-9a87-287a7d4d1cdd req-e8ec5393-4ad2-4af4-bace-f5d9a86df572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.758 187156 DEBUG nova.compute.manager [req-21926994-5acc-4907-9a87-287a7d4d1cdd req-e8ec5393-4ad2-4af4-bace-f5d9a86df572 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Processing event network-vif-plugged-c31397b6-7050-41a8-be27-e6dce253a00f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.908 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401616.9077652, 949386c1-7dd6-4ddb-89b1-4762db3984dd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.908 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] VM Started (Lifecycle Event)#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.911 187156 DEBUG nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.917 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.922 187156 INFO nova.virt.libvirt.driver [-] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Instance spawned successfully.#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.922 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.938 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.942 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.950 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.951 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.952 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.953 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.954 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.955 187156 DEBUG nova.virt.libvirt.driver [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.992 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.993 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401616.9079103, 949386c1-7dd6-4ddb-89b1-4762db3984dd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.993 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:33:36 np0005539504 nova_compute[187152]: 2025-11-29 07:33:36.998 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.023 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.028 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401616.916605, 949386c1-7dd6-4ddb-89b1-4762db3984dd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.029 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.062 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.068 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.077 187156 INFO nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Took 9.03 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.078 187156 DEBUG nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.111 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:33:37 np0005539504 podman[242059]: 2025-11-29 07:33:37.181482355 +0000 UTC m=+0.061309046 container create 313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.184 187156 INFO nova.compute.manager [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Took 9.73 seconds to build instance.#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.217 187156 DEBUG oslo_concurrency.lockutils [None req-884211a2-a4fb-4168-a665-2660f1710b30 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:37 np0005539504 systemd[1]: Started libpod-conmon-313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a.scope.
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.228 187156 DEBUG nova.network.neutron [req-36eb3591-5b15-4211-a962-17148268a38d req-7a5eb0af-4dae-4562-99b1-290f4fe6646d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updated VIF entry in instance network info cache for port c31397b6-7050-41a8-be27-e6dce253a00f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.229 187156 DEBUG nova.network.neutron [req-36eb3591-5b15-4211-a962-17148268a38d req-7a5eb0af-4dae-4562-99b1-290f4fe6646d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:33:37 np0005539504 podman[242059]: 2025-11-29 07:33:37.149755433 +0000 UTC m=+0.029582124 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:33:37 np0005539504 nova_compute[187152]: 2025-11-29 07:33:37.250 187156 DEBUG oslo_concurrency.lockutils [req-36eb3591-5b15-4211-a962-17148268a38d req-7a5eb0af-4dae-4562-99b1-290f4fe6646d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:33:37 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:33:37 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3038d3eec37997db500d51ddf12631cd2e78c0949eba3dd0dc01b9d0c38ff7a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:33:37 np0005539504 podman[242059]: 2025-11-29 07:33:37.278954841 +0000 UTC m=+0.158781552 container init 313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:33:37 np0005539504 podman[242059]: 2025-11-29 07:33:37.285798067 +0000 UTC m=+0.165624758 container start 313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:33:37 np0005539504 neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8[242074]: [NOTICE]   (242078) : New worker (242080) forked
Nov 29 02:33:37 np0005539504 neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8[242074]: [NOTICE]   (242078) : Loading success.
Nov 29 02:33:38 np0005539504 nova_compute[187152]: 2025-11-29 07:33:38.943 187156 DEBUG nova.compute.manager [req-07af16e1-8831-4419-a119-0e2da1c6bcca req-3638f4fb-8b51-4b55-ae49-11ddfae354ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-plugged-c31397b6-7050-41a8-be27-e6dce253a00f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:33:38 np0005539504 nova_compute[187152]: 2025-11-29 07:33:38.944 187156 DEBUG oslo_concurrency.lockutils [req-07af16e1-8831-4419-a119-0e2da1c6bcca req-3638f4fb-8b51-4b55-ae49-11ddfae354ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:38 np0005539504 nova_compute[187152]: 2025-11-29 07:33:38.945 187156 DEBUG oslo_concurrency.lockutils [req-07af16e1-8831-4419-a119-0e2da1c6bcca req-3638f4fb-8b51-4b55-ae49-11ddfae354ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:38 np0005539504 nova_compute[187152]: 2025-11-29 07:33:38.945 187156 DEBUG oslo_concurrency.lockutils [req-07af16e1-8831-4419-a119-0e2da1c6bcca req-3638f4fb-8b51-4b55-ae49-11ddfae354ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:38 np0005539504 nova_compute[187152]: 2025-11-29 07:33:38.945 187156 DEBUG nova.compute.manager [req-07af16e1-8831-4419-a119-0e2da1c6bcca req-3638f4fb-8b51-4b55-ae49-11ddfae354ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] No waiting events found dispatching network-vif-plugged-c31397b6-7050-41a8-be27-e6dce253a00f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:33:38 np0005539504 nova_compute[187152]: 2025-11-29 07:33:38.945 187156 WARNING nova.compute.manager [req-07af16e1-8831-4419-a119-0e2da1c6bcca req-3638f4fb-8b51-4b55-ae49-11ddfae354ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received unexpected event network-vif-plugged-c31397b6-7050-41a8-be27-e6dce253a00f for instance with vm_state active and task_state None.#033[00m
Nov 29 02:33:39 np0005539504 nova_compute[187152]: 2025-11-29 07:33:39.913 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:39.914 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:33:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:39.918 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:33:40 np0005539504 nova_compute[187152]: 2025-11-29 07:33:40.052 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:40 np0005539504 nova_compute[187152]: 2025-11-29 07:33:40.076 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Triggering sync for uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:33:40 np0005539504 nova_compute[187152]: 2025-11-29 07:33:40.077 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:40 np0005539504 nova_compute[187152]: 2025-11-29 07:33:40.077 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:40 np0005539504 nova_compute[187152]: 2025-11-29 07:33:40.103 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:40 np0005539504 nova_compute[187152]: 2025-11-29 07:33:40.775 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:40 np0005539504 nova_compute[187152]: 2025-11-29 07:33:40.823 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:40 np0005539504 NetworkManager[55210]: <info>  [1764401620.8241] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Nov 29 02:33:40 np0005539504 NetworkManager[55210]: <info>  [1764401620.8248] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Nov 29 02:33:41 np0005539504 nova_compute[187152]: 2025-11-29 07:33:41.108 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:41 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:41Z|00583|binding|INFO|Releasing lport 7ec9f3ca-4f03-4ca2-a4be-c55f9239510e from this chassis (sb_readonly=0)
Nov 29 02:33:41 np0005539504 nova_compute[187152]: 2025-11-29 07:33:41.143 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:41 np0005539504 nova_compute[187152]: 2025-11-29 07:33:41.456 187156 DEBUG nova.compute.manager [req-96e4351c-5dd1-4500-9474-398402ce8409 req-cecf2081-8bb5-48b9-b000-e8cc2cdc19b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-changed-c31397b6-7050-41a8-be27-e6dce253a00f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:33:41 np0005539504 nova_compute[187152]: 2025-11-29 07:33:41.456 187156 DEBUG nova.compute.manager [req-96e4351c-5dd1-4500-9474-398402ce8409 req-cecf2081-8bb5-48b9-b000-e8cc2cdc19b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Refreshing instance network info cache due to event network-changed-c31397b6-7050-41a8-be27-e6dce253a00f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:33:41 np0005539504 nova_compute[187152]: 2025-11-29 07:33:41.457 187156 DEBUG oslo_concurrency.lockutils [req-96e4351c-5dd1-4500-9474-398402ce8409 req-cecf2081-8bb5-48b9-b000-e8cc2cdc19b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:33:41 np0005539504 nova_compute[187152]: 2025-11-29 07:33:41.457 187156 DEBUG oslo_concurrency.lockutils [req-96e4351c-5dd1-4500-9474-398402ce8409 req-cecf2081-8bb5-48b9-b000-e8cc2cdc19b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:33:41 np0005539504 nova_compute[187152]: 2025-11-29 07:33:41.457 187156 DEBUG nova.network.neutron [req-96e4351c-5dd1-4500-9474-398402ce8409 req-cecf2081-8bb5-48b9-b000-e8cc2cdc19b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Refreshing network info cache for port c31397b6-7050-41a8-be27-e6dce253a00f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:33:41 np0005539504 nova_compute[187152]: 2025-11-29 07:33:41.999 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:42 np0005539504 nova_compute[187152]: 2025-11-29 07:33:42.469 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:42 np0005539504 podman[242090]: 2025-11-29 07:33:42.739374889 +0000 UTC m=+0.076408926 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:33:42 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:33:42.923 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:43 np0005539504 nova_compute[187152]: 2025-11-29 07:33:43.163 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:43 np0005539504 nova_compute[187152]: 2025-11-29 07:33:43.235 187156 DEBUG nova.network.neutron [req-96e4351c-5dd1-4500-9474-398402ce8409 req-cecf2081-8bb5-48b9-b000-e8cc2cdc19b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updated VIF entry in instance network info cache for port c31397b6-7050-41a8-be27-e6dce253a00f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:33:43 np0005539504 nova_compute[187152]: 2025-11-29 07:33:43.236 187156 DEBUG nova.network.neutron [req-96e4351c-5dd1-4500-9474-398402ce8409 req-cecf2081-8bb5-48b9-b000-e8cc2cdc19b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:33:43 np0005539504 nova_compute[187152]: 2025-11-29 07:33:43.266 187156 DEBUG oslo_concurrency.lockutils [req-96e4351c-5dd1-4500-9474-398402ce8409 req-cecf2081-8bb5-48b9-b000-e8cc2cdc19b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:33:45 np0005539504 nova_compute[187152]: 2025-11-29 07:33:45.780 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:47 np0005539504 nova_compute[187152]: 2025-11-29 07:33:47.002 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:48Z|00584|memory|INFO|peak resident set size grew 55% in last 3736.9 seconds, from 16128 kB to 25000 kB
Nov 29 02:33:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:48Z|00585|memory|INFO|idl-cells-OVN_Southbound:12045 idl-cells-Open_vSwitch:813 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:439 lflow-cache-entries-cache-matches:316 lflow-cache-size-KB:1889 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:762 ofctrl_installed_flow_usage-KB:556 ofctrl_sb_flow_ref_usage-KB:285
Nov 29 02:33:48 np0005539504 podman[242112]: 2025-11-29 07:33:48.73789972 +0000 UTC m=+0.059344722 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 02:33:50 np0005539504 nova_compute[187152]: 2025-11-29 07:33:50.785 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:33:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:51Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:37:7b:df 10.100.0.3
Nov 29 02:33:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:33:51Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:37:7b:df 10.100.0.3
Nov 29 02:33:52 np0005539504 nova_compute[187152]: 2025-11-29 07:33:52.022 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:33:55 np0005539504 nova_compute[187152]: 2025-11-29 07:33:55.788 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:33:57 np0005539504 nova_compute[187152]: 2025-11-29 07:33:57.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:33:58 np0005539504 nova_compute[187152]: 2025-11-29 07:33:58.462 187156 INFO nova.compute.manager [None req-f32cbf69-9442-46b5-a5bd-cafddf797dc5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Get console output
Nov 29 02:33:58 np0005539504 nova_compute[187152]: 2025-11-29 07:33:58.470 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 02:34:00 np0005539504 nova_compute[187152]: 2025-11-29 07:34:00.793 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:00 np0005539504 ovn_controller[95182]: 2025-11-29T07:34:00Z|00586|binding|INFO|Releasing lport 7ec9f3ca-4f03-4ca2-a4be-c55f9239510e from this chassis (sb_readonly=0)
Nov 29 02:34:00 np0005539504 nova_compute[187152]: 2025-11-29 07:34:00.985 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:01 np0005539504 podman[242146]: 2025-11-29 07:34:01.730333502 +0000 UTC m=+0.063837505 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:34:01 np0005539504 podman[242147]: 2025-11-29 07:34:01.733413795 +0000 UTC m=+0.068798949 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:34:01 np0005539504 podman[242148]: 2025-11-29 07:34:01.753537871 +0000 UTC m=+0.086983183 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:34:02 np0005539504 nova_compute[187152]: 2025-11-29 07:34:02.029 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:05 np0005539504 podman[242208]: 2025-11-29 07:34:05.702483365 +0000 UTC m=+0.049195468 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:34:05 np0005539504 podman[242209]: 2025-11-29 07:34:05.733412215 +0000 UTC m=+0.075456100 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 02:34:05 np0005539504 nova_compute[187152]: 2025-11-29 07:34:05.795 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:05 np0005539504 nova_compute[187152]: 2025-11-29 07:34:05.962 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:34:07 np0005539504 nova_compute[187152]: 2025-11-29 07:34:07.032 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:34:09Z|00587|binding|INFO|Releasing lport 7ec9f3ca-4f03-4ca2-a4be-c55f9239510e from this chassis (sb_readonly=0)
Nov 29 02:34:09 np0005539504 nova_compute[187152]: 2025-11-29 07:34:09.311 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:09 np0005539504 nova_compute[187152]: 2025-11-29 07:34:09.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:34:10 np0005539504 nova_compute[187152]: 2025-11-29 07:34:10.799 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:12 np0005539504 nova_compute[187152]: 2025-11-29 07:34:12.034 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:13 np0005539504 podman[242257]: 2025-11-29 07:34:13.764645205 +0000 UTC m=+0.090281373 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 02:34:13 np0005539504 nova_compute[187152]: 2025-11-29 07:34:13.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:34:15 np0005539504 nova_compute[187152]: 2025-11-29 07:34:15.803 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:17 np0005539504 nova_compute[187152]: 2025-11-29 07:34:17.037 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:17 np0005539504 nova_compute[187152]: 2025-11-29 07:34:17.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:34:17 np0005539504 nova_compute[187152]: 2025-11-29 07:34:17.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 02:34:19 np0005539504 podman[242279]: 2025-11-29 07:34:19.757914022 +0000 UTC m=+0.092722129 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 02:34:19 np0005539504 nova_compute[187152]: 2025-11-29 07:34:19.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:34:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:34:20.225 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:34:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:34:20.226 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 02:34:20 np0005539504 nova_compute[187152]: 2025-11-29 07:34:20.227 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:20 np0005539504 nova_compute[187152]: 2025-11-29 07:34:20.805 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:34:21.229 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:34:22 np0005539504 nova_compute[187152]: 2025-11-29 07:34:22.039 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:34:22.980 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:34:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:34:22.980 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:34:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:34:22.981 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:34:23 np0005539504 nova_compute[187152]: 2025-11-29 07:34:23.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:34:24 np0005539504 nova_compute[187152]: 2025-11-29 07:34:24.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:34:24 np0005539504 nova_compute[187152]: 2025-11-29 07:34:24.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 02:34:24 np0005539504 nova_compute[187152]: 2025-11-29 07:34:24.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 02:34:25 np0005539504 nova_compute[187152]: 2025-11-29 07:34:25.809 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:27 np0005539504 nova_compute[187152]: 2025-11-29 07:34:27.042 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:30 np0005539504 nova_compute[187152]: 2025-11-29 07:34:30.812 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:32 np0005539504 nova_compute[187152]: 2025-11-29 07:34:32.044 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:34:32 np0005539504 podman[242302]: 2025-11-29 07:34:32.7575065 +0000 UTC m=+0.086343916 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:34:32 np0005539504 podman[242304]: 2025-11-29 07:34:32.770887973 +0000 UTC m=+0.080905358 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 02:34:32 np0005539504 podman[242303]: 2025-11-29 07:34:32.815987608 +0000 UTC m=+0.126150737 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Nov 29 02:34:35 np0005539504 nova_compute[187152]: 2025-11-29 07:34:35.817 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:36 np0005539504 podman[242371]: 2025-11-29 07:34:36.745594538 +0000 UTC m=+0.080122376 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:34:36 np0005539504 podman[242372]: 2025-11-29 07:34:36.817309545 +0000 UTC m=+0.150999131 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:34:37 np0005539504 nova_compute[187152]: 2025-11-29 07:34:37.047 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:40 np0005539504 nova_compute[187152]: 2025-11-29 07:34:40.863 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:41 np0005539504 nova_compute[187152]: 2025-11-29 07:34:41.871 187156 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.91 sec#033[00m
Nov 29 02:34:42 np0005539504 nova_compute[187152]: 2025-11-29 07:34:42.049 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:44 np0005539504 nova_compute[187152]: 2025-11-29 07:34:44.599 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:34:44 np0005539504 nova_compute[187152]: 2025-11-29 07:34:44.600 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:34:44 np0005539504 nova_compute[187152]: 2025-11-29 07:34:44.600 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:34:44 np0005539504 nova_compute[187152]: 2025-11-29 07:34:44.601 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:34:44 np0005539504 podman[242418]: 2025-11-29 07:34:44.728240468 +0000 UTC m=+0.062619742 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:34:45 np0005539504 nova_compute[187152]: 2025-11-29 07:34:45.867 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:47 np0005539504 nova_compute[187152]: 2025-11-29 07:34:47.052 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.984 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'name': 'tempest-TestNetworkBasicOps-server-1269463687', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000094', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ec8b80be17a14d1caf666636283749d0', 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'hostId': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.985 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.985 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1269463687>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1269463687>]
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.986 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.986 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1269463687>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1269463687>]
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.993 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 949386c1-7dd6-4ddb-89b1-4762db3984dd / tapc31397b6-70 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.994 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c32c2af-74dd-422e-bdd8-8c148eedda43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:47.986694', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e20b96ee-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': 'c28c61d5430a56ca925f65086844585f1a03d8a689ccb2371e1ba9812759493f'}]}, 'timestamp': '2025-11-29 07:34:47.995415', '_unique_id': 'b7bfa89009434c94916289169b965ac2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.997 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.999 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:47.999 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c1560b7-a013-44f1-b73e-4f02c115310f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:47.999752', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e20c5bc4-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': '8ad042982ce8902852bd86a6970c31e8419b5cbaa8edaba0326f6ff8b5dde6b2'}]}, 'timestamp': '2025-11-29 07:34:48.000200', '_unique_id': '4d5a9224033c4952be866a04b936a073'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.001 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.002 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.002 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1269463687>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1269463687>]
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.002 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.incoming.bytes volume: 6508 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'effa5c36-d828-42e0-83ae-1926d19fb077', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6508, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:48.002661', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e20cca64-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': '9022e1c2bd232591c542205416f8486dc5ec18fd5c2573550d959800297de56f'}]}, 'timestamp': '2025-11-29 07:34:48.003008', '_unique_id': 'e9c22a0181244f118c4b07682bc64c6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.003 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.036 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.read.requests volume: 1086 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.037 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aa3262a-d7d5-4046-82a5-4a8018586797', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1086, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-vda', 'timestamp': '2025-11-29T07:34:48.004617', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e211febc-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '246ab9b9b915dad6ed0b79fa44dbe25ca6fc2e1e1748da98824d218addd387f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-sda', 'timestamp': '2025-11-29T07:34:48.004617', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2120d6c-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '5e069a9e2317710dec21f8880761d73f90f2d7b2fb86a068284a749fa2e3bbbb'}]}, 'timestamp': '2025-11-29 07:34:48.037489', '_unique_id': '80ba6e685e4c4b6e8e6a5f7082c4722f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.038 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.039 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.039 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.write.requests volume: 322 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.039 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8456072-3f72-45ca-b40e-6a5fb61cb21a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 322, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-vda', 'timestamp': '2025-11-29T07:34:48.039541', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2126b68-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '41706fed9dde820a6d0dfcb9c6bfb3f8342d97f63bb13f9e0249d4192f57f1ff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': 
None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-sda', 'timestamp': '2025-11-29T07:34:48.039541', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e21276d0-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '2af6e16fe883776d69275656f0ab20befeea2de7e37b4ba0a4f8b093976f1026'}]}, 'timestamp': '2025-11-29 07:34:48.040115', '_unique_id': '8195893184d643218c0ed9eae26e6432'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.055 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.055 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e60bf16-ea40-4da9-a5a9-31e6efa22200', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-vda', 'timestamp': '2025-11-29T07:34:48.041743', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e214dc0e-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.976451699, 'message_signature': '12adc56b321c6c1f16ac385270ca9dae407dce2578eaff3f2096627728ef8364'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'949386c1-7dd6-4ddb-89b1-4762db3984dd-sda', 'timestamp': '2025-11-29T07:34:48.041743', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e214eadc-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.976451699, 'message_signature': 'd1c7e357c6d48db031da009185a7cad68de1637243d0d715aacfcba456e417a0'}]}, 'timestamp': '2025-11-29 07:34:48.056356', '_unique_id': 'f079d0849384442d82b89de587624e4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.057 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.058 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.058 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.write.latency volume: 11251756348 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.059 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90364c95-4465-4ac9-b620-f5dd9c35cb2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11251756348, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-vda', 'timestamp': '2025-11-29T07:34:48.058850', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2155e72-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '7d0f151cc29570b8e000e954596b7eb70a061bbd9d737edf8dc895e7bfa0a55b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-sda', 'timestamp': '2025-11-29T07:34:48.058850', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2156a48-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '28dbb69c05748c7db640554421c58a2ee68f39650254ef9593250a97f5a3ae2e'}]}, 'timestamp': '2025-11-29 07:34:48.059480', '_unique_id': '5c241fdd25bd4ae3a331fd7e7da1a5db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.061 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.incoming.packets volume: 38 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5524dfb6-63f5-4106-95fe-453e7f14f9fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 38, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:48.061944', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e215d6cc-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': 'b85037e0ab412fec70f55382705300f1a69be072098cba8e342258d92b75593c'}]}, 'timestamp': '2025-11-29 07:34:48.062302', '_unique_id': '1cf538ad3c094d1dbf96f8a1d6082fed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.063 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.064 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.064 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.outgoing.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e81327a5-d00c-4215-b6a7-56a35cc7771c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:48.064230', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e21633ce-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': '410a6ac0d82974fba71dcaead55c223a5bb1b85872e9b9056a41b81aa1c3b07c'}]}, 'timestamp': '2025-11-29 07:34:48.064748', '_unique_id': '137619080b264a4594619917ad21eb7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.066 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.066 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.066 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '529957bf-8580-4c90-9bc4-ef741399c880', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-vda', 'timestamp': '2025-11-29T07:34:48.066488', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2168a36-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.976451699, 'message_signature': 'f316e10a9a0a89515f8acede8616214e266aaa9376cd4983a46c4218c382d678'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-sda', 'timestamp': '2025-11-29T07:34:48.066488', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e216965c-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.976451699, 'message_signature': '6e033ef762a70453542b05236b6191f757cb76ab0eb3139b8887876307f5636b'}]}, 'timestamp': '2025-11-29 07:34:48.067170', '_unique_id': 'da593c927dd642f199e77ad57265442b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.069 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.069 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '095dddc2-7fa2-41e9-94cc-74f8ca485199', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:48.069122', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e216ef08-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': '1f423c8516d9773b0d54f5f27b0ab04f33ce0ea7b2d2074b3b663dec9ff5a37f'}]}, 'timestamp': '2025-11-29 07:34:48.069497', '_unique_id': 'aaa82bb2ffa146278c77c0e022760256'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.070 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.071 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.094 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/memory.usage volume: 42.71484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '140c694b-e32a-42ca-8605-e5a9cb2d91ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.71484375, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'timestamp': '2025-11-29T07:34:48.071252', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e21ae3c4-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7151.029111959, 'message_signature': '42e8f46b7ee4e19e5b3730a9222b6451aefd1328092f01e9077d182241420a84'}]}, 'timestamp': '2025-11-29 07:34:48.095498', '_unique_id': 'cc89cbf96ed14fe8889d5d287089f8fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.097 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.read.latency volume: 220252345 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.098 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.read.latency volume: 29046231 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46691da4-6b2a-4421-aba8-3d7ee77b6bbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 220252345, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-vda', 'timestamp': '2025-11-29T07:34:48.097836', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e21b5098-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '20c863a7eb50c5df625ba93516d7a2f1675b3bfd7aad45f44c84a25d866de3eb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 29046231, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-sda', 'timestamp': '2025-11-29T07:34:48.097836', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e21b607e-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '5d331df60ebd161bedcbfd7d5c576fab237ab64ef7e0c28727497c0d173e29e6'}]}, 'timestamp': '2025-11-29 07:34:48.098565', '_unique_id': 'd17a257445594ee4a9d5b11ff5885e07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.100 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.write.bytes volume: 72974336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.100 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a21a228-b2f6-4446-b38d-7f2875922dec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72974336, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-vda', 'timestamp': '2025-11-29T07:34:48.100427', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e21bb538-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '5f77cf3145b6da9aab2a7f87532cce96b85cefea54d2765e1ba4de4d109abcd3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-sda', 'timestamp': '2025-11-29T07:34:48.100427', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e21bc1cc-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': '5c38155fe72801790b59f1eb7119f872ee83e22324b9f21ec49552d78ccf7ef9'}]}, 'timestamp': '2025-11-29 07:34:48.101044', '_unique_id': '102dce967b2e4eb9a586f375e40ab5fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.102 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.outgoing.bytes volume: 4774 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e2889bd-5fbc-401b-8207-41033e0afd80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4774, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:48.102773', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e21c1136-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': '631da449c7d262721ab37cbccc89ceafa5dbb0bed5d77ea3916c0a6476f767d2'}]}, 'timestamp': '2025-11-29 07:34:48.103097', '_unique_id': '9e6d1f454a2e4a3c9550593d7083b245'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.104 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d66db79-1b8d-4739-9210-c58bf44032a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:48.104769', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e21c60a0-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': 'fd505e99561caf91000467b0aa44659e339e4f1a359e0bceb0a3290c3039aab1'}]}, 'timestamp': '2025-11-29 07:34:48.105152', '_unique_id': 'ba1a7acb7ee34482a177f0ba18bb764b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.106 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.107 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98005562-2b3a-43ad-8097-8618d57cfbe9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-vda', 'timestamp': '2025-11-29T07:34:48.106925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e21cb366-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.976451699, 'message_signature': '494c06a1b00379cf2d314327f3f6e3c4b57ad2f29c44bc707460aec2e1f6913a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'949386c1-7dd6-4ddb-89b1-4762db3984dd-sda', 'timestamp': '2025-11-29T07:34:48.106925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e21cbe88-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.976451699, 'message_signature': '6510d4e8adf5344a2bca03db281d4bda9da11e99945225dec4aed0c5206ee801'}]}, 'timestamp': '2025-11-29 07:34:48.107599', '_unique_id': 'eaf391f82e444b5e90cab5984378542f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.109 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f807559-1e24-4339-b776-fb72354aa20e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:48.109270', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e21d107c-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': 'cd1ce2fea3b11959e85741904dfcdcf49f42ee37768952af84bcd215cb9544b7'}]}, 'timestamp': '2025-11-29 07:34:48.109692', '_unique_id': 'd6d48420f670438689771ae33fd0e6ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.111 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.111 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1269463687>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1269463687>]
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.111 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.read.bytes volume: 30321152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.112 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4bb2941-aefc-40cc-a904-3f1e51322aba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30321152, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-vda', 'timestamp': '2025-11-29T07:34:48.111792', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e21d7134-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': 'eef559aaf0e6efc88f8ab0e13b1a3903c185b4080885ccb0f00e971f779082ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd-sda', 'timestamp': '2025-11-29T07:34:48.111792', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e21d7c6a-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.939319621, 'message_signature': 'fe32884f476cc4705fa635fc6313aebec4b17e7802a14d45e11818bcbda272b1'}]}, 'timestamp': '2025-11-29 07:34:48.112402', '_unique_id': 'fa6ddbedbcd24897afbefb0d7f8bae10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.114 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bbd483b-f33a-40a9-9a35-0a0d21e59fae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-00000094-949386c1-7dd6-4ddb-89b1-4762db3984dd-tapc31397b6-70', 'timestamp': '2025-11-29T07:34:48.114091', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'tapc31397b6-70', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:7b:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc31397b6-70'}, 'message_id': 'e21dcac6-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7150.921380753, 'message_signature': '434a430e26054005b8b6b6ddf500d5f759b545c7f43af1d5123bdd3cf516b274'}]}, 'timestamp': '2025-11-29 07:34:48.114420', '_unique_id': 'b0ffe5d852e14cbd94054c618297c434'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.116 12 DEBUG ceilometer.compute.pollsters [-] 949386c1-7dd6-4ddb-89b1-4762db3984dd/cpu volume: 12610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac05864e-df26-4467-a9e7-dad71e50613e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12610000000, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'timestamp': '2025-11-29T07:34:48.116083', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1269463687', 'name': 'instance-00000094', 'instance_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e21e1904-ccf5-11f0-8a11-fa163ea726b4', 'monotonic_time': 7151.029111959, 'message_signature': '2e4a8e98748371e364940378b10dc63ccf7ab43d9fa3ecd8262a9d326df65945'}]}, 'timestamp': '2025-11-29 07:34:48.116427', '_unique_id': '6e95ae91d008445d9d9b136cfe93d1c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:34:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.698 187156 DEBUG oslo_concurrency.lockutils [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "interface-949386c1-7dd6-4ddb-89b1-4762db3984dd-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.698 187156 DEBUG oslo_concurrency.lockutils [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "interface-949386c1-7dd6-4ddb-89b1-4762db3984dd-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.699 187156 DEBUG nova.objects.instance [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'flavor' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.717 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.751 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.752 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.752 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.753 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.789 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.790 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.790 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.790 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.888 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.965 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:34:48 np0005539504 nova_compute[187152]: 2025-11-29 07:34:48.966 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:34:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:34:48Z|00588|binding|INFO|Releasing lport 7ec9f3ca-4f03-4ca2-a4be-c55f9239510e from this chassis (sb_readonly=0)
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.025 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.070 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.196 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.197 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5575MB free_disk=73.04608917236328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.198 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.198 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.397 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 949386c1-7dd6-4ddb-89b1-4762db3984dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.398 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.398 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.631 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.648 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.699 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:34:49 np0005539504 nova_compute[187152]: 2025-11-29 07:34:49.699 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:34:50 np0005539504 nova_compute[187152]: 2025-11-29 07:34:50.051 187156 DEBUG nova.objects.instance [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_requests' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:34:50 np0005539504 nova_compute[187152]: 2025-11-29 07:34:50.299 187156 DEBUG nova.network.neutron [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:34:50 np0005539504 podman[242445]: 2025-11-29 07:34:50.726251716 +0000 UTC m=+0.066658311 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 02:34:50 np0005539504 nova_compute[187152]: 2025-11-29 07:34:50.872 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:52 np0005539504 nova_compute[187152]: 2025-11-29 07:34:52.056 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:52 np0005539504 nova_compute[187152]: 2025-11-29 07:34:52.161 187156 DEBUG nova.policy [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:34:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:34:53Z|00589|binding|INFO|Releasing lport 7ec9f3ca-4f03-4ca2-a4be-c55f9239510e from this chassis (sb_readonly=0)
Nov 29 02:34:53 np0005539504 nova_compute[187152]: 2025-11-29 07:34:53.647 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:54 np0005539504 nova_compute[187152]: 2025-11-29 07:34:54.627 187156 DEBUG nova.network.neutron [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Successfully created port: 72a72180-9b12-4a7f-9919-74bcdf0427b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:34:55 np0005539504 nova_compute[187152]: 2025-11-29 07:34:55.882 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:57 np0005539504 nova_compute[187152]: 2025-11-29 07:34:57.058 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:57 np0005539504 nova_compute[187152]: 2025-11-29 07:34:57.695 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:00 np0005539504 nova_compute[187152]: 2025-11-29 07:35:00.206 187156 DEBUG nova.network.neutron [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Successfully updated port: 72a72180-9b12-4a7f-9919-74bcdf0427b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:35:00 np0005539504 nova_compute[187152]: 2025-11-29 07:35:00.367 187156 DEBUG oslo_concurrency.lockutils [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:35:00 np0005539504 nova_compute[187152]: 2025-11-29 07:35:00.367 187156 DEBUG oslo_concurrency.lockutils [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:35:00 np0005539504 nova_compute[187152]: 2025-11-29 07:35:00.368 187156 DEBUG nova.network.neutron [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:35:00 np0005539504 nova_compute[187152]: 2025-11-29 07:35:00.752 187156 DEBUG nova.compute.manager [req-27b9e443-4d6d-4716-93a3-48f90938d015 req-76708de2-9f0b-4873-99f4-57323ebb553d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-changed-72a72180-9b12-4a7f-9919-74bcdf0427b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:00 np0005539504 nova_compute[187152]: 2025-11-29 07:35:00.752 187156 DEBUG nova.compute.manager [req-27b9e443-4d6d-4716-93a3-48f90938d015 req-76708de2-9f0b-4873-99f4-57323ebb553d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Refreshing instance network info cache due to event network-changed-72a72180-9b12-4a7f-9919-74bcdf0427b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:35:00 np0005539504 nova_compute[187152]: 2025-11-29 07:35:00.753 187156 DEBUG oslo_concurrency.lockutils [req-27b9e443-4d6d-4716-93a3-48f90938d015 req-76708de2-9f0b-4873-99f4-57323ebb553d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:35:00 np0005539504 nova_compute[187152]: 2025-11-29 07:35:00.884 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:02 np0005539504 nova_compute[187152]: 2025-11-29 07:35:02.063 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:03 np0005539504 podman[242481]: 2025-11-29 07:35:03.729600293 +0000 UTC m=+0.064196944 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:35:03 np0005539504 podman[242483]: 2025-11-29 07:35:03.737592111 +0000 UTC m=+0.063293890 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:35:03 np0005539504 podman[242482]: 2025-11-29 07:35:03.775366456 +0000 UTC m=+0.105459834 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Nov 29 02:35:05 np0005539504 nova_compute[187152]: 2025-11-29 07:35:05.922 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:06 np0005539504 nova_compute[187152]: 2025-11-29 07:35:06.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.065 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.560 187156 DEBUG nova.network.neutron [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.602 187156 DEBUG oslo_concurrency.lockutils [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.603 187156 DEBUG oslo_concurrency.lockutils [req-27b9e443-4d6d-4716-93a3-48f90938d015 req-76708de2-9f0b-4873-99f4-57323ebb553d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.603 187156 DEBUG nova.network.neutron [req-27b9e443-4d6d-4716-93a3-48f90938d015 req-76708de2-9f0b-4873-99f4-57323ebb553d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Refreshing network info cache for port 72a72180-9b12-4a7f-9919-74bcdf0427b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.607 187156 DEBUG nova.virt.libvirt.vif [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1269463687',display_name='tempest-TestNetworkBasicOps-server-1269463687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1269463687',id=148,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOb15Ob4ul8MpodhGZarbCweKGgvYK2lstTZD8GYJJKTsyQpwGF/vTqiZC3chWrEJoPIe/KWCY11saH4Ylt12BrIZXSM2HMbp8f9mTimcCH5bVdp5+9Dw9WAoetWuMTBEg==',key_name='tempest-TestNetworkBasicOps-1618817236',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:33:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-989w739i',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:33:37Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=949386c1-7dd6-4ddb-89b1-4762db3984dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.608 187156 DEBUG nova.network.os_vif_util [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.608 187156 DEBUG nova.network.os_vif_util [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.609 187156 DEBUG os_vif [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.609 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.610 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.610 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.613 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.614 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72a72180-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.614 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap72a72180-9b, col_values=(('external_ids', {'iface-id': '72a72180-9b12-4a7f-9919-74bcdf0427b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:d0:78', 'vm-uuid': '949386c1-7dd6-4ddb-89b1-4762db3984dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.616 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539504 NetworkManager[55210]: <info>  [1764401707.6183] manager: (tap72a72180-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.623 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.625 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.627 187156 INFO os_vif [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b')#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.627 187156 DEBUG nova.virt.libvirt.vif [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1269463687',display_name='tempest-TestNetworkBasicOps-server-1269463687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1269463687',id=148,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOb15Ob4ul8MpodhGZarbCweKGgvYK2lstTZD8GYJJKTsyQpwGF/vTqiZC3chWrEJoPIe/KWCY11saH4Ylt12BrIZXSM2HMbp8f9mTimcCH5bVdp5+9Dw9WAoetWuMTBEg==',key_name='tempest-TestNetworkBasicOps-1618817236',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:33:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-989w739i',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:33:37Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=949386c1-7dd6-4ddb-89b1-4762db3984dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.628 187156 DEBUG nova.network.os_vif_util [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.628 187156 DEBUG nova.network.os_vif_util [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.631 187156 DEBUG nova.virt.libvirt.guest [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] attach device xml: <interface type="ethernet">
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <mac address="fa:16:3e:ef:d0:78"/>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <model type="virtio"/>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <mtu size="1442"/>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <target dev="tap72a72180-9b"/>
Nov 29 02:35:07 np0005539504 nova_compute[187152]: </interface>
Nov 29 02:35:07 np0005539504 nova_compute[187152]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 29 02:35:07 np0005539504 kernel: tap72a72180-9b: entered promiscuous mode
Nov 29 02:35:07 np0005539504 NetworkManager[55210]: <info>  [1764401707.6436] manager: (tap72a72180-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/265)
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:07Z|00590|binding|INFO|Claiming lport 72a72180-9b12-4a7f-9919-74bcdf0427b7 for this chassis.
Nov 29 02:35:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:07Z|00591|binding|INFO|72a72180-9b12-4a7f-9919-74bcdf0427b7: Claiming fa:16:3e:ef:d0:78 10.100.0.42
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.648 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539504 systemd-udevd[242569]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.691 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:07Z|00592|binding|INFO|Setting lport 72a72180-9b12-4a7f-9919-74bcdf0427b7 ovn-installed in OVS
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.698 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539504 NetworkManager[55210]: <info>  [1764401707.7053] device (tap72a72180-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:35:07 np0005539504 NetworkManager[55210]: <info>  [1764401707.7065] device (tap72a72180-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:35:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:07Z|00593|binding|INFO|Setting lport 72a72180-9b12-4a7f-9919-74bcdf0427b7 up in Southbound
Nov 29 02:35:07 np0005539504 podman[242546]: 2025-11-29 07:35:07.7224225 +0000 UTC m=+0.064176524 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.724 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:d0:78 10.100.0.42'], port_security=['fa:16:3e:ef:d0:78 10.100.0.42'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.42/28', 'neutron:device_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9eacd38c-2429-49d2-8f36-a26cde18927c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '046111b1-8479-4ebb-8db5-573e164c575e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f48ae9f-8a38-4b48-bc21-79e7d8b72694, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=72a72180-9b12-4a7f-9919-74bcdf0427b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.726 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 72a72180-9b12-4a7f-9919-74bcdf0427b7 in datapath 9eacd38c-2429-49d2-8f36-a26cde18927c bound to our chassis#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.728 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9eacd38c-2429-49d2-8f36-a26cde18927c#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.742 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[81856b6b-634f-4bc0-bf1f-63393a845c35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.743 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9eacd38c-21 in ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.746 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9eacd38c-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.746 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[87028b95-1d18-4706-8b24-35b7455a1c65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.747 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a569517d-d001-4bcf-bacc-704d3e8d8ecb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 podman[242550]: 2025-11-29 07:35:07.758151249 +0000 UTC m=+0.091881905 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.761 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[81b535cd-8b04-42a0-aabc-537c0e266162]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.777 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f953bb2e-5190-4c49-a54e-9b0a71b9cad5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.816 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2d3d3e-a7af-4df0-aeae-4bfd1af9a6e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.823 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4e413c98-9ab8-47a4-abfb-3f656b512c1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 NetworkManager[55210]: <info>  [1764401707.8247] manager: (tap9eacd38c-20): new Veth device (/org/freedesktop/NetworkManager/Devices/266)
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.859 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5532df55-3c6a-42d1-ad02-fd7264468c2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.863 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd89cc1-0e46-4d5d-a5bc-f3b360f7df37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.879 187156 DEBUG nova.virt.libvirt.driver [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.880 187156 DEBUG nova.virt.libvirt.driver [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.880 187156 DEBUG nova.virt.libvirt.driver [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:37:7b:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.880 187156 DEBUG nova.virt.libvirt.driver [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:ef:d0:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:35:07 np0005539504 NetworkManager[55210]: <info>  [1764401707.8909] device (tap9eacd38c-20): carrier: link connected
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.897 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[83869d9d-0f21-462b-80c6-f00da28feafc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.916 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fa66621f-83a6-4d6e-8168-bdd19458dd88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9eacd38c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:5c:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 717077, 'reachable_time': 22379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242625, 'error': None, 'target': 'ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.927 187156 DEBUG nova.virt.libvirt.guest [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <nova:name>tempest-TestNetworkBasicOps-server-1269463687</nova:name>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:35:07</nova:creationTime>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    <nova:port uuid="c31397b6-7050-41a8-be27-e6dce253a00f">
Nov 29 02:35:07 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    <nova:port uuid="72a72180-9b12-4a7f-9919-74bcdf0427b7">
Nov 29 02:35:07 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.42" ipVersion="4"/>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:07 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:35:07 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:35:07 np0005539504 nova_compute[187152]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.932 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b2447199-c817-4b33-8a68-1d71397446d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7b:5cdb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 717077, 'tstamp': 717077}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242626, 'error': None, 'target': 'ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.951 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e2da44a1-5143-40b6-94d1-7dbb03a55620]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9eacd38c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:5c:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 717077, 'reachable_time': 22379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242627, 'error': None, 'target': 'ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:07 np0005539504 nova_compute[187152]: 2025-11-29 07:35:07.971 187156 DEBUG oslo_concurrency.lockutils [None req-1de62b77-4c95-4c13-b31b-23629d984d81 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "interface-949386c1-7dd6-4ddb-89b1-4762db3984dd-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 19.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:07.989 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6d858324-2a91-4702-9513-23000ab6b312]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:08.056 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6930dbfa-b8a1-4bc0-aef4-0e9a0ae803c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:08.058 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9eacd38c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:08.059 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:08.059 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9eacd38c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:08 np0005539504 kernel: tap9eacd38c-20: entered promiscuous mode
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.106 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:08 np0005539504 NetworkManager[55210]: <info>  [1764401708.1107] manager: (tap9eacd38c-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.110 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:08.112 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9eacd38c-20, col_values=(('external_ids', {'iface-id': '36d494ba-72bf-4e54-83c3-5c89828b47a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.114 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:08 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:08Z|00594|binding|INFO|Releasing lport 36d494ba-72bf-4e54-83c3-5c89828b47a5 from this chassis (sb_readonly=0)
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:08.115 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9eacd38c-2429-49d2-8f36-a26cde18927c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9eacd38c-2429-49d2-8f36-a26cde18927c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:08.116 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a6006447-8669-4011-9cce-4075d4f42840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:08.118 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-9eacd38c-2429-49d2-8f36-a26cde18927c
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/9eacd38c-2429-49d2-8f36-a26cde18927c.pid.haproxy
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 9eacd38c-2429-49d2-8f36-a26cde18927c
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:35:08 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:08.118 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c', 'env', 'PROCESS_TAG=haproxy-9eacd38c-2429-49d2-8f36-a26cde18927c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9eacd38c-2429-49d2-8f36-a26cde18927c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.127 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:08 np0005539504 podman[242659]: 2025-11-29 07:35:08.455512726 +0000 UTC m=+0.023714494 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.797 187156 DEBUG nova.compute.manager [req-594ef513-b80e-46f3-88ad-ba579e3a6692 req-4e1ccb9d-b2df-4371-b93c-79d5510b503b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-plugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.798 187156 DEBUG oslo_concurrency.lockutils [req-594ef513-b80e-46f3-88ad-ba579e3a6692 req-4e1ccb9d-b2df-4371-b93c-79d5510b503b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.799 187156 DEBUG oslo_concurrency.lockutils [req-594ef513-b80e-46f3-88ad-ba579e3a6692 req-4e1ccb9d-b2df-4371-b93c-79d5510b503b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.799 187156 DEBUG oslo_concurrency.lockutils [req-594ef513-b80e-46f3-88ad-ba579e3a6692 req-4e1ccb9d-b2df-4371-b93c-79d5510b503b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.800 187156 DEBUG nova.compute.manager [req-594ef513-b80e-46f3-88ad-ba579e3a6692 req-4e1ccb9d-b2df-4371-b93c-79d5510b503b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] No waiting events found dispatching network-vif-plugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:35:08 np0005539504 nova_compute[187152]: 2025-11-29 07:35:08.800 187156 WARNING nova.compute.manager [req-594ef513-b80e-46f3-88ad-ba579e3a6692 req-4e1ccb9d-b2df-4371-b93c-79d5510b503b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received unexpected event network-vif-plugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:35:08 np0005539504 podman[242659]: 2025-11-29 07:35:08.884347572 +0000 UTC m=+0.452549320 container create 5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 02:35:08 np0005539504 systemd[1]: Started libpod-conmon-5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed.scope.
Nov 29 02:35:08 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:35:08 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c4f4b3e70cdfebe1ea9925a082e3fc0c27ed067f20238d80e152ed7141ca21d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:35:08 np0005539504 podman[242659]: 2025-11-29 07:35:08.991709867 +0000 UTC m=+0.559911645 container init 5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:35:08 np0005539504 podman[242659]: 2025-11-29 07:35:08.998147422 +0000 UTC m=+0.566349170 container start 5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:35:09 np0005539504 neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c[242674]: [NOTICE]   (242678) : New worker (242680) forked
Nov 29 02:35:09 np0005539504 neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c[242674]: [NOTICE]   (242678) : Loading success.
Nov 29 02:35:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:09Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ef:d0:78 10.100.0.42
Nov 29 02:35:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:09Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ef:d0:78 10.100.0.42
Nov 29 02:35:09 np0005539504 nova_compute[187152]: 2025-11-29 07:35:09.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.006 187156 DEBUG nova.compute.manager [req-3623ef27-d0a0-456b-bfde-9145195b404a req-8b0a416c-0178-4f47-a34f-3f780c60fe11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-plugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.007 187156 DEBUG oslo_concurrency.lockutils [req-3623ef27-d0a0-456b-bfde-9145195b404a req-8b0a416c-0178-4f47-a34f-3f780c60fe11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.007 187156 DEBUG oslo_concurrency.lockutils [req-3623ef27-d0a0-456b-bfde-9145195b404a req-8b0a416c-0178-4f47-a34f-3f780c60fe11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.008 187156 DEBUG oslo_concurrency.lockutils [req-3623ef27-d0a0-456b-bfde-9145195b404a req-8b0a416c-0178-4f47-a34f-3f780c60fe11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.008 187156 DEBUG nova.compute.manager [req-3623ef27-d0a0-456b-bfde-9145195b404a req-8b0a416c-0178-4f47-a34f-3f780c60fe11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] No waiting events found dispatching network-vif-plugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.009 187156 WARNING nova.compute.manager [req-3623ef27-d0a0-456b-bfde-9145195b404a req-8b0a416c-0178-4f47-a34f-3f780c60fe11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received unexpected event network-vif-plugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.062 187156 DEBUG oslo_concurrency.lockutils [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "interface-949386c1-7dd6-4ddb-89b1-4762db3984dd-72a72180-9b12-4a7f-9919-74bcdf0427b7" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.062 187156 DEBUG oslo_concurrency.lockutils [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "interface-949386c1-7dd6-4ddb-89b1-4762db3984dd-72a72180-9b12-4a7f-9919-74bcdf0427b7" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.079 187156 DEBUG nova.objects.instance [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'flavor' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.138 187156 DEBUG nova.virt.libvirt.vif [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1269463687',display_name='tempest-TestNetworkBasicOps-server-1269463687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1269463687',id=148,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOb15Ob4ul8MpodhGZarbCweKGgvYK2lstTZD8GYJJKTsyQpwGF/vTqiZC3chWrEJoPIe/KWCY11saH4Ylt12BrIZXSM2HMbp8f9mTimcCH5bVdp5+9Dw9WAoetWuMTBEg==',key_name='tempest-TestNetworkBasicOps-1618817236',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:33:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-989w739i',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:33:37Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=949386c1-7dd6-4ddb-89b1-4762db3984dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.138 187156 DEBUG nova.network.os_vif_util [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.139 187156 DEBUG nova.network.os_vif_util [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.142 187156 DEBUG nova.virt.libvirt.guest [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.144 187156 DEBUG nova.virt.libvirt.guest [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.148 187156 DEBUG nova.virt.libvirt.driver [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Attempting to detach device tap72a72180-9b from instance 949386c1-7dd6-4ddb-89b1-4762db3984dd from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.148 187156 DEBUG nova.virt.libvirt.guest [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <mac address="fa:16:3e:ef:d0:78"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <model type="virtio"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <mtu size="1442"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <target dev="tap72a72180-9b"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: </interface>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.154 187156 DEBUG nova.virt.libvirt.guest [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.158 187156 DEBUG nova.virt.libvirt.guest [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface>not found in domain: <domain type='kvm' id='77'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <name>instance-00000094</name>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <uuid>949386c1-7dd6-4ddb-89b1-4762db3984dd</uuid>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:name>tempest-TestNetworkBasicOps-server-1269463687</nova:name>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:35:07</nova:creationTime>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:port uuid="c31397b6-7050-41a8-be27-e6dce253a00f">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:port uuid="72a72180-9b12-4a7f-9919-74bcdf0427b7">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.42" ipVersion="4"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <memory unit='KiB'>131072</memory>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <resource>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <partition>/machine</partition>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </resource>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <sysinfo type='smbios'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='serial'>949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='uuid'>949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <boot dev='hd'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <smbios mode='sysinfo'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <vmcoreinfo state='on'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <feature policy='require' name='x2apic'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <feature policy='require' name='vme'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <clock offset='utc'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <timer name='hpet' present='no'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <on_reboot>restart</on_reboot>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <on_crash>destroy</on_crash>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <disk type='file' device='disk'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk' index='2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <backingStore type='file' index='3'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:        <format type='raw'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:        <backingStore/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      </backingStore>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target dev='vda' bus='virtio'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='virtio-disk0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <disk type='file' device='cdrom'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.config' index='1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <backingStore/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target dev='sda' bus='sata'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <readonly/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='sata0-0-0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pcie.0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='1' port='0x10'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='2' port='0x11'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='3' port='0x12'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.3'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='4' port='0x13'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.4'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='5' port='0x14'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.5'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='6' port='0x15'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.6'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='7' port='0x16'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.7'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='8' port='0x17'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.8'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='9' port='0x18'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.9'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='10' port='0x19'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.10'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='11' port='0x1a'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.11'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='12' port='0x1b'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.12'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='13' port='0x1c'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.13'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='14' port='0x1d'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.14'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='15' port='0x1e'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.15'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='16' port='0x1f'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.16'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='17' port='0x20'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.17'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='18' port='0x21'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.18'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='19' port='0x22'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.19'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='20' port='0x23'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.20'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='21' port='0x24'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.21'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='22' port='0x25'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.22'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='23' port='0x26'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.23'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='24' port='0x27'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.24'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='25' port='0x28'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.25'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-pci-bridge'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.26'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='usb'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='sata' index='0'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='ide'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <interface type='ethernet'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <mac address='fa:16:3e:37:7b:df'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target dev='tapc31397b6-70'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model type='virtio'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <mtu size='1442'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='net0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <interface type='ethernet'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <mac address='fa:16:3e:ef:d0:78'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target dev='tap72a72180-9b'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model type='virtio'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <mtu size='1442'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='net1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <serial type='pty'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log' append='off'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target type='isa-serial' port='0'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:        <model name='isa-serial'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      </target>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log' append='off'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target type='serial' port='0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </console>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <input type='tablet' bus='usb'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='input0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <input type='mouse' bus='ps2'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='input1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <input type='keyboard' bus='ps2'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='input2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <listen type='address' address='::0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <audio id='1' type='none'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='video0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <watchdog model='itco' action='reset'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='watchdog0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </watchdog>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <memballoon model='virtio'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <stats period='10'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='balloon0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <rng model='virtio'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='rng0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <label>system_u:system_r:svirt_t:s0:c213,c858</label>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c213,c858</imagelabel>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <label>+107:+107</label>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.159 187156 INFO nova.virt.libvirt.driver [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully detached device tap72a72180-9b from instance 949386c1-7dd6-4ddb-89b1-4762db3984dd from the persistent domain config.
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.159 187156 DEBUG nova.virt.libvirt.driver [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] (1/8): Attempting to detach device tap72a72180-9b with device alias net1 from instance 949386c1-7dd6-4ddb-89b1-4762db3984dd from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.159 187156 DEBUG nova.virt.libvirt.guest [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <mac address="fa:16:3e:ef:d0:78"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <model type="virtio"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <mtu size="1442"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <target dev="tap72a72180-9b"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: </interface>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.191 187156 DEBUG nova.network.neutron [req-27b9e443-4d6d-4716-93a3-48f90938d015 req-76708de2-9f0b-4873-99f4-57323ebb553d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updated VIF entry in instance network info cache for port 72a72180-9b12-4a7f-9919-74bcdf0427b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.192 187156 DEBUG nova.network.neutron [req-27b9e443-4d6d-4716-93a3-48f90938d015 req-76708de2-9f0b-4873-99f4-57323ebb553d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.220 187156 DEBUG oslo_concurrency.lockutils [req-27b9e443-4d6d-4716-93a3-48f90938d015 req-76708de2-9f0b-4873-99f4-57323ebb553d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:35:11 np0005539504 kernel: tap72a72180-9b (unregistering): left promiscuous mode
Nov 29 02:35:11 np0005539504 NetworkManager[55210]: <info>  [1764401711.2669] device (tap72a72180-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.277 187156 DEBUG nova.virt.libvirt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Received event <DeviceRemovedEvent: 1764401711.2768817, 949386c1-7dd6-4ddb-89b1-4762db3984dd => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.279 187156 DEBUG nova.virt.libvirt.driver [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Start waiting for the detach event from libvirt for device tap72a72180-9b with device alias net1 for instance 949386c1-7dd6-4ddb-89b1-4762db3984dd _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.280 187156 DEBUG nova.virt.libvirt.guest [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.280 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:35:11 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:11Z|00595|binding|INFO|Releasing lport 72a72180-9b12-4a7f-9919-74bcdf0427b7 from this chassis (sb_readonly=0)
Nov 29 02:35:11 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:11Z|00596|binding|INFO|Setting lport 72a72180-9b12-4a7f-9919-74bcdf0427b7 down in Southbound
Nov 29 02:35:11 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:11Z|00597|binding|INFO|Removing iface tap72a72180-9b ovn-installed in OVS
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.284 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.285 187156 DEBUG nova.virt.libvirt.guest [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface> not found in domain: <domain type='kvm' id='77'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <name>instance-00000094</name>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <uuid>949386c1-7dd6-4ddb-89b1-4762db3984dd</uuid>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:name>tempest-TestNetworkBasicOps-server-1269463687</nova:name>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:35:07</nova:creationTime>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:port uuid="c31397b6-7050-41a8-be27-e6dce253a00f">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:port uuid="72a72180-9b12-4a7f-9919-74bcdf0427b7">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.42" ipVersion="4"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <memory unit='KiB'>131072</memory>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <resource>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <partition>/machine</partition>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </resource>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <sysinfo type='smbios'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='serial'>949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='uuid'>949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <boot dev='hd'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <smbios mode='sysinfo'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <vmcoreinfo state='on'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <feature policy='require' name='x2apic'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <feature policy='require' name='vme'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <clock offset='utc'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <timer name='hpet' present='no'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <on_reboot>restart</on_reboot>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <on_crash>destroy</on_crash>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <disk type='file' device='disk'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk' index='2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <backingStore type='file' index='3'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:        <format type='raw'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:        <backingStore/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      </backingStore>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target dev='vda' bus='virtio'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='virtio-disk0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <disk type='file' device='cdrom'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.config' index='1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <backingStore/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target dev='sda' bus='sata'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <readonly/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='sata0-0-0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pcie.0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='1' port='0x10'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='2' port='0x11'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='3' port='0x12'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.3'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='4' port='0x13'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.4'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='5' port='0x14'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.5'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='6' port='0x15'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.6'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='7' port='0x16'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.7'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='8' port='0x17'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.8'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='9' port='0x18'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.9'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='10' port='0x19'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.10'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='11' port='0x1a'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.11'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='12' port='0x1b'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.12'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='13' port='0x1c'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.13'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='14' port='0x1d'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.14'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='15' port='0x1e'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.15'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='16' port='0x1f'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.16'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='17' port='0x20'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.17'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='18' port='0x21'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.18'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='19' port='0x22'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.19'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='20' port='0x23'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.20'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='21' port='0x24'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.21'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='22' port='0x25'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.22'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='23' port='0x26'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.23'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='24' port='0x27'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.24'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target chassis='25' port='0x28'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.25'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model name='pcie-pci-bridge'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='pci.26'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='usb'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <controller type='sata' index='0'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='ide'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <interface type='ethernet'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <mac address='fa:16:3e:37:7b:df'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target dev='tapc31397b6-70'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model type='virtio'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <mtu size='1442'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='net0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <serial type='pty'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log' append='off'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target type='isa-serial' port='0'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:        <model name='isa-serial'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      </target>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log' append='off'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <target type='serial' port='0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </console>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <input type='tablet' bus='usb'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='input0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <input type='mouse' bus='ps2'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='input1'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <input type='keyboard' bus='ps2'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='input2'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <listen type='address' address='::0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <audio id='1' type='none'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='video0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <watchdog model='itco' action='reset'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='watchdog0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </watchdog>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <memballoon model='virtio'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <stats period='10'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='balloon0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <rng model='virtio'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <alias name='rng0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <label>system_u:system_r:svirt_t:s0:c213,c858</label>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c213,c858</imagelabel>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <label>+107:+107</label>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.285 187156 INFO nova.virt.libvirt.driver [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully detached device tap72a72180-9b from instance 949386c1-7dd6-4ddb-89b1-4762db3984dd from the live domain config.
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.286 187156 DEBUG nova.virt.libvirt.vif [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1269463687',display_name='tempest-TestNetworkBasicOps-server-1269463687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1269463687',id=148,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOb15Ob4ul8MpodhGZarbCweKGgvYK2lstTZD8GYJJKTsyQpwGF/vTqiZC3chWrEJoPIe/KWCY11saH4Ylt12BrIZXSM2HMbp8f9mTimcCH5bVdp5+9Dw9WAoetWuMTBEg==',key_name='tempest-TestNetworkBasicOps-1618817236',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:33:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-989w739i',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:33:37Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=949386c1-7dd6-4ddb-89b1-4762db3984dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.286 187156 DEBUG nova.network.os_vif_util [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.287 187156 DEBUG nova.network.os_vif_util [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.287 187156 DEBUG os_vif [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.288 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.289 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72a72180-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.290 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.292 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:35:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:11.294 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:d0:78 10.100.0.42'], port_security=['fa:16:3e:ef:d0:78 10.100.0.42'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.42/28', 'neutron:device_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9eacd38c-2429-49d2-8f36-a26cde18927c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '046111b1-8479-4ebb-8db5-573e164c575e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f48ae9f-8a38-4b48-bc21-79e7d8b72694, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=72a72180-9b12-4a7f-9919-74bcdf0427b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:35:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:11.295 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 72a72180-9b12-4a7f-9919-74bcdf0427b7 in datapath 9eacd38c-2429-49d2-8f36-a26cde18927c unbound from our chassis
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.297 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:35:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:11.297 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9eacd38c-2429-49d2-8f36-a26cde18927c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.298 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.301 187156 INFO os_vif [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b')
Nov 29 02:35:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:11.298 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3f6ce2-9ac2-4ea4-a62e-004ad0219565]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:35:11 np0005539504 nova_compute[187152]: 2025-11-29 07:35:11.302 187156 DEBUG nova.virt.libvirt.guest [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:name>tempest-TestNetworkBasicOps-server-1269463687</nova:name>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:35:11</nova:creationTime>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    <nova:port uuid="c31397b6-7050-41a8-be27-e6dce253a00f">
Nov 29 02:35:11 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:11 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:35:11 np0005539504 nova_compute[187152]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:35:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:11.302 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c namespace which is not needed anymore#033[00m
Nov 29 02:35:11 np0005539504 neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c[242674]: [NOTICE]   (242678) : haproxy version is 2.8.14-c23fe91
Nov 29 02:35:11 np0005539504 neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c[242674]: [NOTICE]   (242678) : path to executable is /usr/sbin/haproxy
Nov 29 02:35:11 np0005539504 neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c[242674]: [WARNING]  (242678) : Exiting Master process...
Nov 29 02:35:11 np0005539504 neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c[242674]: [ALERT]    (242678) : Current worker (242680) exited with code 143 (Terminated)
Nov 29 02:35:11 np0005539504 neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c[242674]: [WARNING]  (242678) : All workers exited. Exiting... (0)
Nov 29 02:35:11 np0005539504 systemd[1]: libpod-5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed.scope: Deactivated successfully.
Nov 29 02:35:11 np0005539504 podman[242713]: 2025-11-29 07:35:11.462924853 +0000 UTC m=+0.054259983 container died 5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:35:12 np0005539504 nova_compute[187152]: 2025-11-29 07:35:12.067 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:12 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed-userdata-shm.mount: Deactivated successfully.
Nov 29 02:35:12 np0005539504 systemd[1]: var-lib-containers-storage-overlay-8c4f4b3e70cdfebe1ea9925a082e3fc0c27ed067f20238d80e152ed7141ca21d-merged.mount: Deactivated successfully.
Nov 29 02:35:12 np0005539504 podman[242713]: 2025-11-29 07:35:12.955002662 +0000 UTC m=+1.546337792 container cleanup 5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:35:12 np0005539504 systemd[1]: libpod-conmon-5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed.scope: Deactivated successfully.
Nov 29 02:35:13 np0005539504 podman[242744]: 2025-11-29 07:35:13.291264603 +0000 UTC m=+0.304988564 container remove 5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:35:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:13.297 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd55993-291f-4caa-ade1-dd1be8ffb5a8]: (4, ('Sat Nov 29 07:35:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c (5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed)\n5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed\nSat Nov 29 07:35:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c (5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed)\n5eea1a0aca5f4f88a47187c4133553ef611e2233ec28ff89cad3e038c36f99ed\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:13.299 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[789f9601-c9c7-4389-b75c-fa2a49b3defd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:13.301 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9eacd38c-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:13 np0005539504 kernel: tap9eacd38c-20: left promiscuous mode
Nov 29 02:35:13 np0005539504 nova_compute[187152]: 2025-11-29 07:35:13.303 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:13 np0005539504 nova_compute[187152]: 2025-11-29 07:35:13.314 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:13 np0005539504 nova_compute[187152]: 2025-11-29 07:35:13.315 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:13.317 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[28c10834-4166-4153-929f-044a7e6aaaa6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:13.331 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6c49c180-071a-422d-a425-38b74cfc0af5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:13.332 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[62340b82-4fed-48e6-892e-352b7d77d21d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:13.347 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b7537bbe-e4cf-4854-9305-bd99981ed7f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 717069, 'reachable_time': 20387, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242758, 'error': None, 'target': 'ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:13.349 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9eacd38c-2429-49d2-8f36-a26cde18927c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:35:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:13.350 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[10f94582-deac-4141-87a8-998330627a78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:13 np0005539504 systemd[1]: run-netns-ovnmeta\x2d9eacd38c\x2d2429\x2d49d2\x2d8f36\x2da26cde18927c.mount: Deactivated successfully.
Nov 29 02:35:13 np0005539504 nova_compute[187152]: 2025-11-29 07:35:13.946 187156 DEBUG nova.compute.manager [req-f40ee25d-8b71-4cd2-8676-62867c79d57f req-27e70753-f7f7-43ff-a067-72e4a4382286 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-unplugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:13 np0005539504 nova_compute[187152]: 2025-11-29 07:35:13.946 187156 DEBUG oslo_concurrency.lockutils [req-f40ee25d-8b71-4cd2-8676-62867c79d57f req-27e70753-f7f7-43ff-a067-72e4a4382286 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:13 np0005539504 nova_compute[187152]: 2025-11-29 07:35:13.946 187156 DEBUG oslo_concurrency.lockutils [req-f40ee25d-8b71-4cd2-8676-62867c79d57f req-27e70753-f7f7-43ff-a067-72e4a4382286 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:13 np0005539504 nova_compute[187152]: 2025-11-29 07:35:13.946 187156 DEBUG oslo_concurrency.lockutils [req-f40ee25d-8b71-4cd2-8676-62867c79d57f req-27e70753-f7f7-43ff-a067-72e4a4382286 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:13 np0005539504 nova_compute[187152]: 2025-11-29 07:35:13.946 187156 DEBUG nova.compute.manager [req-f40ee25d-8b71-4cd2-8676-62867c79d57f req-27e70753-f7f7-43ff-a067-72e4a4382286 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] No waiting events found dispatching network-vif-unplugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:35:13 np0005539504 nova_compute[187152]: 2025-11-29 07:35:13.947 187156 WARNING nova.compute.manager [req-f40ee25d-8b71-4cd2-8676-62867c79d57f req-27e70753-f7f7-43ff-a067-72e4a4382286 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received unexpected event network-vif-unplugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:35:15 np0005539504 podman[242759]: 2025-11-29 07:35:15.764291839 +0000 UTC m=+0.101328033 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:35:15 np0005539504 nova_compute[187152]: 2025-11-29 07:35:15.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.292 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.675 187156 DEBUG nova.compute.manager [req-77b0c4d5-d38d-4964-b3d1-d89b7d75e57b req-33fe7f5b-b29e-4902-aad1-278a3f5b422d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-plugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.676 187156 DEBUG oslo_concurrency.lockutils [req-77b0c4d5-d38d-4964-b3d1-d89b7d75e57b req-33fe7f5b-b29e-4902-aad1-278a3f5b422d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.677 187156 DEBUG oslo_concurrency.lockutils [req-77b0c4d5-d38d-4964-b3d1-d89b7d75e57b req-33fe7f5b-b29e-4902-aad1-278a3f5b422d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.677 187156 DEBUG oslo_concurrency.lockutils [req-77b0c4d5-d38d-4964-b3d1-d89b7d75e57b req-33fe7f5b-b29e-4902-aad1-278a3f5b422d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.677 187156 DEBUG nova.compute.manager [req-77b0c4d5-d38d-4964-b3d1-d89b7d75e57b req-33fe7f5b-b29e-4902-aad1-278a3f5b422d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] No waiting events found dispatching network-vif-plugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.678 187156 WARNING nova.compute.manager [req-77b0c4d5-d38d-4964-b3d1-d89b7d75e57b req-33fe7f5b-b29e-4902-aad1-278a3f5b422d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received unexpected event network-vif-plugged-72a72180-9b12-4a7f-9919-74bcdf0427b7 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.866 187156 DEBUG oslo_concurrency.lockutils [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.866 187156 DEBUG oslo_concurrency.lockutils [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.867 187156 DEBUG nova.network.neutron [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.876 187156 DEBUG nova.compute.manager [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-deleted-72a72180-9b12-4a7f-9919-74bcdf0427b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.876 187156 INFO nova.compute.manager [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Neutron deleted interface 72a72180-9b12-4a7f-9919-74bcdf0427b7; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:35:16 np0005539504 nova_compute[187152]: 2025-11-29 07:35:16.877 187156 DEBUG nova.network.neutron [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.069 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.308 187156 DEBUG nova.objects.instance [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'system_metadata' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.695 187156 DEBUG nova.objects.instance [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'flavor' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.757 187156 DEBUG nova.virt.libvirt.vif [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1269463687',display_name='tempest-TestNetworkBasicOps-server-1269463687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1269463687',id=148,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOb15Ob4ul8MpodhGZarbCweKGgvYK2lstTZD8GYJJKTsyQpwGF/vTqiZC3chWrEJoPIe/KWCY11saH4Ylt12BrIZXSM2HMbp8f9mTimcCH5bVdp5+9Dw9WAoetWuMTBEg==',key_name='tempest-TestNetworkBasicOps-1618817236',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:33:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-989w739i',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:33:37Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=949386c1-7dd6-4ddb-89b1-4762db3984dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.757 187156 DEBUG nova.network.os_vif_util [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.758 187156 DEBUG nova.network.os_vif_util [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.761 187156 DEBUG nova.virt.libvirt.guest [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.764 187156 DEBUG nova.virt.libvirt.guest [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface>not found in domain: <domain type='kvm' id='77'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <name>instance-00000094</name>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <uuid>949386c1-7dd6-4ddb-89b1-4762db3984dd</uuid>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:name>tempest-TestNetworkBasicOps-server-1269463687</nova:name>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:35:11</nova:creationTime>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:port uuid="c31397b6-7050-41a8-be27-e6dce253a00f">
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:35:17 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <memory unit='KiB'>131072</memory>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <resource>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <partition>/machine</partition>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </resource>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <sysinfo type='smbios'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='serial'>949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='uuid'>949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <boot dev='hd'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <smbios mode='sysinfo'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <vmcoreinfo state='on'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <feature policy='require' name='x2apic'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <feature policy='require' name='vme'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <clock offset='utc'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <timer name='hpet' present='no'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <on_reboot>restart</on_reboot>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <on_crash>destroy</on_crash>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <disk type='file' device='disk'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk' index='2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <backingStore type='file' index='3'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:        <format type='raw'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:        <backingStore/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      </backingStore>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target dev='vda' bus='virtio'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='virtio-disk0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <disk type='file' device='cdrom'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.config' index='1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <backingStore/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target dev='sda' bus='sata'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <readonly/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='sata0-0-0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pcie.0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='1' port='0x10'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='2' port='0x11'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='3' port='0x12'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.3'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='4' port='0x13'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.4'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='5' port='0x14'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.5'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='6' port='0x15'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.6'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='7' port='0x16'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.7'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='8' port='0x17'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.8'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='9' port='0x18'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.9'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='10' port='0x19'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.10'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='11' port='0x1a'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.11'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='12' port='0x1b'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.12'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='13' port='0x1c'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.13'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='14' port='0x1d'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.14'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='15' port='0x1e'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.15'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='16' port='0x1f'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.16'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='17' port='0x20'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.17'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='18' port='0x21'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.18'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='19' port='0x22'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.19'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='20' port='0x23'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.20'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='21' port='0x24'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.21'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='22' port='0x25'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.22'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='23' port='0x26'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.23'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='24' port='0x27'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.24'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='25' port='0x28'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.25'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-pci-bridge'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.26'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='usb'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='sata' index='0'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='ide'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <interface type='ethernet'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <mac address='fa:16:3e:37:7b:df'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target dev='tapc31397b6-70'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model type='virtio'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <mtu size='1442'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='net0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <serial type='pty'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log' append='off'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target type='isa-serial' port='0'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:        <model name='isa-serial'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      </target>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log' append='off'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target type='serial' port='0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </console>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <input type='tablet' bus='usb'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='input0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <input type='mouse' bus='ps2'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='input1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <input type='keyboard' bus='ps2'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='input2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <listen type='address' address='::0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <audio id='1' type='none'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='video0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <watchdog model='itco' action='reset'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='watchdog0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </watchdog>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <memballoon model='virtio'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <stats period='10'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='balloon0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <rng model='virtio'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='rng0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <label>system_u:system_r:svirt_t:s0:c213,c858</label>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c213,c858</imagelabel>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <label>+107:+107</label>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:35:17 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:35:17 np0005539504 nova_compute[187152]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.766 187156 DEBUG nova.virt.libvirt.guest [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.771 187156 DEBUG nova.virt.libvirt.guest [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ef:d0:78"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap72a72180-9b"/></interface>not found in domain: <domain type='kvm' id='77'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <name>instance-00000094</name>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <uuid>949386c1-7dd6-4ddb-89b1-4762db3984dd</uuid>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:name>tempest-TestNetworkBasicOps-server-1269463687</nova:name>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:35:11</nova:creationTime>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:port uuid="c31397b6-7050-41a8-be27-e6dce253a00f">
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:35:17 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <memory unit='KiB'>131072</memory>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <resource>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <partition>/machine</partition>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </resource>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <sysinfo type='smbios'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='serial'>949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='uuid'>949386c1-7dd6-4ddb-89b1-4762db3984dd</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <boot dev='hd'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <smbios mode='sysinfo'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <vmcoreinfo state='on'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <feature policy='require' name='x2apic'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <feature policy='require' name='vme'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <clock offset='utc'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <timer name='hpet' present='no'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <on_reboot>restart</on_reboot>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <on_crash>destroy</on_crash>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <disk type='file' device='disk'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk' index='2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <backingStore type='file' index='3'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:        <format type='raw'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:        <backingStore/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      </backingStore>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target dev='vda' bus='virtio'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='virtio-disk0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <disk type='file' device='cdrom'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <source file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/disk.config' index='1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <backingStore/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target dev='sda' bus='sata'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <readonly/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='sata0-0-0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pcie.0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='1' port='0x10'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='2' port='0x11'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='3' port='0x12'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.3'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='4' port='0x13'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.4'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='5' port='0x14'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.5'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='6' port='0x15'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.6'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='7' port='0x16'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.7'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='8' port='0x17'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.8'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='9' port='0x18'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.9'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='10' port='0x19'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.10'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='11' port='0x1a'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.11'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='12' port='0x1b'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.12'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='13' port='0x1c'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.13'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='14' port='0x1d'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.14'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='15' port='0x1e'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.15'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='16' port='0x1f'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.16'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='17' port='0x20'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.17'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='18' port='0x21'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.18'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='19' port='0x22'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.19'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='20' port='0x23'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.20'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='21' port='0x24'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.21'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='22' port='0x25'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.22'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='23' port='0x26'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.23'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='24' port='0x27'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.24'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-root-port'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target chassis='25' port='0x28'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.25'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model name='pcie-pci-bridge'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='pci.26'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='usb'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <controller type='sata' index='0'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='ide'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </controller>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <interface type='ethernet'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <mac address='fa:16:3e:37:7b:df'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target dev='tapc31397b6-70'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model type='virtio'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <mtu size='1442'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='net0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <serial type='pty'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log' append='off'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target type='isa-serial' port='0'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:        <model name='isa-serial'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      </target>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <source path='/dev/pts/0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <log file='/var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd/console.log' append='off'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <target type='serial' port='0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='serial0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </console>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <input type='tablet' bus='usb'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='input0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <input type='mouse' bus='ps2'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='input1'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <input type='keyboard' bus='ps2'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='input2'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </input>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <listen type='address' address='::0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </graphics>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <audio id='1' type='none'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='video0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <watchdog model='itco' action='reset'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='watchdog0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </watchdog>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <memballoon model='virtio'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <stats period='10'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='balloon0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <rng model='virtio'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <alias name='rng0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <label>system_u:system_r:svirt_t:s0:c213,c858</label>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c213,c858</imagelabel>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <label>+107:+107</label>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </seclabel>
Nov 29 02:35:17 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:35:17 np0005539504 nova_compute[187152]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.772 187156 WARNING nova.virt.libvirt.driver [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Detaching interface fa:16:3e:ef:d0:78 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap72a72180-9b' not found.
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.774 187156 DEBUG nova.virt.libvirt.vif [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1269463687',display_name='tempest-TestNetworkBasicOps-server-1269463687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1269463687',id=148,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOb15Ob4ul8MpodhGZarbCweKGgvYK2lstTZD8GYJJKTsyQpwGF/vTqiZC3chWrEJoPIe/KWCY11saH4Ylt12BrIZXSM2HMbp8f9mTimcCH5bVdp5+9Dw9WAoetWuMTBEg==',key_name='tempest-TestNetworkBasicOps-1618817236',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:33:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-989w739i',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:33:37Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=949386c1-7dd6-4ddb-89b1-4762db3984dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.775 187156 DEBUG nova.network.os_vif_util [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "address": "fa:16:3e:ef:d0:78", "network": {"id": "9eacd38c-2429-49d2-8f36-a26cde18927c", "bridge": "br-int", "label": "tempest-network-smoke--146637774", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.42", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72a72180-9b", "ovs_interfaceid": "72a72180-9b12-4a7f-9919-74bcdf0427b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.776 187156 DEBUG nova.network.os_vif_util [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.776 187156 DEBUG os_vif [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.779 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.780 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72a72180-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.781 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.784 187156 INFO os_vif [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:d0:78,bridge_name='br-int',has_traffic_filtering=True,id=72a72180-9b12-4a7f-9919-74bcdf0427b7,network=Network(9eacd38c-2429-49d2-8f36-a26cde18927c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72a72180-9b')#033[00m
Nov 29 02:35:17 np0005539504 nova_compute[187152]: 2025-11-29 07:35:17.786 187156 DEBUG nova.virt.libvirt.guest [req-737b1c7c-0f04-42a2-9ad1-7e734a7460ab req-598a4041-283b-4f1c-ad60-faea48c6eb9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:name>tempest-TestNetworkBasicOps-server-1269463687</nova:name>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:creationTime>2025-11-29 07:35:17</nova:creationTime>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:flavor name="m1.nano">
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:memory>128</nova:memory>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:disk>1</nova:disk>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:swap>0</nova:swap>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </nova:flavor>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:owner>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </nova:owner>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  <nova:ports>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    <nova:port uuid="c31397b6-7050-41a8-be27-e6dce253a00f">
Nov 29 02:35:17 np0005539504 nova_compute[187152]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:    </nova:port>
Nov 29 02:35:17 np0005539504 nova_compute[187152]:  </nova:ports>
Nov 29 02:35:17 np0005539504 nova_compute[187152]: </nova:instance>
Nov 29 02:35:17 np0005539504 nova_compute[187152]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:35:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:18.176 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:35:18 np0005539504 nova_compute[187152]: 2025-11-29 07:35:18.176 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:18.177 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:35:18 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:18Z|00598|binding|INFO|Releasing lport 7ec9f3ca-4f03-4ca2-a4be-c55f9239510e from this chassis (sb_readonly=0)
Nov 29 02:35:18 np0005539504 nova_compute[187152]: 2025-11-29 07:35:18.927 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:19 np0005539504 nova_compute[187152]: 2025-11-29 07:35:19.589 187156 INFO nova.network.neutron [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Port 72a72180-9b12-4a7f-9919-74bcdf0427b7 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 29 02:35:19 np0005539504 nova_compute[187152]: 2025-11-29 07:35:19.591 187156 DEBUG nova.network.neutron [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:35:19 np0005539504 nova_compute[187152]: 2025-11-29 07:35:19.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:19 np0005539504 nova_compute[187152]: 2025-11-29 07:35:19.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:35:20 np0005539504 nova_compute[187152]: 2025-11-29 07:35:20.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:21 np0005539504 nova_compute[187152]: 2025-11-29 07:35:21.296 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:21 np0005539504 podman[242779]: 2025-11-29 07:35:21.722651049 +0000 UTC m=+0.062533850 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:35:22 np0005539504 nova_compute[187152]: 2025-11-29 07:35:22.071 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:22 np0005539504 nova_compute[187152]: 2025-11-29 07:35:22.720 187156 DEBUG oslo_concurrency.lockutils [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:35:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:22.980 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:22.981 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:22.981 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:23 np0005539504 nova_compute[187152]: 2025-11-29 07:35:23.036 187156 DEBUG oslo_concurrency.lockutils [None req-8285e379-ce95-4112-a442-d7df383fc375 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "interface-949386c1-7dd6-4ddb-89b1-4762db3984dd-72a72180-9b12-4a7f-9919-74bcdf0427b7" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 11.973s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:23.180 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:24 np0005539504 nova_compute[187152]: 2025-11-29 07:35:24.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:24 np0005539504 nova_compute[187152]: 2025-11-29 07:35:24.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:35:24 np0005539504 nova_compute[187152]: 2025-11-29 07:35:24.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:35:25 np0005539504 nova_compute[187152]: 2025-11-29 07:35:25.332 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:35:25 np0005539504 nova_compute[187152]: 2025-11-29 07:35:25.333 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:35:25 np0005539504 nova_compute[187152]: 2025-11-29 07:35:25.333 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:35:25 np0005539504 nova_compute[187152]: 2025-11-29 07:35:25.333 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:35:25 np0005539504 nova_compute[187152]: 2025-11-29 07:35:25.508 187156 DEBUG nova.compute.manager [req-e2f0eeaf-2ae9-4c15-bd8b-05797ca13189 req-e5e66291-c45d-4e02-b35b-768d6f2d6a6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-changed-c31397b6-7050-41a8-be27-e6dce253a00f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:25 np0005539504 nova_compute[187152]: 2025-11-29 07:35:25.508 187156 DEBUG nova.compute.manager [req-e2f0eeaf-2ae9-4c15-bd8b-05797ca13189 req-e5e66291-c45d-4e02-b35b-768d6f2d6a6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Refreshing instance network info cache due to event network-changed-c31397b6-7050-41a8-be27-e6dce253a00f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:35:25 np0005539504 nova_compute[187152]: 2025-11-29 07:35:25.509 187156 DEBUG oslo_concurrency.lockutils [req-e2f0eeaf-2ae9-4c15-bd8b-05797ca13189 req-e5e66291-c45d-4e02-b35b-768d6f2d6a6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.342 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.576 187156 DEBUG oslo_concurrency.lockutils [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.577 187156 DEBUG oslo_concurrency.lockutils [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.577 187156 DEBUG oslo_concurrency.lockutils [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.577 187156 DEBUG oslo_concurrency.lockutils [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.578 187156 DEBUG oslo_concurrency.lockutils [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.604 187156 INFO nova.compute.manager [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Terminating instance#033[00m
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.736 187156 DEBUG nova.compute.manager [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:35:26 np0005539504 kernel: tapc31397b6-70 (unregistering): left promiscuous mode
Nov 29 02:35:26 np0005539504 NetworkManager[55210]: <info>  [1764401726.7670] device (tapc31397b6-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:35:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:26Z|00599|binding|INFO|Releasing lport c31397b6-7050-41a8-be27-e6dce253a00f from this chassis (sb_readonly=0)
Nov 29 02:35:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:26Z|00600|binding|INFO|Setting lport c31397b6-7050-41a8-be27-e6dce253a00f down in Southbound
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.772 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:35:26Z|00601|binding|INFO|Removing iface tapc31397b6-70 ovn-installed in OVS
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.774 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:26 np0005539504 nova_compute[187152]: 2025-11-29 07:35:26.790 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:26 np0005539504 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000094.scope: Deactivated successfully.
Nov 29 02:35:26 np0005539504 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000094.scope: Consumed 18.448s CPU time.
Nov 29 02:35:26 np0005539504 systemd-machined[153423]: Machine qemu-77-instance-00000094 terminated.
Nov 29 02:35:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:26.871 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:7b:df 10.100.0.3'], port_security=['fa:16:3e:37:7b:df 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '949386c1-7dd6-4ddb-89b1-4762db3984dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb20b368-92db-4c1f-add7-febda0cc12c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6630ec7d-a4a7-4251-aa89-7d94b5d92822', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d948d8aa-7d6c-4897-8da2-928d861f5745, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=c31397b6-7050-41a8-be27-e6dce253a00f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:35:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:26.873 104164 INFO neutron.agent.ovn.metadata.agent [-] Port c31397b6-7050-41a8-be27-e6dce253a00f in datapath fb20b368-92db-4c1f-add7-febda0cc12c8 unbound from our chassis#033[00m
Nov 29 02:35:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:26.874 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb20b368-92db-4c1f-add7-febda0cc12c8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:35:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:26.875 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[70ea8741-3230-4bc3-b1de-d4298318a745]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:26.876 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8 namespace which is not needed anymore#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.004 187156 INFO nova.virt.libvirt.driver [-] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Instance destroyed successfully.#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.005 187156 DEBUG nova.objects.instance [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 949386c1-7dd6-4ddb-89b1-4762db3984dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.073 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.118 187156 DEBUG nova.virt.libvirt.vif [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:33:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1269463687',display_name='tempest-TestNetworkBasicOps-server-1269463687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1269463687',id=148,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOb15Ob4ul8MpodhGZarbCweKGgvYK2lstTZD8GYJJKTsyQpwGF/vTqiZC3chWrEJoPIe/KWCY11saH4Ylt12BrIZXSM2HMbp8f9mTimcCH5bVdp5+9Dw9WAoetWuMTBEg==',key_name='tempest-TestNetworkBasicOps-1618817236',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:33:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-989w739i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:33:37Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=949386c1-7dd6-4ddb-89b1-4762db3984dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.118 187156 DEBUG nova.network.os_vif_util [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.120 187156 DEBUG nova.network.os_vif_util [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:7b:df,bridge_name='br-int',has_traffic_filtering=True,id=c31397b6-7050-41a8-be27-e6dce253a00f,network=Network(fb20b368-92db-4c1f-add7-febda0cc12c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc31397b6-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.120 187156 DEBUG os_vif [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:7b:df,bridge_name='br-int',has_traffic_filtering=True,id=c31397b6-7050-41a8-be27-e6dce253a00f,network=Network(fb20b368-92db-4c1f-add7-febda0cc12c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc31397b6-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.123 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.125 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc31397b6-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.127 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.128 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.131 187156 INFO os_vif [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:7b:df,bridge_name='br-int',has_traffic_filtering=True,id=c31397b6-7050-41a8-be27-e6dce253a00f,network=Network(fb20b368-92db-4c1f-add7-febda0cc12c8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc31397b6-70')#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.132 187156 INFO nova.virt.libvirt.driver [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Deleting instance files /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd_del#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.132 187156 INFO nova.virt.libvirt.driver [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Deletion of /var/lib/nova/instances/949386c1-7dd6-4ddb-89b1-4762db3984dd_del complete#033[00m
Nov 29 02:35:27 np0005539504 neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8[242074]: [NOTICE]   (242078) : haproxy version is 2.8.14-c23fe91
Nov 29 02:35:27 np0005539504 neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8[242074]: [NOTICE]   (242078) : path to executable is /usr/sbin/haproxy
Nov 29 02:35:27 np0005539504 neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8[242074]: [WARNING]  (242078) : Exiting Master process...
Nov 29 02:35:27 np0005539504 neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8[242074]: [WARNING]  (242078) : Exiting Master process...
Nov 29 02:35:27 np0005539504 neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8[242074]: [ALERT]    (242078) : Current worker (242080) exited with code 143 (Terminated)
Nov 29 02:35:27 np0005539504 neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8[242074]: [WARNING]  (242078) : All workers exited. Exiting... (0)
Nov 29 02:35:27 np0005539504 systemd[1]: libpod-313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a.scope: Deactivated successfully.
Nov 29 02:35:27 np0005539504 podman[242825]: 2025-11-29 07:35:27.175511983 +0000 UTC m=+0.209183323 container died 313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 02:35:27 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a-userdata-shm.mount: Deactivated successfully.
Nov 29 02:35:27 np0005539504 systemd[1]: var-lib-containers-storage-overlay-3038d3eec37997db500d51ddf12631cd2e78c0949eba3dd0dc01b9d0c38ff7a3-merged.mount: Deactivated successfully.
Nov 29 02:35:27 np0005539504 podman[242825]: 2025-11-29 07:35:27.416016133 +0000 UTC m=+0.449687473 container cleanup 313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:35:27 np0005539504 systemd[1]: libpod-conmon-313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a.scope: Deactivated successfully.
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.496 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.505 187156 INFO nova.compute.manager [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.507 187156 DEBUG oslo.service.loopingcall [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.507 187156 DEBUG nova.compute.manager [-] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.507 187156 DEBUG nova.network.neutron [-] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:35:27 np0005539504 podman[242867]: 2025-11-29 07:35:27.666152916 +0000 UTC m=+0.220920441 container remove 313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:35:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:27.673 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[75437d0c-c029-43a9-a526-f0cb2936bc2b]: (4, ('Sat Nov 29 07:35:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8 (313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a)\n313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a\nSat Nov 29 07:35:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8 (313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a)\n313c473c81e908ec84435f6749dfaaccf841a30e767a19a60862219f905d941a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:27.676 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0c1c228a-7afb-4c82-8681-a17374bbcacf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:27.677 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb20b368-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.680 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:27 np0005539504 kernel: tapfb20b368-90: left promiscuous mode
Nov 29 02:35:27 np0005539504 nova_compute[187152]: 2025-11-29 07:35:27.692 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:27.697 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d70753ca-f501-4872-bc45-e4a45dca56ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:27.712 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[43b2bf31-6491-4082-9269-11bbb1876390]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:27.714 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4234d386-8292-4441-a8bd-8b912031172d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:27.734 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[65ee70af-dc45-4879-896d-3f5def66ddd6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 707934, 'reachable_time': 21634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242884, 'error': None, 'target': 'ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:27.737 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fb20b368-92db-4c1f-add7-febda0cc12c8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:35:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:35:27.737 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[39f4dc08-f40a-4290-95b4-b78d15e0c6df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:27 np0005539504 systemd[1]: run-netns-ovnmeta\x2dfb20b368\x2d92db\x2d4c1f\x2dadd7\x2dfebda0cc12c8.mount: Deactivated successfully.
Nov 29 02:35:28 np0005539504 nova_compute[187152]: 2025-11-29 07:35:28.439 187156 DEBUG nova.compute.manager [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-unplugged-c31397b6-7050-41a8-be27-e6dce253a00f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:28 np0005539504 nova_compute[187152]: 2025-11-29 07:35:28.440 187156 DEBUG oslo_concurrency.lockutils [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:28 np0005539504 nova_compute[187152]: 2025-11-29 07:35:28.440 187156 DEBUG oslo_concurrency.lockutils [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:28 np0005539504 nova_compute[187152]: 2025-11-29 07:35:28.441 187156 DEBUG oslo_concurrency.lockutils [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:28 np0005539504 nova_compute[187152]: 2025-11-29 07:35:28.441 187156 DEBUG nova.compute.manager [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] No waiting events found dispatching network-vif-unplugged-c31397b6-7050-41a8-be27-e6dce253a00f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:35:28 np0005539504 nova_compute[187152]: 2025-11-29 07:35:28.442 187156 DEBUG nova.compute.manager [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-unplugged-c31397b6-7050-41a8-be27-e6dce253a00f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.120 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [{"id": "c31397b6-7050-41a8-be27-e6dce253a00f", "address": "fa:16:3e:37:7b:df", "network": {"id": "fb20b368-92db-4c1f-add7-febda0cc12c8", "bridge": "br-int", "label": "tempest-network-smoke--352451708", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc31397b6-70", "ovs_interfaceid": "c31397b6-7050-41a8-be27-e6dce253a00f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.262 187156 DEBUG nova.network.neutron [-] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.336 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.338 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.338 187156 DEBUG oslo_concurrency.lockutils [req-e2f0eeaf-2ae9-4c15-bd8b-05797ca13189 req-e5e66291-c45d-4e02-b35b-768d6f2d6a6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.339 187156 DEBUG nova.network.neutron [req-e2f0eeaf-2ae9-4c15-bd8b-05797ca13189 req-e5e66291-c45d-4e02-b35b-768d6f2d6a6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Refreshing network info cache for port c31397b6-7050-41a8-be27-e6dce253a00f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.340 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.341 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.343 187156 INFO nova.compute.manager [-] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Took 2.84 seconds to deallocate network for instance.#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.654 187156 DEBUG nova.compute.manager [req-03c1ef55-83d6-4c24-bcf9-eba0de5299e6 req-f38082c1-ca61-4841-935a-b71fcba3f564 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-deleted-c31397b6-7050-41a8-be27-e6dce253a00f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.655 187156 DEBUG nova.compute.manager [req-00c54c7b-1677-4444-923b-001a5fb2433d req-3f12330c-717f-4d04-9882-8971bb211612 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received event network-vif-plugged-c31397b6-7050-41a8-be27-e6dce253a00f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.655 187156 DEBUG oslo_concurrency.lockutils [req-00c54c7b-1677-4444-923b-001a5fb2433d req-3f12330c-717f-4d04-9882-8971bb211612 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.656 187156 DEBUG oslo_concurrency.lockutils [req-00c54c7b-1677-4444-923b-001a5fb2433d req-3f12330c-717f-4d04-9882-8971bb211612 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.656 187156 DEBUG oslo_concurrency.lockutils [req-00c54c7b-1677-4444-923b-001a5fb2433d req-3f12330c-717f-4d04-9882-8971bb211612 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.656 187156 DEBUG nova.compute.manager [req-00c54c7b-1677-4444-923b-001a5fb2433d req-3f12330c-717f-4d04-9882-8971bb211612 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] No waiting events found dispatching network-vif-plugged-c31397b6-7050-41a8-be27-e6dce253a00f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.656 187156 WARNING nova.compute.manager [req-00c54c7b-1677-4444-923b-001a5fb2433d req-3f12330c-717f-4d04-9882-8971bb211612 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Received unexpected event network-vif-plugged-c31397b6-7050-41a8-be27-e6dce253a00f for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.690 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.690 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.690 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.691 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.705 187156 DEBUG nova.network.neutron [req-e2f0eeaf-2ae9-4c15-bd8b-05797ca13189 req-e5e66291-c45d-4e02-b35b-768d6f2d6a6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.869 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.870 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5701MB free_disk=73.07479858398438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.870 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:30 np0005539504 nova_compute[187152]: 2025-11-29 07:35:30.871 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.061 187156 DEBUG oslo_concurrency.lockutils [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.138 187156 WARNING nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 949386c1-7dd6-4ddb-89b1-4762db3984dd is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.138 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.138 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.171 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.205 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.205 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.244 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.296 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.395 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.410 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.436 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.437 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.437 187156 DEBUG oslo_concurrency.lockutils [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.441 187156 DEBUG oslo_concurrency.lockutils [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.481 187156 INFO nova.scheduler.client.report [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 949386c1-7dd6-4ddb-89b1-4762db3984dd#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.581 187156 DEBUG nova.network.neutron [req-e2f0eeaf-2ae9-4c15-bd8b-05797ca13189 req-e5e66291-c45d-4e02-b35b-768d6f2d6a6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.605 187156 DEBUG oslo_concurrency.lockutils [req-e2f0eeaf-2ae9-4c15-bd8b-05797ca13189 req-e5e66291-c45d-4e02-b35b-768d6f2d6a6c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-949386c1-7dd6-4ddb-89b1-4762db3984dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:35:31 np0005539504 nova_compute[187152]: 2025-11-29 07:35:31.637 187156 DEBUG oslo_concurrency.lockutils [None req-8272683b-c58b-494a-ab71-94ee3c62992d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "949386c1-7dd6-4ddb-89b1-4762db3984dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.075 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.127 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.272 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.272 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.288 187156 DEBUG nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.441 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.442 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.447 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.448 187156 INFO nova.compute.claims [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.946 187156 DEBUG nova.compute.provider_tree [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:35:32 np0005539504 nova_compute[187152]: 2025-11-29 07:35:32.966 187156 DEBUG nova.scheduler.client.report [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.028 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.029 187156 DEBUG nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.033 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.130 187156 DEBUG nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.130 187156 DEBUG nova.network.neutron [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.150 187156 INFO nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.177 187156 DEBUG nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.383 187156 DEBUG nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.384 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.385 187156 INFO nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Creating image(s)#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.386 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "/var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.386 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "/var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.387 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "/var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.406 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.474 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.475 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.476 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.487 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.546 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.547 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.617 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk 1073741824" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.618 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.619 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.645 187156 DEBUG nova.policy [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0db44d5b07ab4caf927626f539adc8cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a82f08b594db4c92b19594b91420a641', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.681 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.682 187156 DEBUG nova.virt.disk.api [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Checking if we can resize image /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.683 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.746 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.747 187156 DEBUG nova.virt.disk.api [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Cannot resize image /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.748 187156 DEBUG nova.objects.instance [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'migration_context' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.795 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.796 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Ensure instance console log exists: /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.797 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.797 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:33 np0005539504 nova_compute[187152]: 2025-11-29 07:35:33.798 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:34 np0005539504 nova_compute[187152]: 2025-11-29 07:35:34.586 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:34 np0005539504 podman[242903]: 2025-11-29 07:35:34.729139221 +0000 UTC m=+0.056663369 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:35:34 np0005539504 podman[242902]: 2025-11-29 07:35:34.730206511 +0000 UTC m=+0.067258648 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64)
Nov 29 02:35:34 np0005539504 podman[242901]: 2025-11-29 07:35:34.750234085 +0000 UTC m=+0.088824773 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:35:35 np0005539504 nova_compute[187152]: 2025-11-29 07:35:35.923 187156 DEBUG nova.network.neutron [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Successfully created port: 40ceff87-ff89-4058-a1a5-020625a1887a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:35:37 np0005539504 nova_compute[187152]: 2025-11-29 07:35:37.079 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:37 np0005539504 nova_compute[187152]: 2025-11-29 07:35:37.129 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:38 np0005539504 podman[242963]: 2025-11-29 07:35:38.719228474 +0000 UTC m=+0.057439461 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:35:38 np0005539504 podman[242964]: 2025-11-29 07:35:38.780771885 +0000 UTC m=+0.109590846 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:35:40 np0005539504 nova_compute[187152]: 2025-11-29 07:35:40.131 187156 DEBUG nova.network.neutron [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Successfully updated port: 40ceff87-ff89-4058-a1a5-020625a1887a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:35:40 np0005539504 nova_compute[187152]: 2025-11-29 07:35:40.166 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:35:40 np0005539504 nova_compute[187152]: 2025-11-29 07:35:40.166 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquired lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:35:40 np0005539504 nova_compute[187152]: 2025-11-29 07:35:40.167 187156 DEBUG nova.network.neutron [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:35:40 np0005539504 nova_compute[187152]: 2025-11-29 07:35:40.290 187156 DEBUG nova.compute.manager [req-070ed1a5-1609-4061-b880-6120e3c331a1 req-a218dcb8-836d-4316-9b30-792b7b060b08 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-changed-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:40 np0005539504 nova_compute[187152]: 2025-11-29 07:35:40.290 187156 DEBUG nova.compute.manager [req-070ed1a5-1609-4061-b880-6120e3c331a1 req-a218dcb8-836d-4316-9b30-792b7b060b08 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Refreshing instance network info cache due to event network-changed-40ceff87-ff89-4058-a1a5-020625a1887a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:35:40 np0005539504 nova_compute[187152]: 2025-11-29 07:35:40.291 187156 DEBUG oslo_concurrency.lockutils [req-070ed1a5-1609-4061-b880-6120e3c331a1 req-a218dcb8-836d-4316-9b30-792b7b060b08 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:35:41 np0005539504 nova_compute[187152]: 2025-11-29 07:35:41.116 187156 DEBUG nova.network.neutron [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:35:42 np0005539504 nova_compute[187152]: 2025-11-29 07:35:42.004 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401727.002237, 949386c1-7dd6-4ddb-89b1-4762db3984dd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:35:42 np0005539504 nova_compute[187152]: 2025-11-29 07:35:42.004 187156 INFO nova.compute.manager [-] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:35:42 np0005539504 nova_compute[187152]: 2025-11-29 07:35:42.080 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:42 np0005539504 nova_compute[187152]: 2025-11-29 07:35:42.132 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:43 np0005539504 nova_compute[187152]: 2025-11-29 07:35:43.399 187156 DEBUG nova.compute.manager [None req-59fb8dc4-dd8e-4b8a-8c6f-10945bf0f453 - - - - - -] [instance: 949386c1-7dd6-4ddb-89b1-4762db3984dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:35:46 np0005539504 podman[243015]: 2025-11-29 07:35:46.761794422 +0000 UTC m=+0.094676611 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 29 02:35:47 np0005539504 nova_compute[187152]: 2025-11-29 07:35:47.082 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:47 np0005539504 nova_compute[187152]: 2025-11-29 07:35:47.173 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:48 np0005539504 nova_compute[187152]: 2025-11-29 07:35:48.100 187156 DEBUG nova.network.neutron [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Updating instance_info_cache with network_info: [{"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:35:52 np0005539504 nova_compute[187152]: 2025-11-29 07:35:52.084 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:52 np0005539504 nova_compute[187152]: 2025-11-29 07:35:52.174 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:52 np0005539504 podman[243038]: 2025-11-29 07:35:52.773393318 +0000 UTC m=+0.098263799 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.008 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Releasing lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.009 187156 DEBUG nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Instance network_info: |[{"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.010 187156 DEBUG oslo_concurrency.lockutils [req-070ed1a5-1609-4061-b880-6120e3c331a1 req-a218dcb8-836d-4316-9b30-792b7b060b08 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.010 187156 DEBUG nova.network.neutron [req-070ed1a5-1609-4061-b880-6120e3c331a1 req-a218dcb8-836d-4316-9b30-792b7b060b08 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Refreshing network info cache for port 40ceff87-ff89-4058-a1a5-020625a1887a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.013 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Start _get_guest_xml network_info=[{"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.021 187156 WARNING nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.031 187156 DEBUG nova.virt.libvirt.host [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.032 187156 DEBUG nova.virt.libvirt.host [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.036 187156 DEBUG nova.virt.libvirt.host [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.037 187156 DEBUG nova.virt.libvirt.host [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.039 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.039 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.040 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.040 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.040 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.040 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.040 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.041 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.041 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.041 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.041 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.042 187156 DEBUG nova.virt.hardware [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.048 187156 DEBUG nova.virt.libvirt.vif [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-684634725',display_name='tempest-TestServerAdvancedOps-server-684634725',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-684634725',id=150,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a82f08b594db4c92b19594b91420a641',ramdisk_id='',reservation_id='r-9dxnjqtu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-403162681',owner_user_name='tempest-TestServerAdvancedOps-403162681-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:35:33Z,user_data=None,user_id='0db44d5b07ab4caf927626f539adc8cc',uuid=093053f4-2142-487d-9db6-8b83c9b91ed5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.049 187156 DEBUG nova.network.os_vif_util [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converting VIF {"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.050 187156 DEBUG nova.network.os_vif_util [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.052 187156 DEBUG nova.objects.instance [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'pci_devices' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.161 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <uuid>093053f4-2142-487d-9db6-8b83c9b91ed5</uuid>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <name>instance-00000096</name>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestServerAdvancedOps-server-684634725</nova:name>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:35:53</nova:creationTime>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:        <nova:user uuid="0db44d5b07ab4caf927626f539adc8cc">tempest-TestServerAdvancedOps-403162681-project-member</nova:user>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:        <nova:project uuid="a82f08b594db4c92b19594b91420a641">tempest-TestServerAdvancedOps-403162681</nova:project>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:        <nova:port uuid="40ceff87-ff89-4058-a1a5-020625a1887a">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <entry name="serial">093053f4-2142-487d-9db6-8b83c9b91ed5</entry>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <entry name="uuid">093053f4-2142-487d-9db6-8b83c9b91ed5</entry>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk.config"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:d5:f0:96"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <target dev="tap40ceff87-ff"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/console.log" append="off"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:35:53 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:35:53 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:35:53 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:35:53 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.162 187156 DEBUG nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Preparing to wait for external event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.162 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.163 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.163 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.164 187156 DEBUG nova.virt.libvirt.vif [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-684634725',display_name='tempest-TestServerAdvancedOps-server-684634725',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-684634725',id=150,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a82f08b594db4c92b19594b91420a641',ramdisk_id='',reservation_id='r-9dxnjqtu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-403162681',owner_user_name='tempest-TestServerAdvancedOps-403162681-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:35:33Z,user_data=None,user_id='0db44d5b07ab4caf927626f539adc8cc',uuid=093053f4-2142-487d-9db6-8b83c9b91ed5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.164 187156 DEBUG nova.network.os_vif_util [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converting VIF {"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.165 187156 DEBUG nova.network.os_vif_util [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.165 187156 DEBUG os_vif [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.166 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.166 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.167 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.173 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.174 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40ceff87-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.174 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap40ceff87-ff, col_values=(('external_ids', {'iface-id': '40ceff87-ff89-4058-a1a5-020625a1887a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:f0:96', 'vm-uuid': '093053f4-2142-487d-9db6-8b83c9b91ed5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.176 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:53 np0005539504 NetworkManager[55210]: <info>  [1764401753.1779] manager: (tap40ceff87-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.178 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.187 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:53 np0005539504 nova_compute[187152]: 2025-11-29 07:35:53.189 187156 INFO os_vif [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff')#033[00m
Nov 29 02:35:54 np0005539504 nova_compute[187152]: 2025-11-29 07:35:54.068 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:54 np0005539504 nova_compute[187152]: 2025-11-29 07:35:54.332 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:35:54 np0005539504 nova_compute[187152]: 2025-11-29 07:35:54.333 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:35:54 np0005539504 nova_compute[187152]: 2025-11-29 07:35:54.333 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] No VIF found with MAC fa:16:3e:d5:f0:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:35:54 np0005539504 nova_compute[187152]: 2025-11-29 07:35:54.334 187156 INFO nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Using config drive#033[00m
Nov 29 02:35:54 np0005539504 nova_compute[187152]: 2025-11-29 07:35:54.361 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:57 np0005539504 nova_compute[187152]: 2025-11-29 07:35:57.085 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:58 np0005539504 nova_compute[187152]: 2025-11-29 07:35:58.176 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.131 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.284 187156 INFO nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Creating config drive at /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk.config#033[00m
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.291 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6m07qiv2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.419 187156 DEBUG oslo_concurrency.processutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6m07qiv2" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:36:02 np0005539504 kernel: tap40ceff87-ff: entered promiscuous mode
Nov 29 02:36:02 np0005539504 NetworkManager[55210]: <info>  [1764401762.5216] manager: (tap40ceff87-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/269)
Nov 29 02:36:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:02Z|00602|binding|INFO|Claiming lport 40ceff87-ff89-4058-a1a5-020625a1887a for this chassis.
Nov 29 02:36:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:02Z|00603|binding|INFO|40ceff87-ff89-4058-a1a5-020625a1887a: Claiming fa:16:3e:d5:f0:96 10.100.0.2
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.521 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:02 np0005539504 systemd-udevd[243085]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.558 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.561 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:02 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:02Z|00604|binding|INFO|Setting lport 40ceff87-ff89-4058-a1a5-020625a1887a ovn-installed in OVS
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.562 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:02 np0005539504 NetworkManager[55210]: <info>  [1764401762.5716] device (tap40ceff87-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:36:02 np0005539504 systemd-machined[153423]: New machine qemu-78-instance-00000096.
Nov 29 02:36:02 np0005539504 NetworkManager[55210]: <info>  [1764401762.5740] device (tap40ceff87-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:36:02 np0005539504 systemd[1]: Started Virtual Machine qemu-78-instance-00000096.
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.893 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401762.892307, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:36:02 np0005539504 nova_compute[187152]: 2025-11-29 07:36:02.893 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Started (Lifecycle Event)#033[00m
Nov 29 02:36:03 np0005539504 nova_compute[187152]: 2025-11-29 07:36:03.212 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:03Z|00605|binding|INFO|Setting lport 40ceff87-ff89-4058-a1a5-020625a1887a up in Southbound
Nov 29 02:36:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:03.709 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:f0:96 10.100.0.2'], port_security=['fa:16:3e:d5:f0:96 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '093053f4-2142-487d-9db6-8b83c9b91ed5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf016381-209e-4fcd-a155-acc952170a94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a82f08b594db4c92b19594b91420a641', 'neutron:revision_number': '2', 'neutron:security_group_ids': '04c157ee-9518-4c86-aa3d-4298341f48c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80f72a82-a5d9-4a82-9318-66b679841586, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=40ceff87-ff89-4058-a1a5-020625a1887a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:36:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:03.711 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 40ceff87-ff89-4058-a1a5-020625a1887a in datapath cf016381-209e-4fcd-a155-acc952170a94 bound to our chassis#033[00m
Nov 29 02:36:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:03.713 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cf016381-209e-4fcd-a155-acc952170a94 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:36:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:03.716 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2795dbaf-3760-4ff3-9679-f93c7b35d88d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:36:03 np0005539504 nova_compute[187152]: 2025-11-29 07:36:03.882 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:03 np0005539504 nova_compute[187152]: 2025-11-29 07:36:03.889 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401762.892709, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:36:03 np0005539504 nova_compute[187152]: 2025-11-29 07:36:03.889 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:36:05 np0005539504 podman[243103]: 2025-11-29 07:36:05.745214542 +0000 UTC m=+0.081384671 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:36:05 np0005539504 podman[243104]: 2025-11-29 07:36:05.750893697 +0000 UTC m=+0.088341791 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, version=9.6, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:36:05 np0005539504 podman[243105]: 2025-11-29 07:36:05.751252776 +0000 UTC m=+0.078544114 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:36:07 np0005539504 nova_compute[187152]: 2025-11-29 07:36:07.134 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:07 np0005539504 nova_compute[187152]: 2025-11-29 07:36:07.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:08 np0005539504 nova_compute[187152]: 2025-11-29 07:36:08.215 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:08 np0005539504 nova_compute[187152]: 2025-11-29 07:36:08.513 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:08 np0005539504 nova_compute[187152]: 2025-11-29 07:36:08.517 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:36:08 np0005539504 nova_compute[187152]: 2025-11-29 07:36:08.976 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:36:09 np0005539504 nova_compute[187152]: 2025-11-29 07:36:09.235 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:09.236 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:36:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:09.238 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:36:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:09.239 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:36:09 np0005539504 nova_compute[187152]: 2025-11-29 07:36:09.363 187156 DEBUG nova.network.neutron [req-070ed1a5-1609-4061-b880-6120e3c331a1 req-a218dcb8-836d-4316-9b30-792b7b060b08 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Updated VIF entry in instance network info cache for port 40ceff87-ff89-4058-a1a5-020625a1887a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:36:09 np0005539504 nova_compute[187152]: 2025-11-29 07:36:09.364 187156 DEBUG nova.network.neutron [req-070ed1a5-1609-4061-b880-6120e3c331a1 req-a218dcb8-836d-4316-9b30-792b7b060b08 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Updating instance_info_cache with network_info: [{"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:36:09 np0005539504 podman[243162]: 2025-11-29 07:36:09.7101105 +0000 UTC m=+0.050476151 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:36:09 np0005539504 podman[243163]: 2025-11-29 07:36:09.763233473 +0000 UTC m=+0.088935495 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:36:09 np0005539504 nova_compute[187152]: 2025-11-29 07:36:09.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:10 np0005539504 nova_compute[187152]: 2025-11-29 07:36:10.226 187156 DEBUG oslo_concurrency.lockutils [req-070ed1a5-1609-4061-b880-6120e3c331a1 req-a218dcb8-836d-4316-9b30-792b7b060b08 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.137 187156 DEBUG nova.compute.manager [req-739962ee-aecf-42a2-8b32-e6ee42d6e640 req-b4cdc90a-8479-4466-9ef5-742749084501 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.138 187156 DEBUG oslo_concurrency.lockutils [req-739962ee-aecf-42a2-8b32-e6ee42d6e640 req-b4cdc90a-8479-4466-9ef5-742749084501 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.139 187156 DEBUG oslo_concurrency.lockutils [req-739962ee-aecf-42a2-8b32-e6ee42d6e640 req-b4cdc90a-8479-4466-9ef5-742749084501 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.139 187156 DEBUG oslo_concurrency.lockutils [req-739962ee-aecf-42a2-8b32-e6ee42d6e640 req-b4cdc90a-8479-4466-9ef5-742749084501 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.139 187156 DEBUG nova.compute.manager [req-739962ee-aecf-42a2-8b32-e6ee42d6e640 req-b4cdc90a-8479-4466-9ef5-742749084501 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Processing event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.140 187156 DEBUG nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.145 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401771.1451824, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.146 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.149 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.154 187156 INFO nova.virt.libvirt.driver [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Instance spawned successfully.#033[00m
Nov 29 02:36:11 np0005539504 nova_compute[187152]: 2025-11-29 07:36:11.155 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:36:12 np0005539504 nova_compute[187152]: 2025-11-29 07:36:12.207 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:12 np0005539504 nova_compute[187152]: 2025-11-29 07:36:12.881 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:12 np0005539504 nova_compute[187152]: 2025-11-29 07:36:12.887 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:36:12 np0005539504 nova_compute[187152]: 2025-11-29 07:36:12.887 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:36:12 np0005539504 nova_compute[187152]: 2025-11-29 07:36:12.888 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:36:12 np0005539504 nova_compute[187152]: 2025-11-29 07:36:12.888 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:36:12 np0005539504 nova_compute[187152]: 2025-11-29 07:36:12.889 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:36:12 np0005539504 nova_compute[187152]: 2025-11-29 07:36:12.889 187156 DEBUG nova.virt.libvirt.driver [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:36:12 np0005539504 nova_compute[187152]: 2025-11-29 07:36:12.896 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:36:13 np0005539504 nova_compute[187152]: 2025-11-29 07:36:13.217 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:13 np0005539504 nova_compute[187152]: 2025-11-29 07:36:13.357 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.137 187156 INFO nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Took 41.75 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.138 187156 DEBUG nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.178 187156 DEBUG nova.compute.manager [req-1831297e-382a-4da1-9200-f69565dbdac1 req-36136ffe-1782-4560-903a-6298c3f48398 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.179 187156 DEBUG oslo_concurrency.lockutils [req-1831297e-382a-4da1-9200-f69565dbdac1 req-36136ffe-1782-4560-903a-6298c3f48398 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.179 187156 DEBUG oslo_concurrency.lockutils [req-1831297e-382a-4da1-9200-f69565dbdac1 req-36136ffe-1782-4560-903a-6298c3f48398 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.179 187156 DEBUG oslo_concurrency.lockutils [req-1831297e-382a-4da1-9200-f69565dbdac1 req-36136ffe-1782-4560-903a-6298c3f48398 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.179 187156 DEBUG nova.compute.manager [req-1831297e-382a-4da1-9200-f69565dbdac1 req-36136ffe-1782-4560-903a-6298c3f48398 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.180 187156 WARNING nova.compute.manager [req-1831297e-382a-4da1-9200-f69565dbdac1 req-36136ffe-1782-4560-903a-6298c3f48398 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.346 187156 INFO nova.compute.manager [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Took 42.97 seconds to build instance.#033[00m
Nov 29 02:36:15 np0005539504 nova_compute[187152]: 2025-11-29 07:36:15.378 187156 DEBUG oslo_concurrency.lockutils [None req-44b1f58d-f4b4-4e81-82e3-e4c4d6b86cbc 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 43.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:16 np0005539504 nova_compute[187152]: 2025-11-29 07:36:16.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:17 np0005539504 nova_compute[187152]: 2025-11-29 07:36:17.209 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:17 np0005539504 podman[243216]: 2025-11-29 07:36:17.756970285 +0000 UTC m=+0.093069678 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:36:18 np0005539504 nova_compute[187152]: 2025-11-29 07:36:18.287 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:20 np0005539504 nova_compute[187152]: 2025-11-29 07:36:20.257 187156 DEBUG nova.objects.instance [None req-e0475ff8-2d82-495d-82e8-2074443895ab 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'pci_devices' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:36:20 np0005539504 nova_compute[187152]: 2025-11-29 07:36:20.403 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401780.4029973, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:36:20 np0005539504 nova_compute[187152]: 2025-11-29 07:36:20.404 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:36:20 np0005539504 nova_compute[187152]: 2025-11-29 07:36:20.734 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:20 np0005539504 nova_compute[187152]: 2025-11-29 07:36:20.742 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:36:20 np0005539504 nova_compute[187152]: 2025-11-29 07:36:20.807 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Nov 29 02:36:21 np0005539504 kernel: tap40ceff87-ff (unregistering): left promiscuous mode
Nov 29 02:36:21 np0005539504 NetworkManager[55210]: <info>  [1764401781.3559] device (tap40ceff87-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:36:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:21Z|00606|binding|INFO|Releasing lport 40ceff87-ff89-4058-a1a5-020625a1887a from this chassis (sb_readonly=0)
Nov 29 02:36:21 np0005539504 nova_compute[187152]: 2025-11-29 07:36:21.404 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:21Z|00607|binding|INFO|Setting lport 40ceff87-ff89-4058-a1a5-020625a1887a down in Southbound
Nov 29 02:36:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:21Z|00608|binding|INFO|Removing iface tap40ceff87-ff ovn-installed in OVS
Nov 29 02:36:21 np0005539504 nova_compute[187152]: 2025-11-29 07:36:21.407 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:21 np0005539504 nova_compute[187152]: 2025-11-29 07:36:21.420 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:21 np0005539504 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000096.scope: Deactivated successfully.
Nov 29 02:36:21 np0005539504 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000096.scope: Consumed 9.777s CPU time.
Nov 29 02:36:21 np0005539504 systemd-machined[153423]: Machine qemu-78-instance-00000096 terminated.
Nov 29 02:36:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:21.520 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:f0:96 10.100.0.2'], port_security=['fa:16:3e:d5:f0:96 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '093053f4-2142-487d-9db6-8b83c9b91ed5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf016381-209e-4fcd-a155-acc952170a94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a82f08b594db4c92b19594b91420a641', 'neutron:revision_number': '4', 'neutron:security_group_ids': '04c157ee-9518-4c86-aa3d-4298341f48c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80f72a82-a5d9-4a82-9318-66b679841586, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=40ceff87-ff89-4058-a1a5-020625a1887a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:36:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:21.522 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 40ceff87-ff89-4058-a1a5-020625a1887a in datapath cf016381-209e-4fcd-a155-acc952170a94 unbound from our chassis#033[00m
Nov 29 02:36:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:21.524 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cf016381-209e-4fcd-a155-acc952170a94 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:36:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:21.526 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5860ade4-9a19-4355-b07b-b9f9e9c84f12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:36:21 np0005539504 nova_compute[187152]: 2025-11-29 07:36:21.598 187156 DEBUG nova.compute.manager [None req-e0475ff8-2d82-495d-82e8-2074443895ab 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:21 np0005539504 nova_compute[187152]: 2025-11-29 07:36:21.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:21 np0005539504 nova_compute[187152]: 2025-11-29 07:36:21.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:36:22 np0005539504 nova_compute[187152]: 2025-11-29 07:36:22.211 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:22 np0005539504 nova_compute[187152]: 2025-11-29 07:36:22.509 187156 DEBUG nova.compute.manager [req-54a120ec-c020-4a62-b3ce-286ef560b27e req-fa6367f4-7090-4028-a51c-4df7ff740bee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-unplugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:22 np0005539504 nova_compute[187152]: 2025-11-29 07:36:22.510 187156 DEBUG oslo_concurrency.lockutils [req-54a120ec-c020-4a62-b3ce-286ef560b27e req-fa6367f4-7090-4028-a51c-4df7ff740bee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:22 np0005539504 nova_compute[187152]: 2025-11-29 07:36:22.511 187156 DEBUG oslo_concurrency.lockutils [req-54a120ec-c020-4a62-b3ce-286ef560b27e req-fa6367f4-7090-4028-a51c-4df7ff740bee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:22 np0005539504 nova_compute[187152]: 2025-11-29 07:36:22.511 187156 DEBUG oslo_concurrency.lockutils [req-54a120ec-c020-4a62-b3ce-286ef560b27e req-fa6367f4-7090-4028-a51c-4df7ff740bee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:22 np0005539504 nova_compute[187152]: 2025-11-29 07:36:22.512 187156 DEBUG nova.compute.manager [req-54a120ec-c020-4a62-b3ce-286ef560b27e req-fa6367f4-7090-4028-a51c-4df7ff740bee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-unplugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:22 np0005539504 nova_compute[187152]: 2025-11-29 07:36:22.512 187156 WARNING nova.compute.manager [req-54a120ec-c020-4a62-b3ce-286ef560b27e req-fa6367f4-7090-4028-a51c-4df7ff740bee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-unplugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state suspended and task_state None.#033[00m
Nov 29 02:36:22 np0005539504 nova_compute[187152]: 2025-11-29 07:36:22.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:22.982 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:22.983 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:22.983 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:23 np0005539504 nova_compute[187152]: 2025-11-29 07:36:23.291 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:24 np0005539504 podman[243263]: 2025-11-29 07:36:24.62395005 +0000 UTC m=+0.137689610 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 02:36:24 np0005539504 nova_compute[187152]: 2025-11-29 07:36:24.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:25 np0005539504 nova_compute[187152]: 2025-11-29 07:36:25.352 187156 DEBUG nova.compute.manager [req-cdd8683a-8f36-4fb4-8c99-ce5fe0eb5a2c req-29ae4fed-68f5-489a-b29c-e564116c5049 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:25 np0005539504 nova_compute[187152]: 2025-11-29 07:36:25.352 187156 DEBUG oslo_concurrency.lockutils [req-cdd8683a-8f36-4fb4-8c99-ce5fe0eb5a2c req-29ae4fed-68f5-489a-b29c-e564116c5049 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:36:25 np0005539504 nova_compute[187152]: 2025-11-29 07:36:25.353 187156 DEBUG oslo_concurrency.lockutils [req-cdd8683a-8f36-4fb4-8c99-ce5fe0eb5a2c req-29ae4fed-68f5-489a-b29c-e564116c5049 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:36:25 np0005539504 nova_compute[187152]: 2025-11-29 07:36:25.353 187156 DEBUG oslo_concurrency.lockutils [req-cdd8683a-8f36-4fb4-8c99-ce5fe0eb5a2c req-29ae4fed-68f5-489a-b29c-e564116c5049 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:36:25 np0005539504 nova_compute[187152]: 2025-11-29 07:36:25.353 187156 DEBUG nova.compute.manager [req-cdd8683a-8f36-4fb4-8c99-ce5fe0eb5a2c req-29ae4fed-68f5-489a-b29c-e564116c5049 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:36:25 np0005539504 nova_compute[187152]: 2025-11-29 07:36:25.354 187156 WARNING nova.compute.manager [req-cdd8683a-8f36-4fb4-8c99-ce5fe0eb5a2c req-29ae4fed-68f5-489a-b29c-e564116c5049 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state suspended and task_state None.
Nov 29 02:36:25 np0005539504 nova_compute[187152]: 2025-11-29 07:36:25.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:36:25 np0005539504 nova_compute[187152]: 2025-11-29 07:36:25.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 02:36:25 np0005539504 nova_compute[187152]: 2025-11-29 07:36:25.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 02:36:27 np0005539504 nova_compute[187152]: 2025-11-29 07:36:27.215 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:27 np0005539504 nova_compute[187152]: 2025-11-29 07:36:27.271 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:36:27 np0005539504 nova_compute[187152]: 2025-11-29 07:36:27.272 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:36:27 np0005539504 nova_compute[187152]: 2025-11-29 07:36:27.272 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 02:36:27 np0005539504 nova_compute[187152]: 2025-11-29 07:36:27.272 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:36:27 np0005539504 nova_compute[187152]: 2025-11-29 07:36:27.693 187156 INFO nova.compute.manager [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Resuming
Nov 29 02:36:27 np0005539504 nova_compute[187152]: 2025-11-29 07:36:27.695 187156 DEBUG nova.objects.instance [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'flavor' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:36:27 np0005539504 nova_compute[187152]: 2025-11-29 07:36:27.801 187156 DEBUG oslo_concurrency.lockutils [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:36:28 np0005539504 nova_compute[187152]: 2025-11-29 07:36:28.294 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:32 np0005539504 nova_compute[187152]: 2025-11-29 07:36:32.217 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:33 np0005539504 nova_compute[187152]: 2025-11-29 07:36:33.007 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Updating instance_info_cache with network_info: [{"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:36:33 np0005539504 nova_compute[187152]: 2025-11-29 07:36:33.296 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.014 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.015 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.016 187156 DEBUG oslo_concurrency.lockutils [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquired lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.016 187156 DEBUG nova.network.neutron [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.018 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.018 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.752 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.753 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.753 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.754 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 02:36:35 np0005539504 podman[243287]: 2025-11-29 07:36:35.89582923 +0000 UTC m=+0.074693239 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:36:35 np0005539504 podman[243289]: 2025-11-29 07:36:35.898773581 +0000 UTC m=+0.069513409 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:36:35 np0005539504 podman[243288]: 2025-11-29 07:36:35.898773231 +0000 UTC m=+0.076646713 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 29 02:36:35 np0005539504 nova_compute[187152]: 2025-11-29 07:36:35.927 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:36:36 np0005539504 nova_compute[187152]: 2025-11-29 07:36:36.001 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:36:36 np0005539504 nova_compute[187152]: 2025-11-29 07:36:36.002 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:36:36 np0005539504 nova_compute[187152]: 2025-11-29 07:36:36.067 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:36:36 np0005539504 nova_compute[187152]: 2025-11-29 07:36:36.206 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:36:36 np0005539504 nova_compute[187152]: 2025-11-29 07:36:36.209 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5648MB free_disk=72.9983024597168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 02:36:36 np0005539504 nova_compute[187152]: 2025-11-29 07:36:36.210 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:36:36 np0005539504 nova_compute[187152]: 2025-11-29 07:36:36.210 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:36:36 np0005539504 nova_compute[187152]: 2025-11-29 07:36:36.600 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401781.5985878, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:36:36 np0005539504 nova_compute[187152]: 2025-11-29 07:36:36.601 187156 INFO nova.compute.manager [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Stopped (Lifecycle Event)
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.016 187156 DEBUG nova.compute.manager [None req-0d020411-e24b-490b-b866-950347cef20f - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.025 187156 DEBUG nova.compute.manager [None req-0d020411-e24b-490b-b866-950347cef20f - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.069 187156 INFO nova.compute.manager [None req-0d020411-e24b-490b-b866-950347cef20f - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.106 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.107 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.107 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.163 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.179 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.220 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.224 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.225 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.015s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.923 187156 DEBUG nova.network.neutron [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Updating instance_info_cache with network_info: [{"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.942 187156 DEBUG oslo_concurrency.lockutils [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Releasing lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.949 187156 DEBUG nova.virt.libvirt.vif [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-684634725',display_name='tempest-TestServerAdvancedOps-server-684634725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-684634725',id=150,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:36:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='a82f08b594db4c92b19594b91420a641',ramdisk_id='',reservation_id='r-9dxnjqtu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-403162681',owner_user_name='tempest-TestServerAdvancedOps-403162681-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:36:21Z,user_data=None,user_id='0db44d5b07ab4caf927626f539adc8cc',uuid=093053f4-2142-487d-9db6-8b83c9b91ed5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.949 187156 DEBUG nova.network.os_vif_util [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converting VIF {"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.950 187156 DEBUG nova.network.os_vif_util [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.951 187156 DEBUG os_vif [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.951 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.952 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.952 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.956 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.956 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40ceff87-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.956 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap40ceff87-ff, col_values=(('external_ids', {'iface-id': '40ceff87-ff89-4058-a1a5-020625a1887a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:f0:96', 'vm-uuid': '093053f4-2142-487d-9db6-8b83c9b91ed5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.957 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.957 187156 INFO os_vif [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff')#033[00m
Nov 29 02:36:37 np0005539504 nova_compute[187152]: 2025-11-29 07:36:37.973 187156 DEBUG nova.objects.instance [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'numa_topology' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:36:38 np0005539504 kernel: tap40ceff87-ff: entered promiscuous mode
Nov 29 02:36:38 np0005539504 NetworkManager[55210]: <info>  [1764401798.0759] manager: (tap40ceff87-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/270)
Nov 29 02:36:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:38Z|00609|binding|INFO|Claiming lport 40ceff87-ff89-4058-a1a5-020625a1887a for this chassis.
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:38Z|00610|binding|INFO|40ceff87-ff89-4058-a1a5-020625a1887a: Claiming fa:16:3e:d5:f0:96 10.100.0.2
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.080 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:38.086 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:f0:96 10.100.0.2'], port_security=['fa:16:3e:d5:f0:96 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '093053f4-2142-487d-9db6-8b83c9b91ed5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf016381-209e-4fcd-a155-acc952170a94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a82f08b594db4c92b19594b91420a641', 'neutron:revision_number': '5', 'neutron:security_group_ids': '04c157ee-9518-4c86-aa3d-4298341f48c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80f72a82-a5d9-4a82-9318-66b679841586, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=40ceff87-ff89-4058-a1a5-020625a1887a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:36:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:38.087 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 40ceff87-ff89-4058-a1a5-020625a1887a in datapath cf016381-209e-4fcd-a155-acc952170a94 bound to our chassis#033[00m
Nov 29 02:36:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:38.088 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cf016381-209e-4fcd-a155-acc952170a94 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:36:38 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:38.090 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[64f22f1b-84bb-4e5f-97a4-f388a5382ebf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:36:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:38Z|00611|binding|INFO|Setting lport 40ceff87-ff89-4058-a1a5-020625a1887a up in Southbound
Nov 29 02:36:38 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:38Z|00612|binding|INFO|Setting lport 40ceff87-ff89-4058-a1a5-020625a1887a ovn-installed in OVS
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.091 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.092 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.097 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:38 np0005539504 systemd-udevd[243367]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:36:38 np0005539504 systemd-machined[153423]: New machine qemu-79-instance-00000096.
Nov 29 02:36:38 np0005539504 NetworkManager[55210]: <info>  [1764401798.1286] device (tap40ceff87-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:36:38 np0005539504 NetworkManager[55210]: <info>  [1764401798.1293] device (tap40ceff87-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:36:38 np0005539504 systemd[1]: Started Virtual Machine qemu-79-instance-00000096.
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.298 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.430 187156 DEBUG nova.compute.manager [req-c59a9c56-9ba5-49d8-ba23-04a50aafff51 req-fed70326-344d-4c90-bcbf-d9d881b7f431 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.430 187156 DEBUG oslo_concurrency.lockutils [req-c59a9c56-9ba5-49d8-ba23-04a50aafff51 req-fed70326-344d-4c90-bcbf-d9d881b7f431 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.431 187156 DEBUG oslo_concurrency.lockutils [req-c59a9c56-9ba5-49d8-ba23-04a50aafff51 req-fed70326-344d-4c90-bcbf-d9d881b7f431 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.431 187156 DEBUG oslo_concurrency.lockutils [req-c59a9c56-9ba5-49d8-ba23-04a50aafff51 req-fed70326-344d-4c90-bcbf-d9d881b7f431 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.431 187156 DEBUG nova.compute.manager [req-c59a9c56-9ba5-49d8-ba23-04a50aafff51 req-fed70326-344d-4c90-bcbf-d9d881b7f431 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.432 187156 WARNING nova.compute.manager [req-c59a9c56-9ba5-49d8-ba23-04a50aafff51 req-fed70326-344d-4c90-bcbf-d9d881b7f431 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state suspended and task_state resuming.#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.742 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401798.7420921, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.743 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Started (Lifecycle Event)#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.762 187156 DEBUG nova.compute.manager [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.763 187156 DEBUG nova.objects.instance [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'pci_devices' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.973 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.980 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.984 187156 INFO nova.virt.libvirt.driver [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Instance running successfully.#033[00m
Nov 29 02:36:38 np0005539504 virtqemud[186569]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.988 187156 DEBUG nova.virt.libvirt.guest [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:36:38 np0005539504 nova_compute[187152]: 2025-11-29 07:36:38.989 187156 DEBUG nova.compute.manager [None req-c68ac635-d928-44b8-8bcd-4f5154a0f3b4 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:39 np0005539504 nova_compute[187152]: 2025-11-29 07:36:39.001 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Nov 29 02:36:39 np0005539504 nova_compute[187152]: 2025-11-29 07:36:39.001 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401798.7489684, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:36:39 np0005539504 nova_compute[187152]: 2025-11-29 07:36:39.002 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:36:39 np0005539504 nova_compute[187152]: 2025-11-29 07:36:39.033 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:39 np0005539504 nova_compute[187152]: 2025-11-29 07:36:39.037 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:36:39 np0005539504 nova_compute[187152]: 2025-11-29 07:36:39.061 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Nov 29 02:36:40 np0005539504 nova_compute[187152]: 2025-11-29 07:36:40.553 187156 DEBUG nova.compute.manager [req-9ab4e4db-5d48-4a86-a5ff-aef69e671300 req-d9b1afc8-bf5f-4c5b-947a-68b9dbd6a83a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:40 np0005539504 nova_compute[187152]: 2025-11-29 07:36:40.553 187156 DEBUG oslo_concurrency.lockutils [req-9ab4e4db-5d48-4a86-a5ff-aef69e671300 req-d9b1afc8-bf5f-4c5b-947a-68b9dbd6a83a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:40 np0005539504 nova_compute[187152]: 2025-11-29 07:36:40.554 187156 DEBUG oslo_concurrency.lockutils [req-9ab4e4db-5d48-4a86-a5ff-aef69e671300 req-d9b1afc8-bf5f-4c5b-947a-68b9dbd6a83a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:40 np0005539504 nova_compute[187152]: 2025-11-29 07:36:40.554 187156 DEBUG oslo_concurrency.lockutils [req-9ab4e4db-5d48-4a86-a5ff-aef69e671300 req-d9b1afc8-bf5f-4c5b-947a-68b9dbd6a83a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:40 np0005539504 nova_compute[187152]: 2025-11-29 07:36:40.554 187156 DEBUG nova.compute.manager [req-9ab4e4db-5d48-4a86-a5ff-aef69e671300 req-d9b1afc8-bf5f-4c5b-947a-68b9dbd6a83a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:40 np0005539504 nova_compute[187152]: 2025-11-29 07:36:40.554 187156 WARNING nova.compute.manager [req-9ab4e4db-5d48-4a86-a5ff-aef69e671300 req-d9b1afc8-bf5f-4c5b-947a-68b9dbd6a83a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state active and task_state None.#033[00m
Nov 29 02:36:40 np0005539504 podman[243384]: 2025-11-29 07:36:40.742460712 +0000 UTC m=+0.066967760 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:36:40 np0005539504 podman[243385]: 2025-11-29 07:36:40.8016813 +0000 UTC m=+0.127152564 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Nov 29 02:36:41 np0005539504 nova_compute[187152]: 2025-11-29 07:36:41.884 187156 DEBUG nova.objects.instance [None req-61943496-2af7-48d0-b2f0-2e8d2f97aeda 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'pci_devices' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:36:41 np0005539504 nova_compute[187152]: 2025-11-29 07:36:41.911 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401801.911219, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:36:41 np0005539504 nova_compute[187152]: 2025-11-29 07:36:41.911 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:36:41 np0005539504 nova_compute[187152]: 2025-11-29 07:36:41.934 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:41 np0005539504 nova_compute[187152]: 2025-11-29 07:36:41.938 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:36:41 np0005539504 nova_compute[187152]: 2025-11-29 07:36:41.971 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Nov 29 02:36:42 np0005539504 nova_compute[187152]: 2025-11-29 07:36:42.221 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:43 np0005539504 kernel: tap40ceff87-ff (unregistering): left promiscuous mode
Nov 29 02:36:43 np0005539504 NetworkManager[55210]: <info>  [1764401803.2314] device (tap40ceff87-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:36:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:43Z|00613|binding|INFO|Releasing lport 40ceff87-ff89-4058-a1a5-020625a1887a from this chassis (sb_readonly=0)
Nov 29 02:36:43 np0005539504 nova_compute[187152]: 2025-11-29 07:36:43.240 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:43Z|00614|binding|INFO|Setting lport 40ceff87-ff89-4058-a1a5-020625a1887a down in Southbound
Nov 29 02:36:43 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:43Z|00615|binding|INFO|Removing iface tap40ceff87-ff ovn-installed in OVS
Nov 29 02:36:43 np0005539504 nova_compute[187152]: 2025-11-29 07:36:43.254 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:43 np0005539504 nova_compute[187152]: 2025-11-29 07:36:43.305 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:43.308 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:f0:96 10.100.0.2'], port_security=['fa:16:3e:d5:f0:96 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '093053f4-2142-487d-9db6-8b83c9b91ed5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf016381-209e-4fcd-a155-acc952170a94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a82f08b594db4c92b19594b91420a641', 'neutron:revision_number': '6', 'neutron:security_group_ids': '04c157ee-9518-4c86-aa3d-4298341f48c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80f72a82-a5d9-4a82-9318-66b679841586, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=40ceff87-ff89-4058-a1a5-020625a1887a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:36:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:43.310 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 40ceff87-ff89-4058-a1a5-020625a1887a in datapath cf016381-209e-4fcd-a155-acc952170a94 unbound from our chassis#033[00m
Nov 29 02:36:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:43.311 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cf016381-209e-4fcd-a155-acc952170a94 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 29 02:36:43 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:43.313 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e684530a-a08f-4981-9348-0b6a8898cadc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:36:43 np0005539504 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d00000096.scope: Deactivated successfully.
Nov 29 02:36:43 np0005539504 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d00000096.scope: Consumed 3.944s CPU time.
Nov 29 02:36:43 np0005539504 systemd-machined[153423]: Machine qemu-79-instance-00000096 terminated.
Nov 29 02:36:43 np0005539504 nova_compute[187152]: 2025-11-29 07:36:43.431 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:43 np0005539504 nova_compute[187152]: 2025-11-29 07:36:43.438 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:43 np0005539504 nova_compute[187152]: 2025-11-29 07:36:43.467 187156 DEBUG nova.compute.manager [None req-61943496-2af7-48d0-b2f0-2e8d2f97aeda 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.466 187156 DEBUG nova.compute.manager [req-13f689f2-23ea-479c-b9fd-e7ba2478b364 req-bd022d73-e282-4b17-8e3a-b6b5c4e2f94a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-unplugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.468 187156 DEBUG oslo_concurrency.lockutils [req-13f689f2-23ea-479c-b9fd-e7ba2478b364 req-bd022d73-e282-4b17-8e3a-b6b5c4e2f94a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.468 187156 DEBUG oslo_concurrency.lockutils [req-13f689f2-23ea-479c-b9fd-e7ba2478b364 req-bd022d73-e282-4b17-8e3a-b6b5c4e2f94a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.468 187156 DEBUG oslo_concurrency.lockutils [req-13f689f2-23ea-479c-b9fd-e7ba2478b364 req-bd022d73-e282-4b17-8e3a-b6b5c4e2f94a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.469 187156 DEBUG nova.compute.manager [req-13f689f2-23ea-479c-b9fd-e7ba2478b364 req-bd022d73-e282-4b17-8e3a-b6b5c4e2f94a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-unplugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.469 187156 WARNING nova.compute.manager [req-13f689f2-23ea-479c-b9fd-e7ba2478b364 req-bd022d73-e282-4b17-8e3a-b6b5c4e2f94a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-unplugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state suspended and task_state None.#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.699 187156 INFO nova.compute.manager [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Resuming#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.700 187156 DEBUG nova.objects.instance [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'flavor' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.748 187156 DEBUG oslo_concurrency.lockutils [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.748 187156 DEBUG oslo_concurrency.lockutils [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquired lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:36:46 np0005539504 nova_compute[187152]: 2025-11-29 07:36:46.749 187156 DEBUG nova.network.neutron [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:36:47 np0005539504 nova_compute[187152]: 2025-11-29 07:36:47.225 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.987 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '093053f4-2142-487d-9db6-8b83c9b91ed5', 'name': 'tempest-TestServerAdvancedOps-server-684634725', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000096', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'a82f08b594db4c92b19594b91420a641', 'user_id': '0db44d5b07ab4caf927626f539adc8cc', 'hostId': '5ac1a57e8c4f29fe94a1481f406a2ef629972eb512cc7d8158e66a8d', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.990 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.990 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.991 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.991 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.992 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.993 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.994 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.994 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.995 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.995 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.996 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.997 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.997 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.998 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:47.999 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.000 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.001 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.001 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.001 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.001 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestServerAdvancedOps-server-684634725>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerAdvancedOps-server-684634725>]
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.003 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.003 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.004 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.004 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.005 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.006 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.006 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.006 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestServerAdvancedOps-server-684634725>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerAdvancedOps-server-684634725>]
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.006 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.007 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestServerAdvancedOps-server-684634725>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerAdvancedOps-server-684634725>]
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.008 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.008 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.009 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.009 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.009 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.009 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestServerAdvancedOps-server-684634725>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerAdvancedOps-server-684634725>]
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.012 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.013 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.014 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:36:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:36:48.015 12 DEBUG ceilometer.compute.pollsters [-] Instance 093053f4-2142-487d-9db6-8b83c9b91ed5 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000096, id=093053f4-2142-487d-9db6-8b83c9b91ed5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:36:48 np0005539504 nova_compute[187152]: 2025-11-29 07:36:48.221 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:48 np0005539504 nova_compute[187152]: 2025-11-29 07:36:48.307 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:48 np0005539504 nova_compute[187152]: 2025-11-29 07:36:48.582 187156 DEBUG nova.compute.manager [req-dbc4de7e-1d90-48a7-96ce-290d292c6c96 req-be09acd8-e462-4282-ae81-2b453e6c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:48 np0005539504 nova_compute[187152]: 2025-11-29 07:36:48.583 187156 DEBUG oslo_concurrency.lockutils [req-dbc4de7e-1d90-48a7-96ce-290d292c6c96 req-be09acd8-e462-4282-ae81-2b453e6c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:48 np0005539504 nova_compute[187152]: 2025-11-29 07:36:48.583 187156 DEBUG oslo_concurrency.lockutils [req-dbc4de7e-1d90-48a7-96ce-290d292c6c96 req-be09acd8-e462-4282-ae81-2b453e6c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:48 np0005539504 nova_compute[187152]: 2025-11-29 07:36:48.583 187156 DEBUG oslo_concurrency.lockutils [req-dbc4de7e-1d90-48a7-96ce-290d292c6c96 req-be09acd8-e462-4282-ae81-2b453e6c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:48 np0005539504 nova_compute[187152]: 2025-11-29 07:36:48.583 187156 DEBUG nova.compute.manager [req-dbc4de7e-1d90-48a7-96ce-290d292c6c96 req-be09acd8-e462-4282-ae81-2b453e6c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:48 np0005539504 nova_compute[187152]: 2025-11-29 07:36:48.584 187156 WARNING nova.compute.manager [req-dbc4de7e-1d90-48a7-96ce-290d292c6c96 req-be09acd8-e462-4282-ae81-2b453e6c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state suspended and task_state resuming.#033[00m
Nov 29 02:36:48 np0005539504 podman[243461]: 2025-11-29 07:36:48.740262473 +0000 UTC m=+0.068182473 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute)
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.238 187156 DEBUG nova.network.neutron [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Updating instance_info_cache with network_info: [{"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.255 187156 DEBUG oslo_concurrency.lockutils [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Releasing lock "refresh_cache-093053f4-2142-487d-9db6-8b83c9b91ed5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.262 187156 DEBUG nova.virt.libvirt.vif [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-684634725',display_name='tempest-TestServerAdvancedOps-server-684634725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-684634725',id=150,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:36:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='a82f08b594db4c92b19594b91420a641',ramdisk_id='',reservation_id='r-9dxnjqtu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-403162681',owner_user_name='tempest-TestServerAdvancedOps-403162681-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:36:43Z,user_data=None,user_id='0db44d5b07ab4caf927626f539adc8cc',uuid=093053f4-2142-487d-9db6-8b83c9b91ed5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.263 187156 DEBUG nova.network.os_vif_util [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converting VIF {"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.264 187156 DEBUG nova.network.os_vif_util [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.264 187156 DEBUG os_vif [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.265 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.265 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.266 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.273 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.273 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40ceff87-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.274 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap40ceff87-ff, col_values=(('external_ids', {'iface-id': '40ceff87-ff89-4058-a1a5-020625a1887a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:f0:96', 'vm-uuid': '093053f4-2142-487d-9db6-8b83c9b91ed5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.274 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.275 187156 INFO os_vif [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff')
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.311 187156 DEBUG nova.objects.instance [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'numa_topology' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:36:50 np0005539504 kernel: tap40ceff87-ff: entered promiscuous mode
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.419 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:50Z|00616|binding|INFO|Claiming lport 40ceff87-ff89-4058-a1a5-020625a1887a for this chassis.
Nov 29 02:36:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:50Z|00617|binding|INFO|40ceff87-ff89-4058-a1a5-020625a1887a: Claiming fa:16:3e:d5:f0:96 10.100.0.2
Nov 29 02:36:50 np0005539504 NetworkManager[55210]: <info>  [1764401810.4216] manager: (tap40ceff87-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/271)
Nov 29 02:36:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:50.429 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:f0:96 10.100.0.2'], port_security=['fa:16:3e:d5:f0:96 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '093053f4-2142-487d-9db6-8b83c9b91ed5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf016381-209e-4fcd-a155-acc952170a94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a82f08b594db4c92b19594b91420a641', 'neutron:revision_number': '7', 'neutron:security_group_ids': '04c157ee-9518-4c86-aa3d-4298341f48c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80f72a82-a5d9-4a82-9318-66b679841586, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=40ceff87-ff89-4058-a1a5-020625a1887a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:36:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:50.431 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 40ceff87-ff89-4058-a1a5-020625a1887a in datapath cf016381-209e-4fcd-a155-acc952170a94 bound to our chassis
Nov 29 02:36:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:50.432 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cf016381-209e-4fcd-a155-acc952170a94 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 29 02:36:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:50Z|00618|binding|INFO|Setting lport 40ceff87-ff89-4058-a1a5-020625a1887a ovn-installed in OVS
Nov 29 02:36:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:50Z|00619|binding|INFO|Setting lport 40ceff87-ff89-4058-a1a5-020625a1887a up in Southbound
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.432 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.433 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:50 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:50.434 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[43c16fd2-850c-437f-94ea-cf9d706920d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:36:50 np0005539504 systemd-udevd[243496]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:36:50 np0005539504 systemd-machined[153423]: New machine qemu-80-instance-00000096.
Nov 29 02:36:50 np0005539504 NetworkManager[55210]: <info>  [1764401810.4679] device (tap40ceff87-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:36:50 np0005539504 NetworkManager[55210]: <info>  [1764401810.4691] device (tap40ceff87-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:36:50 np0005539504 systemd[1]: Started Virtual Machine qemu-80-instance-00000096.
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.695 187156 DEBUG nova.compute.manager [req-720b4445-59bd-4b38-8e9f-5f162507ed78 req-5b16d89d-82f3-48c8-b993-f2bdc36dedfb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.696 187156 DEBUG oslo_concurrency.lockutils [req-720b4445-59bd-4b38-8e9f-5f162507ed78 req-5b16d89d-82f3-48c8-b993-f2bdc36dedfb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.696 187156 DEBUG oslo_concurrency.lockutils [req-720b4445-59bd-4b38-8e9f-5f162507ed78 req-5b16d89d-82f3-48c8-b993-f2bdc36dedfb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.697 187156 DEBUG oslo_concurrency.lockutils [req-720b4445-59bd-4b38-8e9f-5f162507ed78 req-5b16d89d-82f3-48c8-b993-f2bdc36dedfb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.697 187156 DEBUG nova.compute.manager [req-720b4445-59bd-4b38-8e9f-5f162507ed78 req-5b16d89d-82f3-48c8-b993-f2bdc36dedfb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:36:50 np0005539504 nova_compute[187152]: 2025-11-29 07:36:50.697 187156 WARNING nova.compute.manager [req-720b4445-59bd-4b38-8e9f-5f162507ed78 req-5b16d89d-82f3-48c8-b993-f2bdc36dedfb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state suspended and task_state resuming.
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.194 187156 DEBUG nova.virt.libvirt.host [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Removed pending event for 093053f4-2142-487d-9db6-8b83c9b91ed5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.195 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401811.1938303, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.195 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Started (Lifecycle Event)
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.221 187156 DEBUG nova.compute.manager [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.221 187156 DEBUG nova.objects.instance [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'pci_devices' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.242 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.250 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.255 187156 INFO nova.virt.libvirt.driver [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Instance running successfully.
Nov 29 02:36:51 np0005539504 virtqemud[186569]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.259 187156 DEBUG nova.virt.libvirt.guest [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.259 187156 DEBUG nova.compute.manager [None req-91be893b-f329-44e3-a19f-4744ae861dc0 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.285 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.286 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401811.1989393, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.286 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Resumed (Lifecycle Event)
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.324 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:36:51 np0005539504 nova_compute[187152]: 2025-11-29 07:36:51.329 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.227 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.383 187156 DEBUG oslo_concurrency.lockutils [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.384 187156 DEBUG oslo_concurrency.lockutils [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.385 187156 DEBUG oslo_concurrency.lockutils [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.385 187156 DEBUG oslo_concurrency.lockutils [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.385 187156 DEBUG oslo_concurrency.lockutils [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.398 187156 INFO nova.compute.manager [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Terminating instance
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.410 187156 DEBUG nova.compute.manager [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 02:36:52 np0005539504 kernel: tap40ceff87-ff (unregistering): left promiscuous mode
Nov 29 02:36:52 np0005539504 NetworkManager[55210]: <info>  [1764401812.4340] device (tap40ceff87-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:36:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:52Z|00620|binding|INFO|Releasing lport 40ceff87-ff89-4058-a1a5-020625a1887a from this chassis (sb_readonly=0)
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.444 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:52Z|00621|binding|INFO|Setting lport 40ceff87-ff89-4058-a1a5-020625a1887a down in Southbound
Nov 29 02:36:52 np0005539504 ovn_controller[95182]: 2025-11-29T07:36:52Z|00622|binding|INFO|Removing iface tap40ceff87-ff ovn-installed in OVS
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.447 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:52.452 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:f0:96 10.100.0.2'], port_security=['fa:16:3e:d5:f0:96 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '093053f4-2142-487d-9db6-8b83c9b91ed5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf016381-209e-4fcd-a155-acc952170a94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a82f08b594db4c92b19594b91420a641', 'neutron:revision_number': '8', 'neutron:security_group_ids': '04c157ee-9518-4c86-aa3d-4298341f48c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80f72a82-a5d9-4a82-9318-66b679841586, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=40ceff87-ff89-4058-a1a5-020625a1887a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:36:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:52.453 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 40ceff87-ff89-4058-a1a5-020625a1887a in datapath cf016381-209e-4fcd-a155-acc952170a94 unbound from our chassis
Nov 29 02:36:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:52.454 104164 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cf016381-209e-4fcd-a155-acc952170a94 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 29 02:36:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:36:52.455 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8f8566-8294-4174-aa90-9c37b8923201]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.465 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:52 np0005539504 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000096.scope: Deactivated successfully.
Nov 29 02:36:52 np0005539504 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000096.scope: Consumed 1.486s CPU time.
Nov 29 02:36:52 np0005539504 systemd-machined[153423]: Machine qemu-80-instance-00000096 terminated.
Nov 29 02:36:52 np0005539504 NetworkManager[55210]: <info>  [1764401812.6362] manager: (tap40ceff87-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/272)
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.677 187156 INFO nova.virt.libvirt.driver [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Instance destroyed successfully.
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.677 187156 DEBUG nova.objects.instance [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lazy-loading 'resources' on Instance uuid 093053f4-2142-487d-9db6-8b83c9b91ed5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.690 187156 DEBUG nova.virt.libvirt.vif [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-684634725',display_name='tempest-TestServerAdvancedOps-server-684634725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-684634725',id=150,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:36:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a82f08b594db4c92b19594b91420a641',ramdisk_id='',reservation_id='r-9dxnjqtu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerAdvancedOps-403162681',owner_user_name='tempest-TestServerAdvancedOps-403162681-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:36:51Z,user_data=None,user_id='0db44d5b07ab4caf927626f539adc8cc',uuid=093053f4-2142-487d-9db6-8b83c9b91ed5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.691 187156 DEBUG nova.network.os_vif_util [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converting VIF {"id": "40ceff87-ff89-4058-a1a5-020625a1887a", "address": "fa:16:3e:d5:f0:96", "network": {"id": "cf016381-209e-4fcd-a155-acc952170a94", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-529514873-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a82f08b594db4c92b19594b91420a641", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap40ceff87-ff", "ovs_interfaceid": "40ceff87-ff89-4058-a1a5-020625a1887a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.691 187156 DEBUG nova.network.os_vif_util [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.692 187156 DEBUG os_vif [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.694 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.694 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40ceff87-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.737 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.740 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.743 187156 INFO os_vif [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=40ceff87-ff89-4058-a1a5-020625a1887a,network=Network(cf016381-209e-4fcd-a155-acc952170a94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap40ceff87-ff')#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.744 187156 INFO nova.virt.libvirt.driver [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Deleting instance files /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5_del#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.745 187156 INFO nova.virt.libvirt.driver [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Deletion of /var/lib/nova/instances/093053f4-2142-487d-9db6-8b83c9b91ed5_del complete#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.797 187156 DEBUG nova.compute.manager [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.798 187156 DEBUG oslo_concurrency.lockutils [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.798 187156 DEBUG oslo_concurrency.lockutils [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.798 187156 DEBUG oslo_concurrency.lockutils [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.798 187156 DEBUG nova.compute.manager [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.799 187156 WARNING nova.compute.manager [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.817 187156 INFO nova.compute.manager [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.818 187156 DEBUG oslo.service.loopingcall [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.818 187156 DEBUG nova.compute.manager [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:36:52 np0005539504 nova_compute[187152]: 2025-11-29 07:36:52.818 187156 DEBUG nova.network.neutron [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:36:53 np0005539504 nova_compute[187152]: 2025-11-29 07:36:53.860 187156 DEBUG nova.network.neutron [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:36:53 np0005539504 nova_compute[187152]: 2025-11-29 07:36:53.894 187156 INFO nova.compute.manager [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Took 1.08 seconds to deallocate network for instance.#033[00m
Nov 29 02:36:53 np0005539504 nova_compute[187152]: 2025-11-29 07:36:53.961 187156 DEBUG nova.compute.manager [req-34181e45-e2b3-4349-83fd-4d5bd00082b8 req-4ab4f3dc-ce04-4760-a52b-d85df2cac721 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-deleted-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:54 np0005539504 nova_compute[187152]: 2025-11-29 07:36:54.589 187156 DEBUG oslo_concurrency.lockutils [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:54 np0005539504 nova_compute[187152]: 2025-11-29 07:36:54.590 187156 DEBUG oslo_concurrency.lockutils [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:54 np0005539504 nova_compute[187152]: 2025-11-29 07:36:54.648 187156 DEBUG nova.compute.provider_tree [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:36:54 np0005539504 nova_compute[187152]: 2025-11-29 07:36:54.672 187156 DEBUG nova.scheduler.client.report [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:36:54 np0005539504 nova_compute[187152]: 2025-11-29 07:36:54.693 187156 DEBUG oslo_concurrency.lockutils [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:54 np0005539504 nova_compute[187152]: 2025-11-29 07:36:54.784 187156 INFO nova.scheduler.client.report [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Deleted allocations for instance 093053f4-2142-487d-9db6-8b83c9b91ed5#033[00m
Nov 29 02:36:54 np0005539504 nova_compute[187152]: 2025-11-29 07:36:54.918 187156 DEBUG oslo_concurrency.lockutils [None req-9775a8fd-d877-429e-8e37-74ad9c23bad6 0db44d5b07ab4caf927626f539adc8cc a82f08b594db4c92b19594b91420a641 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.131 187156 DEBUG nova.compute.manager [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-unplugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.132 187156 DEBUG oslo_concurrency.lockutils [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.132 187156 DEBUG oslo_concurrency.lockutils [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.132 187156 DEBUG oslo_concurrency.lockutils [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.133 187156 DEBUG nova.compute.manager [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-unplugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.133 187156 WARNING nova.compute.manager [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-unplugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.133 187156 DEBUG nova.compute.manager [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.133 187156 DEBUG oslo_concurrency.lockutils [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.134 187156 DEBUG oslo_concurrency.lockutils [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.134 187156 DEBUG oslo_concurrency.lockutils [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093053f4-2142-487d-9db6-8b83c9b91ed5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.134 187156 DEBUG nova.compute.manager [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] No waiting events found dispatching network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:55 np0005539504 nova_compute[187152]: 2025-11-29 07:36:55.135 187156 WARNING nova.compute.manager [req-a812a1bf-e677-4182-a6b0-335054657f26 req-51b17f33-9603-46c2-b783-ffc1785d1293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Received unexpected event network-vif-plugged-40ceff87-ff89-4058-a1a5-020625a1887a for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:36:55 np0005539504 podman[243547]: 2025-11-29 07:36:55.731295546 +0000 UTC m=+0.061540233 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible)
Nov 29 02:36:57 np0005539504 nova_compute[187152]: 2025-11-29 07:36:57.230 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:57 np0005539504 nova_compute[187152]: 2025-11-29 07:36:57.738 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:59 np0005539504 nova_compute[187152]: 2025-11-29 07:36:59.745 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:02 np0005539504 nova_compute[187152]: 2025-11-29 07:37:02.233 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:02 np0005539504 nova_compute[187152]: 2025-11-29 07:37:02.740 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:06 np0005539504 podman[243570]: 2025-11-29 07:37:06.71755987 +0000 UTC m=+0.055636162 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:37:06 np0005539504 podman[243572]: 2025-11-29 07:37:06.719750939 +0000 UTC m=+0.051016356 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:37:06 np0005539504 podman[243571]: 2025-11-29 07:37:06.721816765 +0000 UTC m=+0.057938445 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 29 02:37:07 np0005539504 nova_compute[187152]: 2025-11-29 07:37:07.236 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:07 np0005539504 nova_compute[187152]: 2025-11-29 07:37:07.675 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401812.673954, 093053f4-2142-487d-9db6-8b83c9b91ed5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:37:07 np0005539504 nova_compute[187152]: 2025-11-29 07:37:07.676 187156 INFO nova.compute.manager [-] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:37:07 np0005539504 nova_compute[187152]: 2025-11-29 07:37:07.708 187156 DEBUG nova.compute.manager [None req-d2a1e365-3229-49b7-9ae0-211b2f404219 - - - - - -] [instance: 093053f4-2142-487d-9db6-8b83c9b91ed5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:37:07 np0005539504 nova_compute[187152]: 2025-11-29 07:37:07.742 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:08 np0005539504 nova_compute[187152]: 2025-11-29 07:37:08.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:11 np0005539504 podman[243630]: 2025-11-29 07:37:11.715492859 +0000 UTC m=+0.058674814 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:37:11 np0005539504 podman[243631]: 2025-11-29 07:37:11.760158342 +0000 UTC m=+0.096493481 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:37:11 np0005539504 nova_compute[187152]: 2025-11-29 07:37:11.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:12 np0005539504 nova_compute[187152]: 2025-11-29 07:37:12.236 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:12 np0005539504 nova_compute[187152]: 2025-11-29 07:37:12.745 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:37:13.528 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:37:13 np0005539504 nova_compute[187152]: 2025-11-29 07:37:13.528 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:13 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:37:13.530 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:37:17 np0005539504 nova_compute[187152]: 2025-11-29 07:37:17.239 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:17 np0005539504 nova_compute[187152]: 2025-11-29 07:37:17.747 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:17 np0005539504 nova_compute[187152]: 2025-11-29 07:37:17.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:19 np0005539504 podman[243684]: 2025-11-29 07:37:19.727367513 +0000 UTC m=+0.073467925 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:37:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:37:20.533 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:22 np0005539504 nova_compute[187152]: 2025-11-29 07:37:22.243 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:22 np0005539504 nova_compute[187152]: 2025-11-29 07:37:22.749 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:22 np0005539504 nova_compute[187152]: 2025-11-29 07:37:22.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:37:22.984 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:37:22.985 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:37:22.985 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:23 np0005539504 nova_compute[187152]: 2025-11-29 07:37:23.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:23 np0005539504 nova_compute[187152]: 2025-11-29 07:37:23.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:37:24 np0005539504 nova_compute[187152]: 2025-11-29 07:37:24.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:25 np0005539504 nova_compute[187152]: 2025-11-29 07:37:25.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:25 np0005539504 nova_compute[187152]: 2025-11-29 07:37:25.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:37:25 np0005539504 nova_compute[187152]: 2025-11-29 07:37:25.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:37:26 np0005539504 nova_compute[187152]: 2025-11-29 07:37:26.110 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:37:26 np0005539504 podman[243704]: 2025-11-29 07:37:26.721217444 +0000 UTC m=+0.060888245 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 02:37:26 np0005539504 nova_compute[187152]: 2025-11-29 07:37:26.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.245 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.457 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.457 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.458 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.458 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.635 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.637 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5667MB free_disk=73.07466125488281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.637 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.637 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:27 np0005539504 nova_compute[187152]: 2025-11-29 07:37:27.790 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:29 np0005539504 nova_compute[187152]: 2025-11-29 07:37:29.066 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:37:29 np0005539504 nova_compute[187152]: 2025-11-29 07:37:29.066 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:37:29 np0005539504 nova_compute[187152]: 2025-11-29 07:37:29.151 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:37:29 np0005539504 nova_compute[187152]: 2025-11-29 07:37:29.404 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:37:30 np0005539504 nova_compute[187152]: 2025-11-29 07:37:30.169 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:37:30 np0005539504 nova_compute[187152]: 2025-11-29 07:37:30.169 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:32 np0005539504 nova_compute[187152]: 2025-11-29 07:37:32.247 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:32 np0005539504 nova_compute[187152]: 2025-11-29 07:37:32.793 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:33 np0005539504 nova_compute[187152]: 2025-11-29 07:37:33.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:33 np0005539504 nova_compute[187152]: 2025-11-29 07:37:33.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:33 np0005539504 nova_compute[187152]: 2025-11-29 07:37:33.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:37:34 np0005539504 nova_compute[187152]: 2025-11-29 07:37:34.389 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:37 np0005539504 nova_compute[187152]: 2025-11-29 07:37:37.249 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539504 podman[243729]: 2025-11-29 07:37:37.724614793 +0000 UTC m=+0.058548681 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:37:37 np0005539504 podman[243727]: 2025-11-29 07:37:37.725138898 +0000 UTC m=+0.065552732 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:37:37 np0005539504 podman[243728]: 2025-11-29 07:37:37.732399895 +0000 UTC m=+0.070687121 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Nov 29 02:37:37 np0005539504 nova_compute[187152]: 2025-11-29 07:37:37.795 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:42 np0005539504 nova_compute[187152]: 2025-11-29 07:37:42.252 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:42 np0005539504 podman[243788]: 2025-11-29 07:37:42.711476402 +0000 UTC m=+0.055322715 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:37:42 np0005539504 nova_compute[187152]: 2025-11-29 07:37:42.797 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:42 np0005539504 podman[243789]: 2025-11-29 07:37:42.804704864 +0000 UTC m=+0.142172433 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:37:47 np0005539504 nova_compute[187152]: 2025-11-29 07:37:47.253 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:47 np0005539504 nova_compute[187152]: 2025-11-29 07:37:47.800 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:48 np0005539504 ovn_controller[95182]: 2025-11-29T07:37:48Z|00623|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Nov 29 02:37:50 np0005539504 podman[243838]: 2025-11-29 07:37:50.732996429 +0000 UTC m=+0.072106419 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:37:52 np0005539504 nova_compute[187152]: 2025-11-29 07:37:52.255 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:52 np0005539504 nova_compute[187152]: 2025-11-29 07:37:52.840 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:37:56.434 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:37:56 np0005539504 nova_compute[187152]: 2025-11-29 07:37:56.435 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:37:56.436 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:37:57 np0005539504 nova_compute[187152]: 2025-11-29 07:37:57.257 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:57 np0005539504 podman[243859]: 2025-11-29 07:37:57.728846514 +0000 UTC m=+0.072134951 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:37:57 np0005539504 nova_compute[187152]: 2025-11-29 07:37:57.843 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:02 np0005539504 nova_compute[187152]: 2025-11-29 07:38:02.295 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:02 np0005539504 nova_compute[187152]: 2025-11-29 07:38:02.846 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:02 np0005539504 nova_compute[187152]: 2025-11-29 07:38:02.909 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "37b4277c-4278-4837-a7fa-e4ef827f1078" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:02 np0005539504 nova_compute[187152]: 2025-11-29 07:38:02.910 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:03 np0005539504 nova_compute[187152]: 2025-11-29 07:38:03.477 187156 DEBUG nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:38:03 np0005539504 nova_compute[187152]: 2025-11-29 07:38:03.677 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:03 np0005539504 nova_compute[187152]: 2025-11-29 07:38:03.677 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:03 np0005539504 nova_compute[187152]: 2025-11-29 07:38:03.688 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:38:03 np0005539504 nova_compute[187152]: 2025-11-29 07:38:03.689 187156 INFO nova.compute.claims [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:38:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:04.440 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.526 187156 DEBUG nova.compute.provider_tree [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.555 187156 DEBUG nova.scheduler.client.report [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.585 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.907s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.586 187156 DEBUG nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.639 187156 DEBUG nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.640 187156 DEBUG nova.network.neutron [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.654 187156 INFO nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.674 187156 DEBUG nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.783 187156 DEBUG nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.785 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.786 187156 INFO nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Creating image(s)#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.787 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "/var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.787 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "/var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.788 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "/var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.806 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.885 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.887 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.887 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.899 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.960 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:38:04 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.961 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:04.999 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.001 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.001 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.057 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.058 187156 DEBUG nova.virt.disk.api [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Checking if we can resize image /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.058 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.127 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.128 187156 DEBUG nova.virt.disk.api [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Cannot resize image /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.129 187156 DEBUG nova.objects.instance [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lazy-loading 'migration_context' on Instance uuid 37b4277c-4278-4837-a7fa-e4ef827f1078 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.145 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.146 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Ensure instance console log exists: /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.146 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.146 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.147 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.319 187156 DEBUG nova.policy [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.700 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.701 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:38:05 np0005539504 nova_compute[187152]: 2025-11-29 07:38:05.721 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:38:07 np0005539504 nova_compute[187152]: 2025-11-29 07:38:07.297 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:07 np0005539504 nova_compute[187152]: 2025-11-29 07:38:07.888 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:08 np0005539504 podman[243895]: 2025-11-29 07:38:08.735577982 +0000 UTC m=+0.063505626 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:38:08 np0005539504 podman[243896]: 2025-11-29 07:38:08.738894582 +0000 UTC m=+0.064151804 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:38:08 np0005539504 podman[243897]: 2025-11-29 07:38:08.749437398 +0000 UTC m=+0.072127460 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:38:08 np0005539504 nova_compute[187152]: 2025-11-29 07:38:08.959 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:09 np0005539504 nova_compute[187152]: 2025-11-29 07:38:09.389 187156 DEBUG nova.network.neutron [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Successfully created port: 541e9918-b113-477f-b173-6e0844275c91 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:38:11 np0005539504 nova_compute[187152]: 2025-11-29 07:38:11.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.245 187156 DEBUG nova.network.neutron [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Successfully updated port: 541e9918-b113-477f-b173-6e0844275c91 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.289 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.290 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquired lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.290 187156 DEBUG nova.network.neutron [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.301 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.429 187156 DEBUG nova.compute.manager [req-2991f2bf-073e-4398-8d6d-e702e6c6d5e4 req-7522b903-092a-4065-8ed2-618782025638 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received event network-changed-541e9918-b113-477f-b173-6e0844275c91 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.430 187156 DEBUG nova.compute.manager [req-2991f2bf-073e-4398-8d6d-e702e6c6d5e4 req-7522b903-092a-4065-8ed2-618782025638 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Refreshing instance network info cache due to event network-changed-541e9918-b113-477f-b173-6e0844275c91. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.430 187156 DEBUG oslo_concurrency.lockutils [req-2991f2bf-073e-4398-8d6d-e702e6c6d5e4 req-7522b903-092a-4065-8ed2-618782025638 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.466 187156 DEBUG nova.network.neutron [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:38:12 np0005539504 nova_compute[187152]: 2025-11-29 07:38:12.891 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:13 np0005539504 podman[243956]: 2025-11-29 07:38:13.722175623 +0000 UTC m=+0.060902504 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:38:13 np0005539504 podman[243957]: 2025-11-29 07:38:13.775162272 +0000 UTC m=+0.103731968 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.526 187156 DEBUG nova.network.neutron [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updating instance_info_cache with network_info: [{"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.560 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Releasing lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.561 187156 DEBUG nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Instance network_info: |[{"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.562 187156 DEBUG oslo_concurrency.lockutils [req-2991f2bf-073e-4398-8d6d-e702e6c6d5e4 req-7522b903-092a-4065-8ed2-618782025638 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.563 187156 DEBUG nova.network.neutron [req-2991f2bf-073e-4398-8d6d-e702e6c6d5e4 req-7522b903-092a-4065-8ed2-618782025638 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Refreshing network info cache for port 541e9918-b113-477f-b173-6e0844275c91 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.567 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Start _get_guest_xml network_info=[{"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.574 187156 WARNING nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.583 187156 DEBUG nova.virt.libvirt.host [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.585 187156 DEBUG nova.virt.libvirt.host [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.593 187156 DEBUG nova.virt.libvirt.host [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.594 187156 DEBUG nova.virt.libvirt.host [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.595 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.596 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.596 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.596 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.597 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.597 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.597 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.597 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.597 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.598 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.598 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.598 187156 DEBUG nova.virt.hardware [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.603 187156 DEBUG nova.virt.libvirt.vif [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1621430489',display_name='tempest-TestSnapshotPattern-server-1621430489',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1621430489',id=153,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGeGkco1TqFqMrIhoEE73ZhDAmJJAptCCvJ7Qwh9Z7UI8dpQ1Egx0vm1HBJKnTjax+KcM/ISl5nzEEt0JBuJsVC3CZ+KIvxrsxwf2GDJ8t8s6c8ZHa76XQJmPLVjwVs32w==',key_name='tempest-TestSnapshotPattern-1847051545',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='66de5f0713944c28a20250b7fbccc130',ramdisk_id='',reservation_id='r-q3ckjdh2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-737443016',owner_user_name='tempest-TestSnapshotPattern-737443016-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:38:04Z,user_data=None,user_id='b921ad1454834bae9b706b9fa53948b3',uuid=37b4277c-4278-4837-a7fa-e4ef827f1078,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.603 187156 DEBUG nova.network.os_vif_util [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converting VIF {"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.604 187156 DEBUG nova.network.os_vif_util [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:d8:ce,bridge_name='br-int',has_traffic_filtering=True,id=541e9918-b113-477f-b173-6e0844275c91,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541e9918-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.605 187156 DEBUG nova.objects.instance [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lazy-loading 'pci_devices' on Instance uuid 37b4277c-4278-4837-a7fa-e4ef827f1078 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.622 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <uuid>37b4277c-4278-4837-a7fa-e4ef827f1078</uuid>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <name>instance-00000099</name>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestSnapshotPattern-server-1621430489</nova:name>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:38:14</nova:creationTime>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:        <nova:user uuid="b921ad1454834bae9b706b9fa53948b3">tempest-TestSnapshotPattern-737443016-project-member</nova:user>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:        <nova:project uuid="66de5f0713944c28a20250b7fbccc130">tempest-TestSnapshotPattern-737443016</nova:project>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:        <nova:port uuid="541e9918-b113-477f-b173-6e0844275c91">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <entry name="serial">37b4277c-4278-4837-a7fa-e4ef827f1078</entry>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <entry name="uuid">37b4277c-4278-4837-a7fa-e4ef827f1078</entry>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk.config"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:d4:d8:ce"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <target dev="tap541e9918-b1"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/console.log" append="off"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:38:14 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:38:14 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:38:14 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:38:14 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.624 187156 DEBUG nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Preparing to wait for external event network-vif-plugged-541e9918-b113-477f-b173-6e0844275c91 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.624 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.624 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.624 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.625 187156 DEBUG nova.virt.libvirt.vif [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1621430489',display_name='tempest-TestSnapshotPattern-server-1621430489',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1621430489',id=153,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGeGkco1TqFqMrIhoEE73ZhDAmJJAptCCvJ7Qwh9Z7UI8dpQ1Egx0vm1HBJKnTjax+KcM/ISl5nzEEt0JBuJsVC3CZ+KIvxrsxwf2GDJ8t8s6c8ZHa76XQJmPLVjwVs32w==',key_name='tempest-TestSnapshotPattern-1847051545',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='66de5f0713944c28a20250b7fbccc130',ramdisk_id='',reservation_id='r-q3ckjdh2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-737443016',owner_user_name='tempest-TestSnapshotPattern-737443016-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:38:04Z,user_data=None,user_id='b921ad1454834bae9b706b9fa53948b3',uuid=37b4277c-4278-4837-a7fa-e4ef827f1078,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.625 187156 DEBUG nova.network.os_vif_util [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converting VIF {"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.626 187156 DEBUG nova.network.os_vif_util [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:d8:ce,bridge_name='br-int',has_traffic_filtering=True,id=541e9918-b113-477f-b173-6e0844275c91,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541e9918-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.626 187156 DEBUG os_vif [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:d8:ce,bridge_name='br-int',has_traffic_filtering=True,id=541e9918-b113-477f-b173-6e0844275c91,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541e9918-b1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.627 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.627 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.628 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.630 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.630 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap541e9918-b1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.630 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap541e9918-b1, col_values=(('external_ids', {'iface-id': '541e9918-b113-477f-b173-6e0844275c91', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:d8:ce', 'vm-uuid': '37b4277c-4278-4837-a7fa-e4ef827f1078'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.632 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:14 np0005539504 NetworkManager[55210]: <info>  [1764401894.6340] manager: (tap541e9918-b1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.634 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.641 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.641 187156 INFO os_vif [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:d8:ce,bridge_name='br-int',has_traffic_filtering=True,id=541e9918-b113-477f-b173-6e0844275c91,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541e9918-b1')#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.691 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.691 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.692 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] No VIF found with MAC fa:16:3e:d4:d8:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:38:14 np0005539504 nova_compute[187152]: 2025-11-29 07:38:14.692 187156 INFO nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Using config drive#033[00m
Nov 29 02:38:15 np0005539504 nova_compute[187152]: 2025-11-29 07:38:15.557 187156 INFO nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Creating config drive at /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk.config#033[00m
Nov 29 02:38:15 np0005539504 nova_compute[187152]: 2025-11-29 07:38:15.564 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz4cf9894 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:38:15 np0005539504 nova_compute[187152]: 2025-11-29 07:38:15.694 187156 DEBUG oslo_concurrency.processutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz4cf9894" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:38:15 np0005539504 kernel: tap541e9918-b1: entered promiscuous mode
Nov 29 02:38:15 np0005539504 NetworkManager[55210]: <info>  [1764401895.7834] manager: (tap541e9918-b1): new Tun device (/org/freedesktop/NetworkManager/Devices/274)
Nov 29 02:38:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:38:15Z|00624|binding|INFO|Claiming lport 541e9918-b113-477f-b173-6e0844275c91 for this chassis.
Nov 29 02:38:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:38:15Z|00625|binding|INFO|541e9918-b113-477f-b173-6e0844275c91: Claiming fa:16:3e:d4:d8:ce 10.100.0.12
Nov 29 02:38:15 np0005539504 nova_compute[187152]: 2025-11-29 07:38:15.785 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:15 np0005539504 systemd-udevd[244028]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:38:15 np0005539504 systemd-machined[153423]: New machine qemu-81-instance-00000099.
Nov 29 02:38:15 np0005539504 NetworkManager[55210]: <info>  [1764401895.8275] device (tap541e9918-b1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:38:15 np0005539504 NetworkManager[55210]: <info>  [1764401895.8285] device (tap541e9918-b1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.836 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:d8:ce 10.100.0.12'], port_security=['fa:16:3e:d4:d8:ce 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19f12258-2ca1-4bcd-90a1-babd862276cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '66de5f0713944c28a20250b7fbccc130', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd9016a5f-cca6-4f8a-be77-5b7f8d76145d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e91bbf94-b791-4827-92dd-47385d8f6f11, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=541e9918-b113-477f-b173-6e0844275c91) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.840 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 541e9918-b113-477f-b173-6e0844275c91 in datapath 19f12258-2ca1-4bcd-90a1-babd862276cb bound to our chassis#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.841 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19f12258-2ca1-4bcd-90a1-babd862276cb#033[00m
Nov 29 02:38:15 np0005539504 systemd[1]: Started Virtual Machine qemu-81-instance-00000099.
Nov 29 02:38:15 np0005539504 nova_compute[187152]: 2025-11-29 07:38:15.847 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:38:15Z|00626|binding|INFO|Setting lport 541e9918-b113-477f-b173-6e0844275c91 ovn-installed in OVS
Nov 29 02:38:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:38:15Z|00627|binding|INFO|Setting lport 541e9918-b113-477f-b173-6e0844275c91 up in Southbound
Nov 29 02:38:15 np0005539504 nova_compute[187152]: 2025-11-29 07:38:15.853 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.858 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5d3eec-c907-4bed-9ba6-5f40e48fd1ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.859 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19f12258-21 in ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.862 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19f12258-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.862 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4038beac-8e4e-4469-b7b3-4b85169672b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.863 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[20694b45-2de1-4a81-80da-7d3bc06941d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.876 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[03144942-56fb-4584-8dd7-5a7b7de65e86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.894 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6398a8d7-a20a-4527-bb0d-b761dd0f61f1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.942 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f62b1a14-9d15-431d-8c9a-a4bd299da408]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.952 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2bfb52a5-de0a-41a9-bb53-d6f24582cbac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:15 np0005539504 NetworkManager[55210]: <info>  [1764401895.9540] manager: (tap19f12258-20): new Veth device (/org/freedesktop/NetworkManager/Devices/275)
Nov 29 02:38:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:15.996 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4c655f4c-fa51-4d6b-b6ea-bb7ac5115426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.002 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[c5662687-6902-43e4-a21d-a3235b5a50e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:16 np0005539504 NetworkManager[55210]: <info>  [1764401896.0275] device (tap19f12258-20): carrier: link connected
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.033 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[7470ce04-a63f-4702-85f8-445b67a6b110]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.055 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[01c2dd05-70be-49b0-ac62-02387e7d4c2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19f12258-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:47:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 735890, 'reachable_time': 35229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244062, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.073 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf6df82-aafa-413f-926a-71bc0928d021]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:47dc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 735890, 'tstamp': 735890}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244063, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.097 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6aca0456-c951-4b62-825a-94d4190f98bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19f12258-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:47:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 735890, 'reachable_time': 35229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244064, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.131 187156 DEBUG nova.compute.manager [req-a3773139-e411-4080-b31c-90fa1297917c req-182628c7-eda3-4cbe-be63-44daceae5064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received event network-vif-plugged-541e9918-b113-477f-b173-6e0844275c91 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.133 187156 DEBUG oslo_concurrency.lockutils [req-a3773139-e411-4080-b31c-90fa1297917c req-182628c7-eda3-4cbe-be63-44daceae5064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.133 187156 DEBUG oslo_concurrency.lockutils [req-a3773139-e411-4080-b31c-90fa1297917c req-182628c7-eda3-4cbe-be63-44daceae5064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.133 187156 DEBUG oslo_concurrency.lockutils [req-a3773139-e411-4080-b31c-90fa1297917c req-182628c7-eda3-4cbe-be63-44daceae5064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.134 187156 DEBUG nova.compute.manager [req-a3773139-e411-4080-b31c-90fa1297917c req-182628c7-eda3-4cbe-be63-44daceae5064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Processing event network-vif-plugged-541e9918-b113-477f-b173-6e0844275c91 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.139 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[27e5c79b-e70a-4d41-b984-454e15e34663]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.204 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[8aaf8d26-f71e-4454-a6b5-03485772e14c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.207 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19f12258-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.207 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.207 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19f12258-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.209 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:16 np0005539504 NetworkManager[55210]: <info>  [1764401896.2106] manager: (tap19f12258-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Nov 29 02:38:16 np0005539504 kernel: tap19f12258-20: entered promiscuous mode
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.213 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19f12258-20, col_values=(('external_ids', {'iface-id': 'be18324a-44e1-4916-a63f-b1a7efeb6fb9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.214 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:16 np0005539504 ovn_controller[95182]: 2025-11-29T07:38:16Z|00628|binding|INFO|Releasing lport be18324a-44e1-4916-a63f-b1a7efeb6fb9 from this chassis (sb_readonly=0)
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.215 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.216 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19f12258-2ca1-4bcd-90a1-babd862276cb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19f12258-2ca1-4bcd-90a1-babd862276cb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.217 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b8de1235-6452-4a9b-b419-64f2fffe8519]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.218 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-19f12258-2ca1-4bcd-90a1-babd862276cb
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/19f12258-2ca1-4bcd-90a1-babd862276cb.pid.haproxy
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 19f12258-2ca1-4bcd-90a1-babd862276cb
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:38:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:16.219 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'env', 'PROCESS_TAG=haproxy-19f12258-2ca1-4bcd-90a1-babd862276cb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19f12258-2ca1-4bcd-90a1-babd862276cb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.228 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.350 187156 DEBUG nova.network.neutron [req-2991f2bf-073e-4398-8d6d-e702e6c6d5e4 req-7522b903-092a-4065-8ed2-618782025638 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updated VIF entry in instance network info cache for port 541e9918-b113-477f-b173-6e0844275c91. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.361 187156 DEBUG nova.network.neutron [req-2991f2bf-073e-4398-8d6d-e702e6c6d5e4 req-7522b903-092a-4065-8ed2-618782025638 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updating instance_info_cache with network_info: [{"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.400 187156 DEBUG oslo_concurrency.lockutils [req-2991f2bf-073e-4398-8d6d-e702e6c6d5e4 req-7522b903-092a-4065-8ed2-618782025638 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.600 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401896.5998192, 37b4277c-4278-4837-a7fa-e4ef827f1078 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.603 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] VM Started (Lifecycle Event)#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.606 187156 DEBUG nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.622 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.628 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.630 187156 INFO nova.virt.libvirt.driver [-] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Instance spawned successfully.#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.631 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.635 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.654 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.655 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401896.5999403, 37b4277c-4278-4837-a7fa-e4ef827f1078 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.655 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.660 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:38:16 np0005539504 podman[244102]: 2025-11-29 07:38:16.660784362 +0000 UTC m=+0.079682365 container create 0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.660 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.661 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.662 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.662 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.663 187156 DEBUG nova.virt.libvirt.driver [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.686 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.699 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401896.6096263, 37b4277c-4278-4837-a7fa-e4ef827f1078 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.700 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:38:16 np0005539504 podman[244102]: 2025-11-29 07:38:16.609461258 +0000 UTC m=+0.028359291 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:38:16 np0005539504 systemd[1]: Started libpod-conmon-0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57.scope.
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.725 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.731 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.749 187156 INFO nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Took 11.96 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.750 187156 DEBUG nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.759 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:38:16 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:38:16 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d0f9edcd0e869e0776ba229916bb15c4d4a6e8dfe81afb766a9693ccf02ca88/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:38:16 np0005539504 podman[244102]: 2025-11-29 07:38:16.787428341 +0000 UTC m=+0.206326364 container init 0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:38:16 np0005539504 podman[244102]: 2025-11-29 07:38:16.798673176 +0000 UTC m=+0.217571179 container start 0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:38:16 np0005539504 neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb[244118]: [NOTICE]   (244122) : New worker (244124) forked
Nov 29 02:38:16 np0005539504 neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb[244118]: [NOTICE]   (244122) : Loading success.
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.843 187156 INFO nova.compute.manager [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Took 13.24 seconds to build instance.#033[00m
Nov 29 02:38:16 np0005539504 nova_compute[187152]: 2025-11-29 07:38:16.901 187156 DEBUG oslo_concurrency.lockutils [None req-b6085d46-f33e-4ee2-a0c9-752c780c0892 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:17 np0005539504 nova_compute[187152]: 2025-11-29 07:38:17.341 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:18 np0005539504 nova_compute[187152]: 2025-11-29 07:38:18.400 187156 DEBUG nova.compute.manager [req-1e2c82fd-27bf-4f09-aacc-2e3cca036aaa req-b92024c8-af0e-4def-8fed-481c0f7c1a1a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received event network-vif-plugged-541e9918-b113-477f-b173-6e0844275c91 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:38:18 np0005539504 nova_compute[187152]: 2025-11-29 07:38:18.401 187156 DEBUG oslo_concurrency.lockutils [req-1e2c82fd-27bf-4f09-aacc-2e3cca036aaa req-b92024c8-af0e-4def-8fed-481c0f7c1a1a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:18 np0005539504 nova_compute[187152]: 2025-11-29 07:38:18.401 187156 DEBUG oslo_concurrency.lockutils [req-1e2c82fd-27bf-4f09-aacc-2e3cca036aaa req-b92024c8-af0e-4def-8fed-481c0f7c1a1a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:18 np0005539504 nova_compute[187152]: 2025-11-29 07:38:18.402 187156 DEBUG oslo_concurrency.lockutils [req-1e2c82fd-27bf-4f09-aacc-2e3cca036aaa req-b92024c8-af0e-4def-8fed-481c0f7c1a1a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:18 np0005539504 nova_compute[187152]: 2025-11-29 07:38:18.402 187156 DEBUG nova.compute.manager [req-1e2c82fd-27bf-4f09-aacc-2e3cca036aaa req-b92024c8-af0e-4def-8fed-481c0f7c1a1a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] No waiting events found dispatching network-vif-plugged-541e9918-b113-477f-b173-6e0844275c91 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:38:18 np0005539504 nova_compute[187152]: 2025-11-29 07:38:18.403 187156 WARNING nova.compute.manager [req-1e2c82fd-27bf-4f09-aacc-2e3cca036aaa req-b92024c8-af0e-4def-8fed-481c0f7c1a1a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received unexpected event network-vif-plugged-541e9918-b113-477f-b173-6e0844275c91 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:38:18 np0005539504 nova_compute[187152]: 2025-11-29 07:38:18.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:19 np0005539504 nova_compute[187152]: 2025-11-29 07:38:19.636 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:21 np0005539504 podman[244133]: 2025-11-29 07:38:21.771072183 +0000 UTC m=+0.108874127 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 02:38:22 np0005539504 nova_compute[187152]: 2025-11-29 07:38:22.346 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:22.985 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:22.986 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:22.988 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:23 np0005539504 nova_compute[187152]: 2025-11-29 07:38:23.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:23 np0005539504 nova_compute[187152]: 2025-11-29 07:38:23.940 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:23 np0005539504 nova_compute[187152]: 2025-11-29 07:38:23.941 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:38:24 np0005539504 nova_compute[187152]: 2025-11-29 07:38:24.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:24 np0005539504 nova_compute[187152]: 2025-11-29 07:38:24.826 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:24 np0005539504 NetworkManager[55210]: <info>  [1764401904.8265] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Nov 29 02:38:24 np0005539504 NetworkManager[55210]: <info>  [1764401904.8279] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Nov 29 02:38:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:38:25Z|00629|binding|INFO|Releasing lport be18324a-44e1-4916-a63f-b1a7efeb6fb9 from this chassis (sb_readonly=0)
Nov 29 02:38:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:38:25Z|00630|binding|INFO|Releasing lport be18324a-44e1-4916-a63f-b1a7efeb6fb9 from this chassis (sb_readonly=0)
Nov 29 02:38:25 np0005539504 nova_compute[187152]: 2025-11-29 07:38:25.154 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:26 np0005539504 nova_compute[187152]: 2025-11-29 07:38:26.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:26 np0005539504 nova_compute[187152]: 2025-11-29 07:38:26.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:38:26 np0005539504 nova_compute[187152]: 2025-11-29 07:38:26.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:38:27 np0005539504 nova_compute[187152]: 2025-11-29 07:38:27.345 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:28 np0005539504 podman[244157]: 2025-11-29 07:38:28.734881637 +0000 UTC m=+0.075871351 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:38:29 np0005539504 nova_compute[187152]: 2025-11-29 07:38:29.283 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:38:29 np0005539504 nova_compute[187152]: 2025-11-29 07:38:29.285 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:38:29 np0005539504 nova_compute[187152]: 2025-11-29 07:38:29.286 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:38:29 np0005539504 nova_compute[187152]: 2025-11-29 07:38:29.286 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 37b4277c-4278-4837-a7fa-e4ef827f1078 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:38:29 np0005539504 nova_compute[187152]: 2025-11-29 07:38:29.361 187156 DEBUG nova.compute.manager [req-f163a8da-7daf-49e0-91f2-068ac601b1b0 req-6d70f134-91e3-4446-9274-06a7da13de86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received event network-changed-541e9918-b113-477f-b173-6e0844275c91 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:38:29 np0005539504 nova_compute[187152]: 2025-11-29 07:38:29.362 187156 DEBUG nova.compute.manager [req-f163a8da-7daf-49e0-91f2-068ac601b1b0 req-6d70f134-91e3-4446-9274-06a7da13de86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Refreshing instance network info cache due to event network-changed-541e9918-b113-477f-b173-6e0844275c91. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:38:29 np0005539504 nova_compute[187152]: 2025-11-29 07:38:29.362 187156 DEBUG oslo_concurrency.lockutils [req-f163a8da-7daf-49e0-91f2-068ac601b1b0 req-6d70f134-91e3-4446-9274-06a7da13de86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:38:29 np0005539504 nova_compute[187152]: 2025-11-29 07:38:29.673 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:32 np0005539504 nova_compute[187152]: 2025-11-29 07:38:32.348 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:33 np0005539504 ovn_controller[95182]: 2025-11-29T07:38:33Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:d8:ce 10.100.0.12
Nov 29 02:38:33 np0005539504 ovn_controller[95182]: 2025-11-29T07:38:33Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:d8:ce 10.100.0.12
Nov 29 02:38:33 np0005539504 nova_compute[187152]: 2025-11-29 07:38:33.710 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updating instance_info_cache with network_info: [{"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.002 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.004 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.005 187156 DEBUG oslo_concurrency.lockutils [req-f163a8da-7daf-49e0-91f2-068ac601b1b0 req-6d70f134-91e3-4446-9274-06a7da13de86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.006 187156 DEBUG nova.network.neutron [req-f163a8da-7daf-49e0-91f2-068ac601b1b0 req-6d70f134-91e3-4446-9274-06a7da13de86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Refreshing network info cache for port 541e9918-b113-477f-b173-6e0844275c91 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.009 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.011 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.098 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.099 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.099 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.099 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.508 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.576 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.577 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.659 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.677 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.838 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.840 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5512MB free_disk=73.0459976196289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.840 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:38:34 np0005539504 nova_compute[187152]: 2025-11-29 07:38:34.841 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.114 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 37b4277c-4278-4837-a7fa-e4ef827f1078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.115 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.115 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.155 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.248 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.468 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.469 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.689 187156 DEBUG nova.network.neutron [req-f163a8da-7daf-49e0-91f2-068ac601b1b0 req-6d70f134-91e3-4446-9274-06a7da13de86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updated VIF entry in instance network info cache for port 541e9918-b113-477f-b173-6e0844275c91. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.691 187156 DEBUG nova.network.neutron [req-f163a8da-7daf-49e0-91f2-068ac601b1b0 req-6d70f134-91e3-4446-9274-06a7da13de86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updating instance_info_cache with network_info: [{"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:38:35 np0005539504 nova_compute[187152]: 2025-11-29 07:38:35.709 187156 DEBUG oslo_concurrency.lockutils [req-f163a8da-7daf-49e0-91f2-068ac601b1b0 req-6d70f134-91e3-4446-9274-06a7da13de86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:38:37 np0005539504 nova_compute[187152]: 2025-11-29 07:38:37.351 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:37 np0005539504 nova_compute[187152]: 2025-11-29 07:38:37.395 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:39 np0005539504 nova_compute[187152]: 2025-11-29 07:38:39.681 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:39 np0005539504 podman[244214]: 2025-11-29 07:38:39.73506361 +0000 UTC m=+0.061948603 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 02:38:39 np0005539504 podman[244212]: 2025-11-29 07:38:39.737689301 +0000 UTC m=+0.069350564 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:38:39 np0005539504 podman[244213]: 2025-11-29 07:38:39.760134841 +0000 UTC m=+0.089905352 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, architecture=x86_64, 
maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Nov 29 02:38:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:41.768 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:38:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:41.769 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 02:38:41 np0005539504 nova_compute[187152]: 2025-11-29 07:38:41.801 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:41 np0005539504 nova_compute[187152]: 2025-11-29 07:38:41.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:42 np0005539504 nova_compute[187152]: 2025-11-29 07:38:42.354 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:42 np0005539504 nova_compute[187152]: 2025-11-29 07:38:42.796 187156 DEBUG nova.compute.manager [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:38:42 np0005539504 nova_compute[187152]: 2025-11-29 07:38:42.884 187156 INFO nova.compute.manager [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] instance snapshotting
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.110 187156 INFO nova.virt.libvirt.driver [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Beginning live snapshot process
Nov 29 02:38:43 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.336 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.402 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.403 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.490 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json -f qcow2" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.505 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.571 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.573 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpxs3qp4q1/eed864503003490aa57b158c2754406a.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.756 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpxs3qp4q1/eed864503003490aa57b158c2754406a.delta 1073741824" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.758 187156 INFO nova.virt.libvirt.driver [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 29 02:38:43 np0005539504 nova_compute[187152]: 2025-11-29 07:38:43.846 187156 DEBUG nova.virt.libvirt.guest [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 29 02:38:43 np0005539504 podman[244291]: 2025-11-29 07:38:43.866937841 +0000 UTC m=+0.066452326 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:38:43 np0005539504 podman[244293]: 2025-11-29 07:38:43.906026162 +0000 UTC m=+0.102905215 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:38:44 np0005539504 nova_compute[187152]: 2025-11-29 07:38:44.350 187156 DEBUG nova.virt.libvirt.guest [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 29 02:38:44 np0005539504 nova_compute[187152]: 2025-11-29 07:38:44.355 187156 INFO nova.virt.libvirt.driver [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 29 02:38:44 np0005539504 nova_compute[187152]: 2025-11-29 07:38:44.401 187156 DEBUG nova.privsep.utils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 02:38:44 np0005539504 nova_compute[187152]: 2025-11-29 07:38:44.402 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpxs3qp4q1/eed864503003490aa57b158c2754406a.delta /var/lib/nova/instances/snapshots/tmpxs3qp4q1/eed864503003490aa57b158c2754406a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:38:44 np0005539504 nova_compute[187152]: 2025-11-29 07:38:44.683 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:46 np0005539504 nova_compute[187152]: 2025-11-29 07:38:46.359 187156 DEBUG oslo_concurrency.processutils [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpxs3qp4q1/eed864503003490aa57b158c2754406a.delta /var/lib/nova/instances/snapshots/tmpxs3qp4q1/eed864503003490aa57b158c2754406a" returned: 0 in 1.958s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:38:46 np0005539504 nova_compute[187152]: 2025-11-29 07:38:46.367 187156 INFO nova.virt.libvirt.driver [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Snapshot extracted, beginning image upload
Nov 29 02:38:47 np0005539504 nova_compute[187152]: 2025-11-29 07:38:47.357 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:47.991 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'name': 'tempest-TestSnapshotPattern-server-1621430489', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000099', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '66de5f0713944c28a20250b7fbccc130', 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'hostId': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:38:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:47.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.022 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.read.latency volume: 565611355 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.024 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.read.latency volume: 23470602 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6cb11d7-b4df-4009-a639-c2fc40736d9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 565611355, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-vda', 'timestamp': '2025-11-29T07:38:47.993653', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711d1a42-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': 'b92eb1a779a9625e5d99b568954964192d04fcb5a170300a4dbf0eebbcc1ee50'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23470602, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': 
None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-sda', 'timestamp': '2025-11-29T07:38:47.993653', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '711d47ba-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': 'a221793e7b5c4bbc05917bc40e8d46101536f457293f7b292e1e3ae0e0f5be24'}]}, 'timestamp': '2025-11-29 07:38:48.025429', '_unique_id': 'f04f8794c9ce460eaf757c24640b3cf4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.037 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 37b4277c-4278-4837-a7fa-e4ef827f1078 / tap541e9918-b1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.037 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.incoming.packets volume: 58 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db382347-e55d-47b8-aafa-9c343a078198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 58, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.033792', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '711f49a2-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': '529ae15ed0d4816d0ea86fcd4d36c80aa69a2b92bc6146c5095e1194ea722256'}]}, 'timestamp': '2025-11-29 07:38:48.038493', '_unique_id': 'f7c3951ff3834ba19c99b6326da81f2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.039 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.041 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.041 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ac130a8-8142-407d-948e-e74840e9e4db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.041137', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '711fc4c2-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': '81d6a1c1e1f1be86d92aac09b6166afe3e857bb0cc8d948f99c9ac4730eddeab'}]}, 'timestamp': '2025-11-29 07:38:48.041545', '_unique_id': '3eeadf53a15c4b268e90e07f2123e762'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.043 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f5a5486-f8c7-471c-82d4-ed3748e3fb74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.043277', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '71201634-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': '8ed1c535e28e6636022656a95b2aad82e5b7e0a565173b01b8110922a92d919e'}]}, 'timestamp': '2025-11-29 07:38:48.043591', '_unique_id': 'ed4475b337f440439f65a01802f276bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.057 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.057 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db79c5cd-0539-4b51-8ab7-81a58bbf4a80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-vda', 'timestamp': '2025-11-29T07:38:48.045725', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71224256-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.980430389, 'message_signature': '43c5dc1897d7c6379e5d20f0d19eaf76c69348b81d112c13975f4eb393fbf19d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-sda', 'timestamp': '2025-11-29T07:38:48.045725', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71225232-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.980430389, 'message_signature': 'bb415e4d4682a333ed31ae9a92be86194b6e4da320e84b611a973aff28abf457'}]}, 'timestamp': '2025-11-29 07:38:48.058240', '_unique_id': '07fb92cb5c0144a4a9131c95f8a83da5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.060 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.060 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27fec2dc-eec4-470d-9281-a722eb1ca6a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.060674', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '7122bd58-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': '3dd4b59890aefce5d85cb7053ccc9b63c3220cb82b704fd64700d7628f45109a'}]}, 'timestamp': '2025-11-29 07:38:48.061006', '_unique_id': 'aaebde8acb6d492c942e913707f7252f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.062 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.062 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.062 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-1621430489>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-1621430489>]
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.063 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.063 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '821e0bbf-ec2e-4ceb-aa76-a0d3e88bb29d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-vda', 'timestamp': '2025-11-29T07:38:48.063314', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712325fe-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.980430389, 'message_signature': '7e1dee01fd2ac2ffa82e825c60a5d867b1242d204799a5d8ab8235d7dc4e23ef'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-sda', 'timestamp': '2025-11-29T07:38:48.063314', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712331f2-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.980430389, 'message_signature': '2a6b2b826af71df510402781f5882b947d6641d7da8cc0e0bfa490de4416bd8b'}]}, 'timestamp': '2025-11-29 07:38:48.063948', '_unique_id': '1006c15c989b43c2a10943fbc3fabca3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.064 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.065 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.065 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.write.requests volume: 339 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.065 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57b3c45b-5ffb-4bea-96b4-bcfc5a54571a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 339, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-vda', 'timestamp': '2025-11-29T07:38:48.065603', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71237e32-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': 'e79fce4d9e47eed7e11fede8392cf1f6e03ea8e574aaad48ece14a93169bfc26'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-sda', 'timestamp': '2025-11-29T07:38:48.065603', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712387f6-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': 'f8d0d2ec7573a9d61d4fe4951164d1e12a02172da82f20ed67b210a572cb8b46'}]}, 'timestamp': '2025-11-29 07:38:48.066145', '_unique_id': '770ea60f7a4c4ef088605770007e22bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.066 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.067 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.067 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.incoming.bytes volume: 10389 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c527302e-da96-4302-8d8b-6c7cb3be0e45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10389, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.067869', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '7123d74c-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': '089ec85a23250fe8222fed658e975785d29388ecf0e5cf5f011a88ec60a47a32'}]}, 'timestamp': '2025-11-29 07:38:48.068228', '_unique_id': 'b0d4ecfc00274136a1507355f7d0d32d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.068 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.069 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.070 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.read.bytes volume: 30218752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.070 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5783ba75-627e-4822-99a4-6d472d0e2bb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30218752, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-vda', 'timestamp': '2025-11-29T07:38:48.070248', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712439bc-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': 'b20e69368030e89208a6cc649cf3339403b64a4faa73fd88eeb084b2157ac7fa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-sda', 'timestamp': '2025-11-29T07:38:48.070248', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712447d6-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': 'f512f6ccc98f5d781d9d1531eeeadd513771b88ec746c23172d9828390c4cfb8'}]}, 'timestamp': '2025-11-29 07:38:48.071080', '_unique_id': '85bb0a3736484967b062d7d93e2d28e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.074 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ab9fd4b-d8e7-4308-8629-dd7c968a6eec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.074181', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '7124cfc6-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': '7f6e971c4c0e7e06e28d3b200261ff2b162021c7d14cbae1b997187592406cab'}]}, 'timestamp': '2025-11-29 07:38:48.074573', '_unique_id': '8e9f2aacde714bce9e03483c7db3c12c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.076 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.076 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-1621430489>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-1621430489>]
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.077 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.077 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-1621430489>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-1621430489>]
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.077 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.077 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.077 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-1621430489>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-1621430489>]
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.077 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.077 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.write.latency volume: 16460677656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.078 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '694bf920-6e8b-485b-b764-f0e8ea268cdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16460677656, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-vda', 'timestamp': '2025-11-29T07:38:48.077872', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71255cca-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': 'de7e1e260357bdba6d09ff0c0ea3ccafc507fd3d165e56fd94af58ade3d0e7de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-sda', 'timestamp': '2025-11-29T07:38:48.077872', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71256706-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': 'b928cb2692056a898a73e1aa615fd39acb263a59fbc32e923a0a6ae6825fb8cc'}]}, 'timestamp': '2025-11-29 07:38:48.078437', '_unique_id': 'c690f1b4b69a4302a0bc11e53aa904c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.079 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.080 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.080 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.outgoing.bytes volume: 8228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e73a0894-9510-4ff8-9dad-7ae0f9905de4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8228, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.080276', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '7125bb66-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': '8110c9aa195c58ef6cacb10dfe7819d976e417c3d86503047211792f18d23769'}]}, 'timestamp': '2025-11-29 07:38:48.080602', '_unique_id': '5d972ca97df04e45acb0a02f29144f38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.082 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11fe954b-418d-4dbd-9b75-96e88880b7cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.082239', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '712608f0-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': '932c5f6bb5ad56ca93cde7ab3ff3dd5712b6148bed603121e76ca295954e70fa'}]}, 'timestamp': '2025-11-29 07:38:48.082606', '_unique_id': 'b14a2eda5e5242d4a8e6ac8145927398'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.084 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.102 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/memory.usage volume: 42.73046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de4babe4-4fc6-49e1-a209-289603d7798e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.73046875, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'timestamp': '2025-11-29T07:38:48.084526', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '71293bce-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7391.037240962, 'message_signature': '1da829706d954ef8c43230b8fe313e076e7aa173d16233fe73ff13b14f5168ab'}]}, 'timestamp': '2025-11-29 07:38:48.103701', '_unique_id': '5a185876c6164f03b7405562e7547cf9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.106 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7870ec25-79c3-4866-8ae0-5d9959d18ced', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-vda', 'timestamp': '2025-11-29T07:38:48.106736', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7129c4d6-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': '2a6fc1f9c0ef124f00994533c72b1662e087739816c74de5ef9a1a487bc9a260'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-sda', 'timestamp': '2025-11-29T07:38:48.106736', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7129ceea-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': 'a2338c566ec8579a4a67b049c693f130da0a3940a98b2182cc651d586340e0d8'}]}, 'timestamp': '2025-11-29 07:38:48.107288', '_unique_id': '67990b07184145f092058087708e0ebc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.109 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.109 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5870bd8-58f7-4ad6-9f69-b3f18b84802e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-vda', 'timestamp': '2025-11-29T07:38:48.109007', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712a1eea-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.980430389, 'message_signature': '3c6e001816eae5a121cd9e9649fb14c301de4f3871ba6428f508d308a0d38946'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-sda', 'timestamp': '2025-11-29T07:38:48.109007', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712a2a98-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.980430389, 'message_signature': 'd2210bc97fdf69e76152cfdb4f958ff08c6dc8972a9edba99fae6989f43ca55c'}]}, 'timestamp': '2025-11-29 07:38:48.109635', '_unique_id': '6a00efafce984bfcb7cf7d79ab3bc4e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.111 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.write.bytes volume: 73134080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.111 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4eb8bae7-4489-42ae-9229-c2cae59f8ba4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73134080, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-vda', 'timestamp': '2025-11-29T07:38:48.111166', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712a70fc-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': '05d02fc9e59ea8b7b08fb3a72474ebf654b09302baa8160feb0dfcd361368450'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 
'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078-sda', 'timestamp': '2025-11-29T07:38:48.111166', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712a7c8c-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.928377085, 'message_signature': '2c0d7de531ee15aac64ee08a587d48534c45f092f92e47a4e637145f8b7b7227'}]}, 'timestamp': '2025-11-29 07:38:48.111731', '_unique_id': 'de980aeb74d04ac794d9179d85397283'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.113 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/cpu volume: 12240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a6d443b-3393-438f-b449-ae78c7ca99f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12240000000, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'timestamp': '2025-11-29T07:38:48.113305', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'instance-00000099', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '712ac796-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7391.037240962, 'message_signature': '9e8a44b1c478f75ba8a1c434338b5d2e6794e98846f959ac1893422048ef6a72'}]}, 'timestamp': '2025-11-29 07:38:48.113677', '_unique_id': '32718943ab8c4c6e99f3fb09bcef48dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.115 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.outgoing.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cb54a0d-adc8-4663-95f1-ee56524ee3c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.115369', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '712b1688-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': 'cca81caf210d5bf8ee18311598f018a961b9c5ea95a6c6ec659b6a88dcfd0a32'}]}, 'timestamp': '2025-11-29 07:38:48.115684', '_unique_id': '9ce425a8ac3a4c4eaa0e90c95d49cb6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.117 12 DEBUG ceilometer.compute.pollsters [-] 37b4277c-4278-4837-a7fa-e4ef827f1078/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed89ba71-a1d3-4dba-b72f-03572d08c331', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_name': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_name': None, 'resource_id': 'instance-00000099-37b4277c-4278-4837-a7fa-e4ef827f1078-tap541e9918-b1', 'timestamp': '2025-11-29T07:38:48.117186', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1621430489', 'name': 'tap541e9918-b1', 'instance_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'instance_type': 'm1.nano', 'host': '22de457479ed38b3f3ee35c0400d474539bd4444e387cea5c711c4f4', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d4:d8:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap541e9918-b1'}, 'message_id': '712b5c88-ccf6-11f0-8a11-fa163ea726b4', 'monotonic_time': 7390.968532816, 'message_signature': '44ba4f97548b7d5c41aed6d8d4ebc07bd0aa4fa42657808e0d222897774882fe'}]}, 'timestamp': '2025-11-29 07:38:48.117497', '_unique_id': '13bf63bb875a4e8184b185c9589aed68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:38:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:49 np0005539504 nova_compute[187152]: 2025-11-29 07:38:49.689 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:38:51.771 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:38:52 np0005539504 nova_compute[187152]: 2025-11-29 07:38:52.359 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:52 np0005539504 podman[244356]: 2025-11-29 07:38:52.743421755 +0000 UTC m=+0.079171222 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:38:53 np0005539504 nova_compute[187152]: 2025-11-29 07:38:53.730 187156 INFO nova.virt.libvirt.driver [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Snapshot image upload complete#033[00m
Nov 29 02:38:53 np0005539504 nova_compute[187152]: 2025-11-29 07:38:53.731 187156 INFO nova.compute.manager [None req-416c665f-4628-4eba-bc21-49dda020f59b b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Took 10.83 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 02:38:54 np0005539504 nova_compute[187152]: 2025-11-29 07:38:54.690 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:57 np0005539504 nova_compute[187152]: 2025-11-29 07:38:57.361 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:59 np0005539504 nova_compute[187152]: 2025-11-29 07:38:59.694 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:59 np0005539504 podman[244375]: 2025-11-29 07:38:59.766782965 +0000 UTC m=+0.090530709 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:39:02 np0005539504 nova_compute[187152]: 2025-11-29 07:39:02.421 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:04 np0005539504 nova_compute[187152]: 2025-11-29 07:39:04.746 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:07 np0005539504 nova_compute[187152]: 2025-11-29 07:39:07.424 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:09 np0005539504 nova_compute[187152]: 2025-11-29 07:39:09.751 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:09 np0005539504 nova_compute[187152]: 2025-11-29 07:39:09.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:10 np0005539504 podman[244399]: 2025-11-29 07:39:10.73542755 +0000 UTC m=+0.057423570 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 29 02:39:10 np0005539504 podman[244397]: 2025-11-29 07:39:10.736141069 +0000 UTC m=+0.060616887 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:39:10 np0005539504 podman[244398]: 2025-11-29 07:39:10.780534755 +0000 UTC m=+0.099390630 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.187 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "44fe8388-2086-4a84-9dce-66f7d619dea8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.188 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.427 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.567 187156 DEBUG nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.695 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.696 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.705 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.706 187156 INFO nova.compute.claims [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.848 187156 DEBUG nova.compute.provider_tree [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.870 187156 DEBUG nova.scheduler.client.report [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.891 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.892 187156 DEBUG nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.951 187156 DEBUG nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.952 187156 DEBUG nova.network.neutron [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:39:12 np0005539504 nova_compute[187152]: 2025-11-29 07:39:12.984 187156 INFO nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.008 187156 DEBUG nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.127 187156 DEBUG nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.130 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.130 187156 INFO nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Creating image(s)#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.131 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "/var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.132 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "/var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.133 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "/var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.133 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "ff8b78e36cdce7b25ad93cd697b9b0303aca57f3" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.134 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "ff8b78e36cdce7b25ad93cd697b9b0303aca57f3" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.376 187156 DEBUG nova.policy [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b921ad1454834bae9b706b9fa53948b3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '66de5f0713944c28a20250b7fbccc130', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:39:13 np0005539504 nova_compute[187152]: 2025-11-29 07:39:13.941 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:14 np0005539504 podman[244460]: 2025-11-29 07:39:14.71551242 +0000 UTC m=+0.051995093 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:39:14 np0005539504 podman[244461]: 2025-11-29 07:39:14.752170675 +0000 UTC m=+0.087864606 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 02:39:14 np0005539504 nova_compute[187152]: 2025-11-29 07:39:14.972 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:15 np0005539504 nova_compute[187152]: 2025-11-29 07:39:15.250 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:15 np0005539504 nova_compute[187152]: 2025-11-29 07:39:15.310 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3.part --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:15 np0005539504 nova_compute[187152]: 2025-11-29 07:39:15.311 187156 DEBUG nova.virt.images [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] 961d6e8f-910d-41d1-98db-67de0cbb45ed was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 02:39:15 np0005539504 nova_compute[187152]: 2025-11-29 07:39:15.312 187156 DEBUG nova.privsep.utils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:39:15 np0005539504 nova_compute[187152]: 2025-11-29 07:39:15.312 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3.part /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.429 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.446 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3.part /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3.converted" returned: 0 in 2.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.456 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.546 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3.converted --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.548 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "ff8b78e36cdce7b25ad93cd697b9b0303aca57f3" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 4.414s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.562 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.632 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.634 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "ff8b78e36cdce7b25ad93cd697b9b0303aca57f3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.636 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "ff8b78e36cdce7b25ad93cd697b9b0303aca57f3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.659 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.746 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:17 np0005539504 nova_compute[187152]: 2025-11-29 07:39:17.747 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3,backing_fmt=raw /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:18 np0005539504 nova_compute[187152]: 2025-11-29 07:39:18.673 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3,backing_fmt=raw /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk 1073741824" returned: 0 in 0.926s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:18 np0005539504 nova_compute[187152]: 2025-11-29 07:39:18.674 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "ff8b78e36cdce7b25ad93cd697b9b0303aca57f3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:18 np0005539504 nova_compute[187152]: 2025-11-29 07:39:18.675 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:18 np0005539504 nova_compute[187152]: 2025-11-29 07:39:18.772 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:18 np0005539504 nova_compute[187152]: 2025-11-29 07:39:18.775 187156 DEBUG nova.objects.instance [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lazy-loading 'migration_context' on Instance uuid 44fe8388-2086-4a84-9dce-66f7d619dea8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:39:18 np0005539504 nova_compute[187152]: 2025-11-29 07:39:18.811 187156 DEBUG nova.network.neutron [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Successfully created port: 1595d442-4805-406e-8a60-379227c324e1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:39:19 np0005539504 nova_compute[187152]: 2025-11-29 07:39:19.081 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:39:19 np0005539504 nova_compute[187152]: 2025-11-29 07:39:19.082 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Ensure instance console log exists: /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:39:19 np0005539504 nova_compute[187152]: 2025-11-29 07:39:19.082 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:19 np0005539504 nova_compute[187152]: 2025-11-29 07:39:19.083 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:19 np0005539504 nova_compute[187152]: 2025-11-29 07:39:19.083 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:19 np0005539504 nova_compute[187152]: 2025-11-29 07:39:19.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:19 np0005539504 nova_compute[187152]: 2025-11-29 07:39:19.974 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:20 np0005539504 nova_compute[187152]: 2025-11-29 07:39:20.906 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:20.907 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:39:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:20.908 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:39:22 np0005539504 nova_compute[187152]: 2025-11-29 07:39:22.431 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:22.986 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:22.987 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:22.988 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:23 np0005539504 podman[244535]: 2025-11-29 07:39:23.756953335 +0000 UTC m=+0.090473808 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute)
Nov 29 02:39:23 np0005539504 nova_compute[187152]: 2025-11-29 07:39:23.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:23 np0005539504 nova_compute[187152]: 2025-11-29 07:39:23.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:39:24 np0005539504 nova_compute[187152]: 2025-11-29 07:39:24.976 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:25 np0005539504 nova_compute[187152]: 2025-11-29 07:39:25.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.755 187156 DEBUG nova.network.neutron [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Successfully updated port: 1595d442-4805-406e-8a60-379227c324e1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.780 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.780 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquired lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.780 187156 DEBUG nova.network.neutron [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.926 187156 DEBUG nova.compute.manager [req-e6769d22-6a23-4f63-8f00-c2ebb2905c58 req-fa8af1d6-5334-4d83-8876-4255233e2833 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received event network-changed-1595d442-4805-406e-8a60-379227c324e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.927 187156 DEBUG nova.compute.manager [req-e6769d22-6a23-4f63-8f00-c2ebb2905c58 req-fa8af1d6-5334-4d83-8876-4255233e2833 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Refreshing instance network info cache due to event network-changed-1595d442-4805-406e-8a60-379227c324e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.927 187156 DEBUG oslo_concurrency.lockutils [req-e6769d22-6a23-4f63-8f00-c2ebb2905c58 req-fa8af1d6-5334-4d83-8876-4255233e2833 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.973 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.973 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.974 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:26 np0005539504 nova_compute[187152]: 2025-11-29 07:39:26.974 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.052 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.115 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.116 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.181 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.239 187156 DEBUG nova.network.neutron [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.409 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.411 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5557MB free_disk=72.97839736938477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.411 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.411 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.434 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.506 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 37b4277c-4278-4837-a7fa-e4ef827f1078 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.507 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 44fe8388-2086-4a84-9dce-66f7d619dea8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.507 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.507 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.590 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.612 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.637 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:39:27 np0005539504 nova_compute[187152]: 2025-11-29 07:39:27.638 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.150 187156 DEBUG nova.network.neutron [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Updating instance_info_cache with network_info: [{"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.178 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Releasing lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.178 187156 DEBUG nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Instance network_info: |[{"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.179 187156 DEBUG oslo_concurrency.lockutils [req-e6769d22-6a23-4f63-8f00-c2ebb2905c58 req-fa8af1d6-5334-4d83-8876-4255233e2833 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.179 187156 DEBUG nova.network.neutron [req-e6769d22-6a23-4f63-8f00-c2ebb2905c58 req-fa8af1d6-5334-4d83-8876-4255233e2833 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Refreshing network info cache for port 1595d442-4805-406e-8a60-379227c324e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.182 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Start _get_guest_xml network_info=[{"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='36fd044383a2a32cbb41b87b9ee46e19',container_format='bare',created_at=2025-11-29T07:38:42Z,direct_url=<?>,disk_format='qcow2',id=961d6e8f-910d-41d1-98db-67de0cbb45ed,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-989339328',owner='66de5f0713944c28a20250b7fbccc130',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-29T07:38:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '961d6e8f-910d-41d1-98db-67de0cbb45ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.188 187156 WARNING nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.194 187156 DEBUG nova.virt.libvirt.host [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.195 187156 DEBUG nova.virt.libvirt.host [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.206 187156 DEBUG nova.virt.libvirt.host [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.207 187156 DEBUG nova.virt.libvirt.host [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.209 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.209 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='36fd044383a2a32cbb41b87b9ee46e19',container_format='bare',created_at=2025-11-29T07:38:42Z,direct_url=<?>,disk_format='qcow2',id=961d6e8f-910d-41d1-98db-67de0cbb45ed,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-989339328',owner='66de5f0713944c28a20250b7fbccc130',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-29T07:38:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.210 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.210 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.211 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.211 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.211 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.212 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.212 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.213 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.213 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.213 187156 DEBUG nova.virt.hardware [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.219 187156 DEBUG nova.virt.libvirt.vif [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1929841105',display_name='tempest-TestSnapshotPattern-server-1929841105',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1929841105',id=156,image_ref='961d6e8f-910d-41d1-98db-67de0cbb45ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGeGkco1TqFqMrIhoEE73ZhDAmJJAptCCvJ7Qwh9Z7UI8dpQ1Egx0vm1HBJKnTjax+KcM/ISl5nzEEt0JBuJsVC3CZ+KIvxrsxwf2GDJ8t8s6c8ZHa76XQJmPLVjwVs32w==',key_name='tempest-TestSnapshotPattern-1847051545',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='66de5f0713944c28a20250b7fbccc130',ramdisk_id='',reservation_id='r-yskb080m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='37b4277c-4278-4837-a7fa-e4ef827f1078',image_min_disk='1',image_min_ram='0',image_owner_id='66de5f0713944c28a20250b7fbccc130',image_owner_project_name='tempest-TestSnapshotPattern-737443016',image_owner_user_name='tempest-TestSnapshotPattern-737443016-project-member',image_user_id='b921ad1454834bae9b706b9fa53948b3',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-737443016',owner_user_name='tempest-TestSnapshotPattern-737443016-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:39:13Z,user_data=None,user_id='b921ad1454834bae9b706b9fa53948b3',uuid=44fe8388-
2086-4a84-9dce-66f7d619dea8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.220 187156 DEBUG nova.network.os_vif_util [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converting VIF {"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.221 187156 DEBUG nova.network.os_vif_util [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:bd:a2,bridge_name='br-int',has_traffic_filtering=True,id=1595d442-4805-406e-8a60-379227c324e1,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1595d442-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.222 187156 DEBUG nova.objects.instance [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lazy-loading 'pci_devices' on Instance uuid 44fe8388-2086-4a84-9dce-66f7d619dea8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.253 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <uuid>44fe8388-2086-4a84-9dce-66f7d619dea8</uuid>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <name>instance-0000009c</name>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestSnapshotPattern-server-1929841105</nova:name>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:39:28</nova:creationTime>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:        <nova:user uuid="b921ad1454834bae9b706b9fa53948b3">tempest-TestSnapshotPattern-737443016-project-member</nova:user>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:        <nova:project uuid="66de5f0713944c28a20250b7fbccc130">tempest-TestSnapshotPattern-737443016</nova:project>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="961d6e8f-910d-41d1-98db-67de0cbb45ed"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:        <nova:port uuid="1595d442-4805-406e-8a60-379227c324e1">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <entry name="serial">44fe8388-2086-4a84-9dce-66f7d619dea8</entry>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <entry name="uuid">44fe8388-2086-4a84-9dce-66f7d619dea8</entry>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk.config"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:ec:bd:a2"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <target dev="tap1595d442-48"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/console.log" append="off"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:39:28 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:39:28 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:39:28 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:39:28 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.255 187156 DEBUG nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Preparing to wait for external event network-vif-plugged-1595d442-4805-406e-8a60-379227c324e1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.255 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.255 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.256 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.257 187156 DEBUG nova.virt.libvirt.vif [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1929841105',display_name='tempest-TestSnapshotPattern-server-1929841105',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1929841105',id=156,image_ref='961d6e8f-910d-41d1-98db-67de0cbb45ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGeGkco1TqFqMrIhoEE73ZhDAmJJAptCCvJ7Qwh9Z7UI8dpQ1Egx0vm1HBJKnTjax+KcM/ISl5nzEEt0JBuJsVC3CZ+KIvxrsxwf2GDJ8t8s6c8ZHa76XQJmPLVjwVs32w==',key_name='tempest-TestSnapshotPattern-1847051545',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='66de5f0713944c28a20250b7fbccc130',ramdisk_id='',reservation_id='r-yskb080m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='37b4277c-4278-4837-a7fa-e4ef827f1078',image_min_disk='1',image_min_ram='0',image_owner_id='66de5f0713944c28a20250b7fbccc130',image_owner_project_name='tempest-TestSnapshotPattern-737443016',image_owner_user_name='tempest-TestSnapshotPattern-737443016-project-member',image_user_id='b921ad1454834bae9b706b9fa53948b3',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-737443016',owner_user_name='tempest-TestSnapshotPattern-737443016-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:39:13Z,user_data=None,user_id='b921ad1454834bae9b706b9fa53948b3',uuid
=44fe8388-2086-4a84-9dce-66f7d619dea8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.257 187156 DEBUG nova.network.os_vif_util [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converting VIF {"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.258 187156 DEBUG nova.network.os_vif_util [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:bd:a2,bridge_name='br-int',has_traffic_filtering=True,id=1595d442-4805-406e-8a60-379227c324e1,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1595d442-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.258 187156 DEBUG os_vif [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:bd:a2,bridge_name='br-int',has_traffic_filtering=True,id=1595d442-4805-406e-8a60-379227c324e1,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1595d442-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.259 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.260 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.260 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.269 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.269 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1595d442-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.270 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1595d442-48, col_values=(('external_ids', {'iface-id': '1595d442-4805-406e-8a60-379227c324e1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:bd:a2', 'vm-uuid': '44fe8388-2086-4a84-9dce-66f7d619dea8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.271 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:28 np0005539504 NetworkManager[55210]: <info>  [1764401968.2731] manager: (tap1595d442-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.274 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.281 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.282 187156 INFO os_vif [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:bd:a2,bridge_name='br-int',has_traffic_filtering=True,id=1595d442-4805-406e-8a60-379227c324e1,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1595d442-48')#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.357 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.357 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.358 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] No VIF found with MAC fa:16:3e:ec:bd:a2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.358 187156 INFO nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Using config drive#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.640 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.642 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.642 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.671 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.707 187156 INFO nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Creating config drive at /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk.config#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.715 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplh3wx2ea execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.828 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.828 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.829 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.829 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 37b4277c-4278-4837-a7fa-e4ef827f1078 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.844 187156 DEBUG oslo_concurrency.processutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplh3wx2ea" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:39:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:28.910 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:28 np0005539504 NetworkManager[55210]: <info>  [1764401968.9391] manager: (tap1595d442-48): new Tun device (/org/freedesktop/NetworkManager/Devices/280)
Nov 29 02:39:28 np0005539504 kernel: tap1595d442-48: entered promiscuous mode
Nov 29 02:39:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:28Z|00631|binding|INFO|Claiming lport 1595d442-4805-406e-8a60-379227c324e1 for this chassis.
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.941 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:28Z|00632|binding|INFO|1595d442-4805-406e-8a60-379227c324e1: Claiming fa:16:3e:ec:bd:a2 10.100.0.7
Nov 29 02:39:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:28Z|00633|binding|INFO|Setting lport 1595d442-4805-406e-8a60-379227c324e1 ovn-installed in OVS
Nov 29 02:39:28 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:28Z|00634|binding|INFO|Setting lport 1595d442-4805-406e-8a60-379227c324e1 up in Southbound
Nov 29 02:39:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:28.958 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:bd:a2 10.100.0.7'], port_security=['fa:16:3e:ec:bd:a2 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '44fe8388-2086-4a84-9dce-66f7d619dea8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19f12258-2ca1-4bcd-90a1-babd862276cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '66de5f0713944c28a20250b7fbccc130', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd9016a5f-cca6-4f8a-be77-5b7f8d76145d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e91bbf94-b791-4827-92dd-47385d8f6f11, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=1595d442-4805-406e-8a60-379227c324e1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.960 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:28.962 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 1595d442-4805-406e-8a60-379227c324e1 in datapath 19f12258-2ca1-4bcd-90a1-babd862276cb bound to our chassis#033[00m
Nov 29 02:39:28 np0005539504 nova_compute[187152]: 2025-11-29 07:39:28.966 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:28.968 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19f12258-2ca1-4bcd-90a1-babd862276cb#033[00m
Nov 29 02:39:28 np0005539504 systemd-machined[153423]: New machine qemu-82-instance-0000009c.
Nov 29 02:39:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:28.988 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ef68d7dc-5c10-4822-bbac-1566138673a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:28 np0005539504 systemd-udevd[244585]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:39:29 np0005539504 systemd[1]: Started Virtual Machine qemu-82-instance-0000009c.
Nov 29 02:39:29 np0005539504 NetworkManager[55210]: <info>  [1764401969.0058] device (tap1595d442-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:39:29 np0005539504 NetworkManager[55210]: <info>  [1764401969.0069] device (tap1595d442-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.028 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[57147520-80a5-4b7e-a418-99b870b1976b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.032 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5da185f2-0ab5-4df6-b0de-ddf82748b9c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.065 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[78ce9aeb-7cff-4329-9cd8-de9ac3bb233f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.084 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc51d9a-bda8-4d70-8279-b11784f0630f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19f12258-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:47:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 735890, 'reachable_time': 35229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244597, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.099 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6c116fb3-763a-4f98-8095-440ded5c4a8d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap19f12258-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 735904, 'tstamp': 735904}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244598, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap19f12258-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 735907, 'tstamp': 735907}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244598, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.101 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19f12258-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.103 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.122 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.123 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19f12258-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.124 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.124 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19f12258-20, col_values=(('external_ids', {'iface-id': 'be18324a-44e1-4916-a63f-b1a7efeb6fb9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:39:29.124 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.346 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401969.3454602, 44fe8388-2086-4a84-9dce-66f7d619dea8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.347 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] VM Started (Lifecycle Event)#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.372 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.380 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401969.345853, 44fe8388-2086-4a84-9dce-66f7d619dea8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.380 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.561 187156 DEBUG nova.compute.manager [req-8a18b093-cbb4-440d-9221-ff2868712674 req-d1e34ae8-95d0-4595-a363-76e13590b2b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received event network-vif-plugged-1595d442-4805-406e-8a60-379227c324e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.561 187156 DEBUG oslo_concurrency.lockutils [req-8a18b093-cbb4-440d-9221-ff2868712674 req-d1e34ae8-95d0-4595-a363-76e13590b2b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.562 187156 DEBUG oslo_concurrency.lockutils [req-8a18b093-cbb4-440d-9221-ff2868712674 req-d1e34ae8-95d0-4595-a363-76e13590b2b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.563 187156 DEBUG oslo_concurrency.lockutils [req-8a18b093-cbb4-440d-9221-ff2868712674 req-d1e34ae8-95d0-4595-a363-76e13590b2b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.564 187156 DEBUG nova.compute.manager [req-8a18b093-cbb4-440d-9221-ff2868712674 req-d1e34ae8-95d0-4595-a363-76e13590b2b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Processing event network-vif-plugged-1595d442-4805-406e-8a60-379227c324e1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.565 187156 DEBUG nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.570 187156 DEBUG nova.virt.libvirt.driver [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.574 187156 INFO nova.virt.libvirt.driver [-] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Instance spawned successfully.#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.575 187156 INFO nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Took 16.45 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.575 187156 DEBUG nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.584 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.587 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764401969.569688, 44fe8388-2086-4a84-9dce-66f7d619dea8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.587 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.641 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.646 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.723 187156 INFO nova.compute.manager [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Took 17.07 seconds to build instance.#033[00m
Nov 29 02:39:29 np0005539504 nova_compute[187152]: 2025-11-29 07:39:29.762 187156 DEBUG oslo_concurrency.lockutils [None req-54aff0a2-de3f-4828-8198-74ee979c8065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:30 np0005539504 nova_compute[187152]: 2025-11-29 07:39:30.000 187156 DEBUG nova.network.neutron [req-e6769d22-6a23-4f63-8f00-c2ebb2905c58 req-fa8af1d6-5334-4d83-8876-4255233e2833 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Updated VIF entry in instance network info cache for port 1595d442-4805-406e-8a60-379227c324e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:39:30 np0005539504 nova_compute[187152]: 2025-11-29 07:39:30.001 187156 DEBUG nova.network.neutron [req-e6769d22-6a23-4f63-8f00-c2ebb2905c58 req-fa8af1d6-5334-4d83-8876-4255233e2833 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Updating instance_info_cache with network_info: [{"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:39:30 np0005539504 podman[244607]: 2025-11-29 07:39:30.805308636 +0000 UTC m=+0.101802786 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 02:39:31 np0005539504 nova_compute[187152]: 2025-11-29 07:39:31.052 187156 DEBUG oslo_concurrency.lockutils [req-e6769d22-6a23-4f63-8f00-c2ebb2905c58 req-fa8af1d6-5334-4d83-8876-4255233e2833 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.153 187156 DEBUG nova.compute.manager [req-7cecc440-ea31-44af-a878-cd61c3135604 req-e4d9ba48-8bb2-4877-a854-bbc384ec4a52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received event network-vif-plugged-1595d442-4805-406e-8a60-379227c324e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.154 187156 DEBUG oslo_concurrency.lockutils [req-7cecc440-ea31-44af-a878-cd61c3135604 req-e4d9ba48-8bb2-4877-a854-bbc384ec4a52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.154 187156 DEBUG oslo_concurrency.lockutils [req-7cecc440-ea31-44af-a878-cd61c3135604 req-e4d9ba48-8bb2-4877-a854-bbc384ec4a52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.154 187156 DEBUG oslo_concurrency.lockutils [req-7cecc440-ea31-44af-a878-cd61c3135604 req-e4d9ba48-8bb2-4877-a854-bbc384ec4a52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.155 187156 DEBUG nova.compute.manager [req-7cecc440-ea31-44af-a878-cd61c3135604 req-e4d9ba48-8bb2-4877-a854-bbc384ec4a52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] No waiting events found dispatching network-vif-plugged-1595d442-4805-406e-8a60-379227c324e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.155 187156 WARNING nova.compute.manager [req-7cecc440-ea31-44af-a878-cd61c3135604 req-e4d9ba48-8bb2-4877-a854-bbc384ec4a52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received unexpected event network-vif-plugged-1595d442-4805-406e-8a60-379227c324e1 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.437 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.683 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updating instance_info_cache with network_info: [{"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.745 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.746 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:39:32 np0005539504 nova_compute[187152]: 2025-11-29 07:39:32.746 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:33 np0005539504 nova_compute[187152]: 2025-11-29 07:39:33.276 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:36 np0005539504 nova_compute[187152]: 2025-11-29 07:39:36.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:37 np0005539504 nova_compute[187152]: 2025-11-29 07:39:37.439 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:38 np0005539504 nova_compute[187152]: 2025-11-29 07:39:38.280 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:41 np0005539504 podman[244628]: 2025-11-29 07:39:41.766714335 +0000 UTC m=+0.100496420 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:39:41 np0005539504 podman[244630]: 2025-11-29 07:39:41.774842705 +0000 UTC m=+0.097206240 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:39:41 np0005539504 podman[244629]: 2025-11-29 07:39:41.80154211 +0000 UTC m=+0.122699642 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:39:42 np0005539504 nova_compute[187152]: 2025-11-29 07:39:42.441 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:43 np0005539504 nova_compute[187152]: 2025-11-29 07:39:43.282 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:43 np0005539504 nova_compute[187152]: 2025-11-29 07:39:43.945 187156 DEBUG nova.compute.manager [req-55f2b24e-caad-4546-bd08-652fc3d5982f req-1217602b-cdd8-44bb-97b8-60bf196ea8aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received event network-changed-1595d442-4805-406e-8a60-379227c324e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:43 np0005539504 nova_compute[187152]: 2025-11-29 07:39:43.946 187156 DEBUG nova.compute.manager [req-55f2b24e-caad-4546-bd08-652fc3d5982f req-1217602b-cdd8-44bb-97b8-60bf196ea8aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Refreshing instance network info cache due to event network-changed-1595d442-4805-406e-8a60-379227c324e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:39:43 np0005539504 nova_compute[187152]: 2025-11-29 07:39:43.947 187156 DEBUG oslo_concurrency.lockutils [req-55f2b24e-caad-4546-bd08-652fc3d5982f req-1217602b-cdd8-44bb-97b8-60bf196ea8aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:39:43 np0005539504 nova_compute[187152]: 2025-11-29 07:39:43.947 187156 DEBUG oslo_concurrency.lockutils [req-55f2b24e-caad-4546-bd08-652fc3d5982f req-1217602b-cdd8-44bb-97b8-60bf196ea8aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:39:43 np0005539504 nova_compute[187152]: 2025-11-29 07:39:43.948 187156 DEBUG nova.network.neutron [req-55f2b24e-caad-4546-bd08-652fc3d5982f req-1217602b-cdd8-44bb-97b8-60bf196ea8aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Refreshing network info cache for port 1595d442-4805-406e-8a60-379227c324e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:39:45 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:45Z|00070|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.7
Nov 29 02:39:45 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:45Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:ec:bd:a2 10.100.0.7
Nov 29 02:39:45 np0005539504 podman[244705]: 2025-11-29 07:39:45.718752573 +0000 UTC m=+0.053640617 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:39:45 np0005539504 nova_compute[187152]: 2025-11-29 07:39:45.733 187156 DEBUG nova.network.neutron [req-55f2b24e-caad-4546-bd08-652fc3d5982f req-1217602b-cdd8-44bb-97b8-60bf196ea8aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Updated VIF entry in instance network info cache for port 1595d442-4805-406e-8a60-379227c324e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:39:45 np0005539504 nova_compute[187152]: 2025-11-29 07:39:45.733 187156 DEBUG nova.network.neutron [req-55f2b24e-caad-4546-bd08-652fc3d5982f req-1217602b-cdd8-44bb-97b8-60bf196ea8aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Updating instance_info_cache with network_info: [{"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:39:45 np0005539504 podman[244706]: 2025-11-29 07:39:45.760324863 +0000 UTC m=+0.085775160 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 02:39:45 np0005539504 nova_compute[187152]: 2025-11-29 07:39:45.866 187156 DEBUG oslo_concurrency.lockutils [req-55f2b24e-caad-4546-bd08-652fc3d5982f req-1217602b-cdd8-44bb-97b8-60bf196ea8aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:39:47 np0005539504 nova_compute[187152]: 2025-11-29 07:39:47.444 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:48 np0005539504 nova_compute[187152]: 2025-11-29 07:39:48.285 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:49 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:49Z|00072|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.12 does not match offer 10.100.0.7
Nov 29 02:39:49 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:49Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:ec:bd:a2 10.100.0.7
Nov 29 02:39:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:50Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ec:bd:a2 10.100.0.7
Nov 29 02:39:50 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:50Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ec:bd:a2 10.100.0.7
Nov 29 02:39:52 np0005539504 nova_compute[187152]: 2025-11-29 07:39:52.447 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:53 np0005539504 nova_compute[187152]: 2025-11-29 07:39:53.288 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:39:54Z|00635|binding|INFO|Releasing lport be18324a-44e1-4916-a63f-b1a7efeb6fb9 from this chassis (sb_readonly=0)
Nov 29 02:39:54 np0005539504 nova_compute[187152]: 2025-11-29 07:39:54.536 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:54 np0005539504 podman[244756]: 2025-11-29 07:39:54.747993562 +0000 UTC m=+0.088530034 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:39:57 np0005539504 nova_compute[187152]: 2025-11-29 07:39:57.448 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:58 np0005539504 nova_compute[187152]: 2025-11-29 07:39:58.291 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:01 np0005539504 nova_compute[187152]: 2025-11-29 07:40:01.399 187156 DEBUG nova.compute.manager [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:40:01 np0005539504 podman[244780]: 2025-11-29 07:40:01.738249194 +0000 UTC m=+0.071507524 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:40:01 np0005539504 nova_compute[187152]: 2025-11-29 07:40:01.994 187156 INFO nova.compute.manager [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] instance snapshotting#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.399 187156 INFO nova.virt.libvirt.driver [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Beginning live snapshot process#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.451 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:02 np0005539504 virtqemud[186569]: invalid argument: disk vda does not have an active block job
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.614 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.698 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk --force-share --output=json -f qcow2" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.699 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.767 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8/disk --force-share --output=json -f qcow2" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.792 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.866 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.867 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpflnmga3c/888736aceda24b288e0a9996814b0508.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.908 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpflnmga3c/888736aceda24b288e0a9996814b0508.delta 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.910 187156 INFO nova.virt.libvirt.driver [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 29 02:40:02 np0005539504 nova_compute[187152]: 2025-11-29 07:40:02.988 187156 DEBUG nova.virt.libvirt.guest [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] COPY block job progress, current cursor: 0 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:40:03 np0005539504 nova_compute[187152]: 2025-11-29 07:40:03.295 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:03 np0005539504 nova_compute[187152]: 2025-11-29 07:40:03.493 187156 DEBUG nova.virt.libvirt.guest [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] COPY block job progress, current cursor: 1114112 final cursor: 1114112 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 02:40:03 np0005539504 nova_compute[187152]: 2025-11-29 07:40:03.496 187156 INFO nova.virt.libvirt.driver [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 29 02:40:03 np0005539504 nova_compute[187152]: 2025-11-29 07:40:03.544 187156 DEBUG nova.privsep.utils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:40:03 np0005539504 nova_compute[187152]: 2025-11-29 07:40:03.545 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpflnmga3c/888736aceda24b288e0a9996814b0508.delta /var/lib/nova/instances/snapshots/tmpflnmga3c/888736aceda24b288e0a9996814b0508 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:04 np0005539504 nova_compute[187152]: 2025-11-29 07:40:04.062 187156 DEBUG oslo_concurrency.processutils [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpflnmga3c/888736aceda24b288e0a9996814b0508.delta /var/lib/nova/instances/snapshots/tmpflnmga3c/888736aceda24b288e0a9996814b0508" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:04 np0005539504 nova_compute[187152]: 2025-11-29 07:40:04.063 187156 INFO nova.virt.libvirt.driver [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:40:06 np0005539504 nova_compute[187152]: 2025-11-29 07:40:06.807 187156 INFO nova.virt.libvirt.driver [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Snapshot image upload complete#033[00m
Nov 29 02:40:06 np0005539504 nova_compute[187152]: 2025-11-29 07:40:06.808 187156 INFO nova.compute.manager [None req-4e08d32f-4585-40f4-8c02-f69509d31065 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Took 4.68 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 02:40:07 np0005539504 nova_compute[187152]: 2025-11-29 07:40:07.453 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:08 np0005539504 nova_compute[187152]: 2025-11-29 07:40:08.297 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.831 187156 DEBUG nova.compute.manager [req-8769b363-04da-42ba-9a78-33c7546736b2 req-8f15ccc6-78dc-4303-a0fc-683ae1c61719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received event network-changed-1595d442-4805-406e-8a60-379227c324e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.832 187156 DEBUG nova.compute.manager [req-8769b363-04da-42ba-9a78-33c7546736b2 req-8f15ccc6-78dc-4303-a0fc-683ae1c61719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Refreshing instance network info cache due to event network-changed-1595d442-4805-406e-8a60-379227c324e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.833 187156 DEBUG oslo_concurrency.lockutils [req-8769b363-04da-42ba-9a78-33c7546736b2 req-8f15ccc6-78dc-4303-a0fc-683ae1c61719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.833 187156 DEBUG oslo_concurrency.lockutils [req-8769b363-04da-42ba-9a78-33c7546736b2 req-8f15ccc6-78dc-4303-a0fc-683ae1c61719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.834 187156 DEBUG nova.network.neutron [req-8769b363-04da-42ba-9a78-33c7546736b2 req-8f15ccc6-78dc-4303-a0fc-683ae1c61719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Refreshing network info cache for port 1595d442-4805-406e-8a60-379227c324e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.906 187156 DEBUG oslo_concurrency.lockutils [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "44fe8388-2086-4a84-9dce-66f7d619dea8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.907 187156 DEBUG oslo_concurrency.lockutils [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.907 187156 DEBUG oslo_concurrency.lockutils [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.908 187156 DEBUG oslo_concurrency.lockutils [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.908 187156 DEBUG oslo_concurrency.lockutils [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.924 187156 INFO nova.compute.manager [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Terminating instance#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.937 187156 DEBUG nova.compute.manager [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:09 np0005539504 kernel: tap1595d442-48 (unregistering): left promiscuous mode
Nov 29 02:40:09 np0005539504 NetworkManager[55210]: <info>  [1764402009.9686] device (tap1595d442-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.979 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:40:09Z|00636|binding|INFO|Releasing lport 1595d442-4805-406e-8a60-379227c324e1 from this chassis (sb_readonly=0)
Nov 29 02:40:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:40:09Z|00637|binding|INFO|Setting lport 1595d442-4805-406e-8a60-379227c324e1 down in Southbound
Nov 29 02:40:09 np0005539504 ovn_controller[95182]: 2025-11-29T07:40:09Z|00638|binding|INFO|Removing iface tap1595d442-48 ovn-installed in OVS
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.982 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:09.986 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:bd:a2 10.100.0.7'], port_security=['fa:16:3e:ec:bd:a2 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '44fe8388-2086-4a84-9dce-66f7d619dea8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19f12258-2ca1-4bcd-90a1-babd862276cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '66de5f0713944c28a20250b7fbccc130', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd9016a5f-cca6-4f8a-be77-5b7f8d76145d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e91bbf94-b791-4827-92dd-47385d8f6f11, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=1595d442-4805-406e-8a60-379227c324e1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:09.988 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 1595d442-4805-406e-8a60-379227c324e1 in datapath 19f12258-2ca1-4bcd-90a1-babd862276cb unbound from our chassis#033[00m
Nov 29 02:40:09 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:09.990 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19f12258-2ca1-4bcd-90a1-babd862276cb#033[00m
Nov 29 02:40:09 np0005539504 nova_compute[187152]: 2025-11-29 07:40:09.997 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.010 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[69fec51c-caf0-405a-90ef-ffba4292084b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:10 np0005539504 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Nov 29 02:40:10 np0005539504 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d0000009c.scope: Consumed 15.464s CPU time.
Nov 29 02:40:10 np0005539504 systemd-machined[153423]: Machine qemu-82-instance-0000009c terminated.
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.055 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5a527351-65ad-4ce0-82eb-5810d992cee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.060 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[339e26a0-56cf-4605-9bec-511b10491a12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.097 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[9e14ed02-2762-49a4-b9fb-63ef6dd7e076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.123 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0f3bda-0394-4409-bfb2-9673bd0bbc59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19f12258-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:47:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 189], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 735890, 'reachable_time': 35229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244839, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.143 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cac61529-23d5-4d12-afec-4e6c67c9fed5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap19f12258-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 735904, 'tstamp': 735904}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244840, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap19f12258-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 735907, 'tstamp': 735907}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244840, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.145 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19f12258-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.147 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.153 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.154 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19f12258-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.155 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.155 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19f12258-20, col_values=(('external_ids', {'iface-id': 'be18324a-44e1-4916-a63f-b1a7efeb6fb9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:40:10 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:10.155 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:40:10 np0005539504 kernel: tap1595d442-48: entered promiscuous mode
Nov 29 02:40:10 np0005539504 kernel: tap1595d442-48 (unregistering): left promiscuous mode
Nov 29 02:40:10 np0005539504 NetworkManager[55210]: <info>  [1764402010.1693] manager: (tap1595d442-48): new Tun device (/org/freedesktop/NetworkManager/Devices/281)
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.173 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.222 187156 INFO nova.virt.libvirt.driver [-] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Instance destroyed successfully.#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.222 187156 DEBUG nova.objects.instance [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lazy-loading 'resources' on Instance uuid 44fe8388-2086-4a84-9dce-66f7d619dea8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.244 187156 DEBUG nova.virt.libvirt.vif [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:38:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1929841105',display_name='tempest-TestSnapshotPattern-server-1929841105',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1929841105',id=156,image_ref='961d6e8f-910d-41d1-98db-67de0cbb45ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGeGkco1TqFqMrIhoEE73ZhDAmJJAptCCvJ7Qwh9Z7UI8dpQ1Egx0vm1HBJKnTjax+KcM/ISl5nzEEt0JBuJsVC3CZ+KIvxrsxwf2GDJ8t8s6c8ZHa76XQJmPLVjwVs32w==',key_name='tempest-TestSnapshotPattern-1847051545',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:39:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='66de5f0713944c28a20250b7fbccc130',ramdisk_id='',reservation_id='r-yskb080m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='37b4277c-4278-4837-a7fa-e4ef827f1078',image_min_disk='1',image_min_ram='0',image_owner_id='66de5f0713944c28a20250b7fbccc130',image_owner_project_name='tempest-TestSnapshotPattern-737443016',image_owner_user_name='tempest-TestSnapshotPattern-737443016-project-member',image_user_id='b921ad1454834bae9b706b9fa53948b3',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-737443016',owner_user_name='tempest-TestSnapshotPattern-737443016-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:40:06Z,user_data=None,user_id='b921ad1454834bae9b706b9fa53948b3',uuid=44fe8388-2086-4a84-9dce-66f7d619dea8,vcpu_model=
<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.244 187156 DEBUG nova.network.os_vif_util [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converting VIF {"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.245 187156 DEBUG nova.network.os_vif_util [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ec:bd:a2,bridge_name='br-int',has_traffic_filtering=True,id=1595d442-4805-406e-8a60-379227c324e1,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1595d442-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.245 187156 DEBUG os_vif [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ec:bd:a2,bridge_name='br-int',has_traffic_filtering=True,id=1595d442-4805-406e-8a60-379227c324e1,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1595d442-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.248 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.248 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1595d442-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.250 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.252 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.255 187156 INFO os_vif [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ec:bd:a2,bridge_name='br-int',has_traffic_filtering=True,id=1595d442-4805-406e-8a60-379227c324e1,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1595d442-48')#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.256 187156 INFO nova.virt.libvirt.driver [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Deleting instance files /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8_del#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.257 187156 INFO nova.virt.libvirt.driver [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Deletion of /var/lib/nova/instances/44fe8388-2086-4a84-9dce-66f7d619dea8_del complete#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.324 187156 INFO nova.compute.manager [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.324 187156 DEBUG oslo.service.loopingcall [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.324 187156 DEBUG nova.compute.manager [-] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.325 187156 DEBUG nova.network.neutron [-] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.336 187156 DEBUG nova.compute.manager [req-683841f8-67f9-4990-abee-868921b4f79c req-9d332d35-3712-4a77-8430-406ca2db2aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received event network-vif-unplugged-1595d442-4805-406e-8a60-379227c324e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.337 187156 DEBUG oslo_concurrency.lockutils [req-683841f8-67f9-4990-abee-868921b4f79c req-9d332d35-3712-4a77-8430-406ca2db2aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.337 187156 DEBUG oslo_concurrency.lockutils [req-683841f8-67f9-4990-abee-868921b4f79c req-9d332d35-3712-4a77-8430-406ca2db2aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.337 187156 DEBUG oslo_concurrency.lockutils [req-683841f8-67f9-4990-abee-868921b4f79c req-9d332d35-3712-4a77-8430-406ca2db2aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.338 187156 DEBUG nova.compute.manager [req-683841f8-67f9-4990-abee-868921b4f79c req-9d332d35-3712-4a77-8430-406ca2db2aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] No waiting events found dispatching network-vif-unplugged-1595d442-4805-406e-8a60-379227c324e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:40:10 np0005539504 nova_compute[187152]: 2025-11-29 07:40:10.338 187156 DEBUG nova.compute.manager [req-683841f8-67f9-4990-abee-868921b4f79c req-9d332d35-3712-4a77-8430-406ca2db2aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received event network-vif-unplugged-1595d442-4805-406e-8a60-379227c324e1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:40:11 np0005539504 nova_compute[187152]: 2025-11-29 07:40:11.104 187156 DEBUG nova.network.neutron [-] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:40:11 np0005539504 nova_compute[187152]: 2025-11-29 07:40:11.486 187156 INFO nova.compute.manager [-] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Took 1.16 seconds to deallocate network for instance.#033[00m
Nov 29 02:40:11 np0005539504 nova_compute[187152]: 2025-11-29 07:40:11.612 187156 DEBUG nova.network.neutron [req-8769b363-04da-42ba-9a78-33c7546736b2 req-8f15ccc6-78dc-4303-a0fc-683ae1c61719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Updated VIF entry in instance network info cache for port 1595d442-4805-406e-8a60-379227c324e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:40:11 np0005539504 nova_compute[187152]: 2025-11-29 07:40:11.613 187156 DEBUG nova.network.neutron [req-8769b363-04da-42ba-9a78-33c7546736b2 req-8f15ccc6-78dc-4303-a0fc-683ae1c61719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Updating instance_info_cache with network_info: [{"id": "1595d442-4805-406e-8a60-379227c324e1", "address": "fa:16:3e:ec:bd:a2", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1595d442-48", "ovs_interfaceid": "1595d442-4805-406e-8a60-379227c324e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:40:11 np0005539504 nova_compute[187152]: 2025-11-29 07:40:11.636 187156 DEBUG oslo_concurrency.lockutils [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:11 np0005539504 nova_compute[187152]: 2025-11-29 07:40:11.637 187156 DEBUG oslo_concurrency.lockutils [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:11 np0005539504 nova_compute[187152]: 2025-11-29 07:40:11.645 187156 DEBUG oslo_concurrency.lockutils [req-8769b363-04da-42ba-9a78-33c7546736b2 req-8f15ccc6-78dc-4303-a0fc-683ae1c61719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-44fe8388-2086-4a84-9dce-66f7d619dea8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.300 187156 DEBUG nova.compute.manager [req-0e6daf5d-2876-4f29-9d60-f3f032753092 req-80aad1a7-9fa9-401c-990a-d815092f5631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received event network-vif-deleted-1595d442-4805-406e-8a60-379227c324e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.301 187156 INFO nova.compute.manager [req-0e6daf5d-2876-4f29-9d60-f3f032753092 req-80aad1a7-9fa9-401c-990a-d815092f5631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Neutron deleted interface 1595d442-4805-406e-8a60-379227c324e1; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.301 187156 DEBUG nova.network.neutron [req-0e6daf5d-2876-4f29-9d60-f3f032753092 req-80aad1a7-9fa9-401c-990a-d815092f5631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.329 187156 DEBUG nova.compute.manager [req-0e6daf5d-2876-4f29-9d60-f3f032753092 req-80aad1a7-9fa9-401c-990a-d815092f5631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Detach interface failed, port_id=1595d442-4805-406e-8a60-379227c324e1, reason: Instance 44fe8388-2086-4a84-9dce-66f7d619dea8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.455 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.461 187156 DEBUG nova.compute.manager [req-936b2ed0-3283-4a53-8a6a-a946de47fbf6 req-b1daf699-029d-439a-a56b-5c829b599fa2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received event network-vif-plugged-1595d442-4805-406e-8a60-379227c324e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.461 187156 DEBUG oslo_concurrency.lockutils [req-936b2ed0-3283-4a53-8a6a-a946de47fbf6 req-b1daf699-029d-439a-a56b-5c829b599fa2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.461 187156 DEBUG oslo_concurrency.lockutils [req-936b2ed0-3283-4a53-8a6a-a946de47fbf6 req-b1daf699-029d-439a-a56b-5c829b599fa2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.462 187156 DEBUG oslo_concurrency.lockutils [req-936b2ed0-3283-4a53-8a6a-a946de47fbf6 req-b1daf699-029d-439a-a56b-5c829b599fa2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.462 187156 DEBUG nova.compute.manager [req-936b2ed0-3283-4a53-8a6a-a946de47fbf6 req-b1daf699-029d-439a-a56b-5c829b599fa2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] No waiting events found dispatching network-vif-plugged-1595d442-4805-406e-8a60-379227c324e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.462 187156 WARNING nova.compute.manager [req-936b2ed0-3283-4a53-8a6a-a946de47fbf6 req-b1daf699-029d-439a-a56b-5c829b599fa2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Received unexpected event network-vif-plugged-1595d442-4805-406e-8a60-379227c324e1 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.563 187156 DEBUG nova.compute.provider_tree [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.583 187156 DEBUG nova.scheduler.client.report [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.620 187156 DEBUG oslo_concurrency.lockutils [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.983s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.653 187156 INFO nova.scheduler.client.report [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Deleted allocations for instance 44fe8388-2086-4a84-9dce-66f7d619dea8#033[00m
Nov 29 02:40:12 np0005539504 podman[244857]: 2025-11-29 07:40:12.680764411 +0000 UTC m=+0.077655730 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:40:12 np0005539504 podman[244858]: 2025-11-29 07:40:12.683357801 +0000 UTC m=+0.073041675 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, architecture=x86_64)
Nov 29 02:40:12 np0005539504 podman[244859]: 2025-11-29 07:40:12.72495292 +0000 UTC m=+0.106896293 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:40:12 np0005539504 nova_compute[187152]: 2025-11-29 07:40:12.733 187156 DEBUG oslo_concurrency.lockutils [None req-6b3dbbde-9f97-4b88-82f5-a3ed0229c090 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "44fe8388-2086-4a84-9dce-66f7d619dea8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:13 np0005539504 nova_compute[187152]: 2025-11-29 07:40:13.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:14.786 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:14 np0005539504 nova_compute[187152]: 2025-11-29 07:40:14.786 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:14.788 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:40:15 np0005539504 nova_compute[187152]: 2025-11-29 07:40:15.251 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:15 np0005539504 nova_compute[187152]: 2025-11-29 07:40:15.770 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.331 187156 DEBUG oslo_concurrency.lockutils [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "37b4277c-4278-4837-a7fa-e4ef827f1078" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.331 187156 DEBUG oslo_concurrency.lockutils [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.332 187156 DEBUG oslo_concurrency.lockutils [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.332 187156 DEBUG oslo_concurrency.lockutils [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.332 187156 DEBUG oslo_concurrency.lockutils [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.390 187156 INFO nova.compute.manager [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Terminating instance#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.560 187156 DEBUG nova.compute.manager [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:40:16 np0005539504 kernel: tap541e9918-b1 (unregistering): left promiscuous mode
Nov 29 02:40:16 np0005539504 NetworkManager[55210]: <info>  [1764402016.6440] device (tap541e9918-b1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:40:16 np0005539504 ovn_controller[95182]: 2025-11-29T07:40:16Z|00639|binding|INFO|Releasing lport 541e9918-b113-477f-b173-6e0844275c91 from this chassis (sb_readonly=0)
Nov 29 02:40:16 np0005539504 ovn_controller[95182]: 2025-11-29T07:40:16Z|00640|binding|INFO|Setting lport 541e9918-b113-477f-b173-6e0844275c91 down in Southbound
Nov 29 02:40:16 np0005539504 ovn_controller[95182]: 2025-11-29T07:40:16Z|00641|binding|INFO|Removing iface tap541e9918-b1 ovn-installed in OVS
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.652 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.669 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:16 np0005539504 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000099.scope: Deactivated successfully.
Nov 29 02:40:16 np0005539504 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000099.scope: Consumed 18.759s CPU time.
Nov 29 02:40:16 np0005539504 systemd-machined[153423]: Machine qemu-81-instance-00000099 terminated.
Nov 29 02:40:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:16.723 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:d8:ce 10.100.0.12'], port_security=['fa:16:3e:d4:d8:ce 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '37b4277c-4278-4837-a7fa-e4ef827f1078', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19f12258-2ca1-4bcd-90a1-babd862276cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '66de5f0713944c28a20250b7fbccc130', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd9016a5f-cca6-4f8a-be77-5b7f8d76145d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e91bbf94-b791-4827-92dd-47385d8f6f11, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=541e9918-b113-477f-b173-6e0844275c91) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:16.725 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 541e9918-b113-477f-b173-6e0844275c91 in datapath 19f12258-2ca1-4bcd-90a1-babd862276cb unbound from our chassis#033[00m
Nov 29 02:40:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:16.727 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19f12258-2ca1-4bcd-90a1-babd862276cb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:40:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:16.730 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[adaa45e0-c06f-4d7d-a1ff-2923fff39e70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:16.731 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb namespace which is not needed anymore#033[00m
Nov 29 02:40:16 np0005539504 podman[244919]: 2025-11-29 07:40:16.746905336 +0000 UTC m=+0.081320309 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.761 187156 DEBUG nova.compute.manager [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received event network-changed-541e9918-b113-477f-b173-6e0844275c91 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.762 187156 DEBUG nova.compute.manager [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Refreshing instance network info cache due to event network-changed-541e9918-b113-477f-b173-6e0844275c91. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.762 187156 DEBUG oslo_concurrency.lockutils [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.762 187156 DEBUG oslo_concurrency.lockutils [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.762 187156 DEBUG nova.network.neutron [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Refreshing network info cache for port 541e9918-b113-477f-b173-6e0844275c91 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:40:16 np0005539504 podman[244922]: 2025-11-29 07:40:16.775439301 +0000 UTC m=+0.104537369 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller)
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.787 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.791 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.833 187156 INFO nova.virt.libvirt.driver [-] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Instance destroyed successfully.#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.835 187156 DEBUG nova.objects.instance [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lazy-loading 'resources' on Instance uuid 37b4277c-4278-4837-a7fa-e4ef827f1078 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:40:16 np0005539504 neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb[244118]: [NOTICE]   (244122) : haproxy version is 2.8.14-c23fe91
Nov 29 02:40:16 np0005539504 neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb[244118]: [NOTICE]   (244122) : path to executable is /usr/sbin/haproxy
Nov 29 02:40:16 np0005539504 neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb[244118]: [WARNING]  (244122) : Exiting Master process...
Nov 29 02:40:16 np0005539504 neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb[244118]: [WARNING]  (244122) : Exiting Master process...
Nov 29 02:40:16 np0005539504 neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb[244118]: [ALERT]    (244122) : Current worker (244124) exited with code 143 (Terminated)
Nov 29 02:40:16 np0005539504 neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb[244118]: [WARNING]  (244122) : All workers exited. Exiting... (0)
Nov 29 02:40:16 np0005539504 systemd[1]: libpod-0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57.scope: Deactivated successfully.
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.925 187156 DEBUG nova.virt.libvirt.vif [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:38:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1621430489',display_name='tempest-TestSnapshotPattern-server-1621430489',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1621430489',id=153,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGeGkco1TqFqMrIhoEE73ZhDAmJJAptCCvJ7Qwh9Z7UI8dpQ1Egx0vm1HBJKnTjax+KcM/ISl5nzEEt0JBuJsVC3CZ+KIvxrsxwf2GDJ8t8s6c8ZHa76XQJmPLVjwVs32w==',key_name='tempest-TestSnapshotPattern-1847051545',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:38:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='66de5f0713944c28a20250b7fbccc130',ramdisk_id='',reservation_id='r-q3ckjdh2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-737443016',owner_user_name='tempest-TestSnapshotPattern-737443016-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:38:54Z,user_data=None,user_id='b921ad1454834bae9b706b9fa53948b3',uuid=37b4277c-4278-4837-a7fa-e4ef827f1078,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.926 187156 DEBUG nova.network.os_vif_util [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converting VIF {"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.926 187156 DEBUG nova.network.os_vif_util [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:d8:ce,bridge_name='br-int',has_traffic_filtering=True,id=541e9918-b113-477f-b173-6e0844275c91,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541e9918-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.927 187156 DEBUG os_vif [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:d8:ce,bridge_name='br-int',has_traffic_filtering=True,id=541e9918-b113-477f-b173-6e0844275c91,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541e9918-b1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:40:16 np0005539504 podman[245005]: 2025-11-29 07:40:16.927347916 +0000 UTC m=+0.077789093 container died 0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.929 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.930 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap541e9918-b1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.931 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.935 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.941 187156 INFO os_vif [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:d8:ce,bridge_name='br-int',has_traffic_filtering=True,id=541e9918-b113-477f-b173-6e0844275c91,network=Network(19f12258-2ca1-4bcd-90a1-babd862276cb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap541e9918-b1')#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.942 187156 INFO nova.virt.libvirt.driver [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Deleting instance files /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078_del#033[00m
Nov 29 02:40:16 np0005539504 nova_compute[187152]: 2025-11-29 07:40:16.943 187156 INFO nova.virt.libvirt.driver [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Deletion of /var/lib/nova/instances/37b4277c-4278-4837-a7fa-e4ef827f1078_del complete#033[00m
Nov 29 02:40:17 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57-userdata-shm.mount: Deactivated successfully.
Nov 29 02:40:17 np0005539504 systemd[1]: var-lib-containers-storage-overlay-9d0f9edcd0e869e0776ba229916bb15c4d4a6e8dfe81afb766a9693ccf02ca88-merged.mount: Deactivated successfully.
Nov 29 02:40:17 np0005539504 nova_compute[187152]: 2025-11-29 07:40:17.142 187156 INFO nova.compute.manager [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Took 0.58 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:40:17 np0005539504 nova_compute[187152]: 2025-11-29 07:40:17.143 187156 DEBUG oslo.service.loopingcall [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:40:17 np0005539504 nova_compute[187152]: 2025-11-29 07:40:17.144 187156 DEBUG nova.compute.manager [-] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:40:17 np0005539504 nova_compute[187152]: 2025-11-29 07:40:17.144 187156 DEBUG nova.network.neutron [-] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:40:17 np0005539504 podman[245005]: 2025-11-29 07:40:17.166498781 +0000 UTC m=+0.316939968 container cleanup 0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:40:17 np0005539504 systemd[1]: libpod-conmon-0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57.scope: Deactivated successfully.
Nov 29 02:40:17 np0005539504 podman[245033]: 2025-11-29 07:40:17.381151419 +0000 UTC m=+0.185791276 container remove 0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:40:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:17.389 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c86a7f95-5496-49f3-9b26-7ec35072044e]: (4, ('Sat Nov 29 07:40:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb (0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57)\n0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57\nSat Nov 29 07:40:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb (0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57)\n0dd37ba374fcfbce8e06ecc5b9bb73246887ecec24db2879707153ef259c9d57\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:17.392 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[00195ad3-1984-4a36-b4cc-a638e3f6aa28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:17.394 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19f12258-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:40:17 np0005539504 nova_compute[187152]: 2025-11-29 07:40:17.396 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:17 np0005539504 kernel: tap19f12258-20: left promiscuous mode
Nov 29 02:40:17 np0005539504 nova_compute[187152]: 2025-11-29 07:40:17.413 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:17.422 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[de315573-e8f9-4517-a8d5-d2509bcc9478]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:17.444 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5370dd0a-d3ab-491e-8e19-c81d96d116d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:17.445 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c273f6b3-ce3e-4b1e-8020-2ccacc378d27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:17 np0005539504 nova_compute[187152]: 2025-11-29 07:40:17.456 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:17.468 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ed63cc92-842b-467f-815c-23ca560b4a98]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 735880, 'reachable_time': 26169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245049, 'error': None, 'target': 'ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:17 np0005539504 systemd[1]: run-netns-ovnmeta\x2d19f12258\x2d2ca1\x2d4bcd\x2d90a1\x2dbabd862276cb.mount: Deactivated successfully.
Nov 29 02:40:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:17.476 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19f12258-2ca1-4bcd-90a1-babd862276cb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:40:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:17.476 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[72ea37c5-8d5c-4a7e-8353-a4d30cd49eac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:18 np0005539504 nova_compute[187152]: 2025-11-29 07:40:18.697 187156 DEBUG nova.network.neutron [-] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:40:18 np0005539504 nova_compute[187152]: 2025-11-29 07:40:18.984 187156 INFO nova.compute.manager [-] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Took 1.84 seconds to deallocate network for instance.#033[00m
Nov 29 02:40:18 np0005539504 nova_compute[187152]: 2025-11-29 07:40:18.989 187156 DEBUG nova.compute.manager [req-606b270a-6d2a-46ac-ab60-d0587e815c1f req-0a344e32-c84f-4d3c-af6a-a43d80ffc205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received event network-vif-deleted-541e9918-b113-477f-b173-6e0844275c91 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:40:18 np0005539504 nova_compute[187152]: 2025-11-29 07:40:18.990 187156 INFO nova.compute.manager [req-606b270a-6d2a-46ac-ab60-d0587e815c1f req-0a344e32-c84f-4d3c-af6a-a43d80ffc205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Neutron deleted interface 541e9918-b113-477f-b173-6e0844275c91; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:40:18 np0005539504 nova_compute[187152]: 2025-11-29 07:40:18.991 187156 DEBUG nova.network.neutron [req-606b270a-6d2a-46ac-ab60-d0587e815c1f req-0a344e32-c84f-4d3c-af6a-a43d80ffc205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.090 187156 DEBUG nova.compute.manager [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received event network-vif-unplugged-541e9918-b113-477f-b173-6e0844275c91 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.091 187156 DEBUG oslo_concurrency.lockutils [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.091 187156 DEBUG oslo_concurrency.lockutils [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.091 187156 DEBUG oslo_concurrency.lockutils [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.092 187156 DEBUG nova.compute.manager [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] No waiting events found dispatching network-vif-unplugged-541e9918-b113-477f-b173-6e0844275c91 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.092 187156 DEBUG nova.compute.manager [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received event network-vif-unplugged-541e9918-b113-477f-b173-6e0844275c91 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.092 187156 DEBUG nova.compute.manager [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received event network-vif-plugged-541e9918-b113-477f-b173-6e0844275c91 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.092 187156 DEBUG oslo_concurrency.lockutils [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.093 187156 DEBUG oslo_concurrency.lockutils [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.093 187156 DEBUG oslo_concurrency.lockutils [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.093 187156 DEBUG nova.compute.manager [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] No waiting events found dispatching network-vif-plugged-541e9918-b113-477f-b173-6e0844275c91 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.093 187156 WARNING nova.compute.manager [req-8a12fbc3-3633-4490-9503-e1601b9a48ff req-ba48856e-3501-48f6-8a06-d8c47323c9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Received unexpected event network-vif-plugged-541e9918-b113-477f-b173-6e0844275c91 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.110 187156 DEBUG nova.compute.manager [req-606b270a-6d2a-46ac-ab60-d0587e815c1f req-0a344e32-c84f-4d3c-af6a-a43d80ffc205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Detach interface failed, port_id=541e9918-b113-477f-b173-6e0844275c91, reason: Instance 37b4277c-4278-4837-a7fa-e4ef827f1078 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.113 187156 DEBUG nova.network.neutron [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updated VIF entry in instance network info cache for port 541e9918-b113-477f-b173-6e0844275c91. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.114 187156 DEBUG nova.network.neutron [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Updating instance_info_cache with network_info: [{"id": "541e9918-b113-477f-b173-6e0844275c91", "address": "fa:16:3e:d4:d8:ce", "network": {"id": "19f12258-2ca1-4bcd-90a1-babd862276cb", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-875787961-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "66de5f0713944c28a20250b7fbccc130", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap541e9918-b1", "ovs_interfaceid": "541e9918-b113-477f-b173-6e0844275c91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.784 187156 DEBUG oslo_concurrency.lockutils [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-37b4277c-4278-4837-a7fa-e4ef827f1078" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.818 187156 DEBUG oslo_concurrency.lockutils [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.819 187156 DEBUG oslo_concurrency.lockutils [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.882 187156 DEBUG nova.compute.provider_tree [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.901 187156 DEBUG nova.scheduler.client.report [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.926 187156 DEBUG oslo_concurrency.lockutils [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:19 np0005539504 nova_compute[187152]: 2025-11-29 07:40:19.978 187156 INFO nova.scheduler.client.report [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Deleted allocations for instance 37b4277c-4278-4837-a7fa-e4ef827f1078#033[00m
Nov 29 02:40:20 np0005539504 nova_compute[187152]: 2025-11-29 07:40:20.085 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:20 np0005539504 nova_compute[187152]: 2025-11-29 07:40:20.118 187156 DEBUG oslo_concurrency.lockutils [None req-0fcb8fb0-fc0f-4516-b5d7-ac7e61e8d300 b921ad1454834bae9b706b9fa53948b3 66de5f0713944c28a20250b7fbccc130 - - default default] Lock "37b4277c-4278-4837-a7fa-e4ef827f1078" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:21 np0005539504 nova_compute[187152]: 2025-11-29 07:40:21.934 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:22 np0005539504 nova_compute[187152]: 2025-11-29 07:40:22.459 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:23.477 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:23.478 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:23.478 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:23.782 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2 2001:db8::f816:3eff:feac:f62b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feac:f62b/64', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2459b7bb-f6d0-4520-a009-14c9d4a2b794) old=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:23.785 104164 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2459b7bb-f6d0-4520-a009-14c9d4a2b794 in datapath 600edac6-24aa-414f-b977-07c2890470f1 updated#033[00m
Nov 29 02:40:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:23.788 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 600edac6-24aa-414f-b977-07c2890470f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:40:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:23.789 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:40:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:23.791 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7fbf42dd-f827-4143-bf64-55bac443c933]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:25 np0005539504 nova_compute[187152]: 2025-11-29 07:40:25.221 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402010.2183638, 44fe8388-2086-4a84-9dce-66f7d619dea8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:40:25 np0005539504 nova_compute[187152]: 2025-11-29 07:40:25.222 187156 INFO nova.compute.manager [-] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:40:25 np0005539504 nova_compute[187152]: 2025-11-29 07:40:25.247 187156 DEBUG nova.compute.manager [None req-5883e466-816e-457e-87e7-619ce2e3dbbf - - - - - -] [instance: 44fe8388-2086-4a84-9dce-66f7d619dea8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:40:25 np0005539504 podman[245050]: 2025-11-29 07:40:25.762301341 +0000 UTC m=+0.099417920 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:40:25 np0005539504 nova_compute[187152]: 2025-11-29 07:40:25.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:25 np0005539504 nova_compute[187152]: 2025-11-29 07:40:25.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:40:26 np0005539504 nova_compute[187152]: 2025-11-29 07:40:26.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:26 np0005539504 nova_compute[187152]: 2025-11-29 07:40:26.938 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:26 np0005539504 nova_compute[187152]: 2025-11-29 07:40:26.969 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:26 np0005539504 nova_compute[187152]: 2025-11-29 07:40:26.970 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:26 np0005539504 nova_compute[187152]: 2025-11-29 07:40:26.970 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:26 np0005539504 nova_compute[187152]: 2025-11-29 07:40:26.970 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.170 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.172 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5713MB free_disk=73.00725555419922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.172 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.173 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.269 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.270 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.299 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.313 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.340 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.341 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:27 np0005539504 nova_compute[187152]: 2025-11-29 07:40:27.462 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:28 np0005539504 nova_compute[187152]: 2025-11-29 07:40:28.341 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:28 np0005539504 nova_compute[187152]: 2025-11-29 07:40:28.342 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:40:28 np0005539504 nova_compute[187152]: 2025-11-29 07:40:28.342 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:40:28 np0005539504 nova_compute[187152]: 2025-11-29 07:40:28.359 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:40:28 np0005539504 nova_compute[187152]: 2025-11-29 07:40:28.360 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:28 np0005539504 nova_compute[187152]: 2025-11-29 07:40:28.360 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:31 np0005539504 nova_compute[187152]: 2025-11-29 07:40:31.830 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402016.829575, 37b4277c-4278-4837-a7fa-e4ef827f1078 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:40:31 np0005539504 nova_compute[187152]: 2025-11-29 07:40:31.831 187156 INFO nova.compute.manager [-] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:40:31 np0005539504 nova_compute[187152]: 2025-11-29 07:40:31.849 187156 DEBUG nova.compute.manager [None req-1f913c82-c476-4fd8-ba11-713705a03fb6 - - - - - -] [instance: 37b4277c-4278-4837-a7fa-e4ef827f1078] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:40:31 np0005539504 nova_compute[187152]: 2025-11-29 07:40:31.962 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:32 np0005539504 nova_compute[187152]: 2025-11-29 07:40:32.134 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:32 np0005539504 nova_compute[187152]: 2025-11-29 07:40:32.395 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:32 np0005539504 nova_compute[187152]: 2025-11-29 07:40:32.463 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:32 np0005539504 podman[245074]: 2025-11-29 07:40:32.743828075 +0000 UTC m=+0.071735329 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:40:36 np0005539504 nova_compute[187152]: 2025-11-29 07:40:36.966 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:37.176 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2 2001:db8:0:1:f816:3eff:feac:f62b 2001:db8::f816:3eff:feac:f62b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:feac:f62b/64 2001:db8::f816:3eff:feac:f62b/64', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2459b7bb-f6d0-4520-a009-14c9d4a2b794) old=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2 2001:db8::f816:3eff:feac:f62b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feac:f62b/64', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:37.177 104164 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2459b7bb-f6d0-4520-a009-14c9d4a2b794 in datapath 600edac6-24aa-414f-b977-07c2890470f1 updated#033[00m
Nov 29 02:40:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:37.178 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 600edac6-24aa-414f-b977-07c2890470f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:40:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:37.179 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b67fa7b3-0a55-4ddc-9b0c-2f38ee003089]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:37 np0005539504 nova_compute[187152]: 2025-11-29 07:40:37.464 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:37 np0005539504 nova_compute[187152]: 2025-11-29 07:40:37.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:41 np0005539504 nova_compute[187152]: 2025-11-29 07:40:41.971 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:42 np0005539504 nova_compute[187152]: 2025-11-29 07:40:42.468 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:43 np0005539504 podman[245096]: 2025-11-29 07:40:43.73148373 +0000 UTC m=+0.064958084 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:40:43 np0005539504 podman[245098]: 2025-11-29 07:40:43.739199317 +0000 UTC m=+0.064889252 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 29 02:40:43 np0005539504 podman[245097]: 2025-11-29 07:40:43.739799963 +0000 UTC m=+0.070039670 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git)
Nov 29 02:40:43 np0005539504 nova_compute[187152]: 2025-11-29 07:40:43.935 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:46 np0005539504 nova_compute[187152]: 2025-11-29 07:40:46.979 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:47 np0005539504 nova_compute[187152]: 2025-11-29 07:40:47.471 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:47 np0005539504 podman[245162]: 2025-11-29 07:40:47.723827095 +0000 UTC m=+0.060731510 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:40:47 np0005539504 podman[245163]: 2025-11-29 07:40:47.799500345 +0000 UTC m=+0.131639913 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:40:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539504 nova_compute[187152]: 2025-11-29 07:40:48.486 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:48 np0005539504 nova_compute[187152]: 2025-11-29 07:40:48.487 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:48 np0005539504 nova_compute[187152]: 2025-11-29 07:40:48.521 187156 DEBUG nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:40:48 np0005539504 nova_compute[187152]: 2025-11-29 07:40:48.903 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:48 np0005539504 nova_compute[187152]: 2025-11-29 07:40:48.904 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:48 np0005539504 nova_compute[187152]: 2025-11-29 07:40:48.912 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:40:48 np0005539504 nova_compute[187152]: 2025-11-29 07:40:48.913 187156 INFO nova.compute.claims [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.196 187156 DEBUG nova.scheduler.client.report [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.229 187156 DEBUG nova.scheduler.client.report [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.229 187156 DEBUG nova.compute.provider_tree [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.263 187156 DEBUG nova.scheduler.client.report [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.289 187156 DEBUG nova.scheduler.client.report [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.354 187156 DEBUG nova.compute.provider_tree [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.383 187156 DEBUG nova.scheduler.client.report [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.421 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.422 187156 DEBUG nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.550 187156 DEBUG nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.551 187156 DEBUG nova.network.neutron [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.580 187156 INFO nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.606 187156 DEBUG nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.824 187156 DEBUG nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.826 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.827 187156 INFO nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Creating image(s)#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.828 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.828 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.829 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.850 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.905 187156 DEBUG nova.policy [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.919 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.919 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.920 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.932 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.995 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:49 np0005539504 nova_compute[187152]: 2025-11-29 07:40:49.997 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.036 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.037 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.038 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.101 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.102 187156 DEBUG nova.virt.disk.api [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.103 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.165 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.167 187156 DEBUG nova.virt.disk.api [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.167 187156 DEBUG nova.objects.instance [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid eddd5f9b-2b98-4c39-9b31-21ef5bfe464a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.197 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.198 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Ensure instance console log exists: /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.198 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.198 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:50 np0005539504 nova_compute[187152]: 2025-11-29 07:40:50.199 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:51 np0005539504 nova_compute[187152]: 2025-11-29 07:40:51.986 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:52 np0005539504 nova_compute[187152]: 2025-11-29 07:40:52.473 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:52.644 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:52 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:40:52.645 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:40:52 np0005539504 nova_compute[187152]: 2025-11-29 07:40:52.645 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:53 np0005539504 nova_compute[187152]: 2025-11-29 07:40:53.518 187156 DEBUG nova.network.neutron [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Successfully created port: da9acc55-51b1-44ba-b281-8871c07a7c33 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:40:55 np0005539504 nova_compute[187152]: 2025-11-29 07:40:55.401 187156 DEBUG nova.network.neutron [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Successfully updated port: da9acc55-51b1-44ba-b281-8871c07a7c33 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:40:55 np0005539504 nova_compute[187152]: 2025-11-29 07:40:55.484 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:40:55 np0005539504 nova_compute[187152]: 2025-11-29 07:40:55.484 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:40:55 np0005539504 nova_compute[187152]: 2025-11-29 07:40:55.485 187156 DEBUG nova.network.neutron [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:40:55 np0005539504 nova_compute[187152]: 2025-11-29 07:40:55.550 187156 DEBUG nova.compute.manager [req-be4842a4-0bac-42e1-b200-dafcb4ee48fb req-f69160ae-9567-4f4f-9638-292f91a9dd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received event network-changed-da9acc55-51b1-44ba-b281-8871c07a7c33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:40:55 np0005539504 nova_compute[187152]: 2025-11-29 07:40:55.550 187156 DEBUG nova.compute.manager [req-be4842a4-0bac-42e1-b200-dafcb4ee48fb req-f69160ae-9567-4f4f-9638-292f91a9dd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Refreshing instance network info cache due to event network-changed-da9acc55-51b1-44ba-b281-8871c07a7c33. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:40:55 np0005539504 nova_compute[187152]: 2025-11-29 07:40:55.551 187156 DEBUG oslo_concurrency.lockutils [req-be4842a4-0bac-42e1-b200-dafcb4ee48fb req-f69160ae-9567-4f4f-9638-292f91a9dd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:40:55 np0005539504 nova_compute[187152]: 2025-11-29 07:40:55.739 187156 DEBUG nova.network.neutron [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:40:56 np0005539504 podman[245229]: 2025-11-29 07:40:56.719503459 +0000 UTC m=+0.061499091 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:40:56 np0005539504 nova_compute[187152]: 2025-11-29 07:40:56.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:57 np0005539504 nova_compute[187152]: 2025-11-29 07:40:57.475 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:01.649 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.704 187156 DEBUG nova.network.neutron [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updating instance_info_cache with network_info: [{"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.742 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.742 187156 DEBUG nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Instance network_info: |[{"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.743 187156 DEBUG oslo_concurrency.lockutils [req-be4842a4-0bac-42e1-b200-dafcb4ee48fb req-f69160ae-9567-4f4f-9638-292f91a9dd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.743 187156 DEBUG nova.network.neutron [req-be4842a4-0bac-42e1-b200-dafcb4ee48fb req-f69160ae-9567-4f4f-9638-292f91a9dd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Refreshing network info cache for port da9acc55-51b1-44ba-b281-8871c07a7c33 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.746 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Start _get_guest_xml network_info=[{"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.753 187156 WARNING nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.767 187156 DEBUG nova.virt.libvirt.host [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.768 187156 DEBUG nova.virt.libvirt.host [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.772 187156 DEBUG nova.virt.libvirt.host [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.773 187156 DEBUG nova.virt.libvirt.host [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.775 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.775 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.776 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.776 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.777 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.777 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.777 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.778 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.778 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.778 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.778 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.779 187156 DEBUG nova.virt.hardware [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.785 187156 DEBUG nova.virt.libvirt.vif [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:40:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1725294125',display_name='tempest-TestGettingAddress-server-1725294125',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1725294125',id=158,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD3d0ZsvCuIHNmZM7lmf14lwcU9LYA+YzS+DsUoU/RUNt3FNYs43WlwoA0reTsUUFEVQa4lagWavf3wAARjW0IrVdX6QLhMZ1dtoKB8yeTuH2S9PjazhePg7oe9bdCIqjQ==',key_name='tempest-TestGettingAddress-1639164498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-7wudwqxz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:40:49Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=eddd5f9b-2b98-4c39-9b31-21ef5bfe464a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.785 187156 DEBUG nova.network.os_vif_util [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.787 187156 DEBUG nova.network.os_vif_util [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:aa:27,bridge_name='br-int',has_traffic_filtering=True,id=da9acc55-51b1-44ba-b281-8871c07a7c33,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda9acc55-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.788 187156 DEBUG nova.objects.instance [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid eddd5f9b-2b98-4c39-9b31-21ef5bfe464a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.816 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <uuid>eddd5f9b-2b98-4c39-9b31-21ef5bfe464a</uuid>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <name>instance-0000009e</name>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestGettingAddress-server-1725294125</nova:name>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:41:01</nova:creationTime>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:        <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:        <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:        <nova:port uuid="da9acc55-51b1-44ba-b281-8871c07a7c33">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe27:aa27" ipVersion="6"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe27:aa27" ipVersion="6"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <entry name="serial">eddd5f9b-2b98-4c39-9b31-21ef5bfe464a</entry>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <entry name="uuid">eddd5f9b-2b98-4c39-9b31-21ef5bfe464a</entry>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.config"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:27:aa:27"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <target dev="tapda9acc55-51"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/console.log" append="off"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:41:01 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:41:01 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:41:01 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:41:01 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.818 187156 DEBUG nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Preparing to wait for external event network-vif-plugged-da9acc55-51b1-44ba-b281-8871c07a7c33 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.819 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.819 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.819 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.820 187156 DEBUG nova.virt.libvirt.vif [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:40:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1725294125',display_name='tempest-TestGettingAddress-server-1725294125',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1725294125',id=158,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD3d0ZsvCuIHNmZM7lmf14lwcU9LYA+YzS+DsUoU/RUNt3FNYs43WlwoA0reTsUUFEVQa4lagWavf3wAARjW0IrVdX6QLhMZ1dtoKB8yeTuH2S9PjazhePg7oe9bdCIqjQ==',key_name='tempest-TestGettingAddress-1639164498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-7wudwqxz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:40:49Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=eddd5f9b-2b98-4c39-9b31-21ef5bfe464a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.821 187156 DEBUG nova.network.os_vif_util [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.822 187156 DEBUG nova.network.os_vif_util [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:aa:27,bridge_name='br-int',has_traffic_filtering=True,id=da9acc55-51b1-44ba-b281-8871c07a7c33,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda9acc55-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.822 187156 DEBUG os_vif [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:aa:27,bridge_name='br-int',has_traffic_filtering=True,id=da9acc55-51b1-44ba-b281-8871c07a7c33,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda9acc55-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.823 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.824 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.824 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.832 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.833 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda9acc55-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.834 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapda9acc55-51, col_values=(('external_ids', {'iface-id': 'da9acc55-51b1-44ba-b281-8871c07a7c33', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:aa:27', 'vm-uuid': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.836 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:01 np0005539504 NetworkManager[55210]: <info>  [1764402061.8384] manager: (tapda9acc55-51): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.839 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.848 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.849 187156 INFO os_vif [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:aa:27,bridge_name='br-int',has_traffic_filtering=True,id=da9acc55-51b1-44ba-b281-8871c07a7c33,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda9acc55-51')#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.941 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.942 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.942 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:27:aa:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:41:01 np0005539504 nova_compute[187152]: 2025-11-29 07:41:01.943 187156 INFO nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Using config drive#033[00m
Nov 29 02:41:02 np0005539504 nova_compute[187152]: 2025-11-29 07:41:02.479 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:02 np0005539504 nova_compute[187152]: 2025-11-29 07:41:02.918 187156 INFO nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Creating config drive at /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.config#033[00m
Nov 29 02:41:02 np0005539504 nova_compute[187152]: 2025-11-29 07:41:02.924 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqwfd__s9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:03 np0005539504 nova_compute[187152]: 2025-11-29 07:41:03.054 187156 DEBUG oslo_concurrency.processutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqwfd__s9" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:03 np0005539504 kernel: tapda9acc55-51: entered promiscuous mode
Nov 29 02:41:03 np0005539504 NetworkManager[55210]: <info>  [1764402063.1661] manager: (tapda9acc55-51): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Nov 29 02:41:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:41:03Z|00642|binding|INFO|Claiming lport da9acc55-51b1-44ba-b281-8871c07a7c33 for this chassis.
Nov 29 02:41:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:41:03Z|00643|binding|INFO|da9acc55-51b1-44ba-b281-8871c07a7c33: Claiming fa:16:3e:27:aa:27 10.100.0.6 2001:db8:0:1:f816:3eff:fe27:aa27 2001:db8::f816:3eff:fe27:aa27
Nov 29 02:41:03 np0005539504 nova_compute[187152]: 2025-11-29 07:41:03.167 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:03 np0005539504 nova_compute[187152]: 2025-11-29 07:41:03.172 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:03 np0005539504 nova_compute[187152]: 2025-11-29 07:41:03.177 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:03 np0005539504 systemd-udevd[245279]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:41:03 np0005539504 systemd-machined[153423]: New machine qemu-83-instance-0000009e.
Nov 29 02:41:03 np0005539504 NetworkManager[55210]: <info>  [1764402063.2181] device (tapda9acc55-51): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:41:03 np0005539504 NetworkManager[55210]: <info>  [1764402063.2194] device (tapda9acc55-51): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:41:03 np0005539504 nova_compute[187152]: 2025-11-29 07:41:03.230 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:41:03Z|00644|binding|INFO|Setting lport da9acc55-51b1-44ba-b281-8871c07a7c33 ovn-installed in OVS
Nov 29 02:41:03 np0005539504 nova_compute[187152]: 2025-11-29 07:41:03.235 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:03 np0005539504 systemd[1]: Started Virtual Machine qemu-83-instance-0000009e.
Nov 29 02:41:03 np0005539504 podman[245262]: 2025-11-29 07:41:03.262028885 +0000 UTC m=+0.104790702 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 29 02:41:04 np0005539504 nova_compute[187152]: 2025-11-29 07:41:04.239 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402064.2382889, eddd5f9b-2b98-4c39-9b31-21ef5bfe464a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:41:04 np0005539504 nova_compute[187152]: 2025-11-29 07:41:04.240 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] VM Started (Lifecycle Event)#033[00m
Nov 29 02:41:05 np0005539504 nova_compute[187152]: 2025-11-29 07:41:05.551 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:41:05 np0005539504 nova_compute[187152]: 2025-11-29 07:41:05.557 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402064.2387867, eddd5f9b-2b98-4c39-9b31-21ef5bfe464a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:41:05 np0005539504 nova_compute[187152]: 2025-11-29 07:41:05.557 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:41:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:41:05Z|00645|binding|INFO|Setting lport da9acc55-51b1-44ba-b281-8871c07a7c33 up in Southbound
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.559 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:aa:27 10.100.0.6 2001:db8:0:1:f816:3eff:fe27:aa27 2001:db8::f816:3eff:fe27:aa27'], port_security=['fa:16:3e:27:aa:27 10.100.0.6 2001:db8:0:1:f816:3eff:fe27:aa27 2001:db8::f816:3eff:fe27:aa27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8:0:1:f816:3eff:fe27:aa27/64 2001:db8::f816:3eff:fe27:aa27/64', 'neutron:device_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e2af89ad-a80e-4dc1-aa45-ab6ce3534b4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=da9acc55-51b1-44ba-b281-8871c07a7c33) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.561 104164 INFO neutron.agent.ovn.metadata.agent [-] Port da9acc55-51b1-44ba-b281-8871c07a7c33 in datapath 600edac6-24aa-414f-b977-07c2890470f1 bound to our chassis#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.562 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 600edac6-24aa-414f-b977-07c2890470f1#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.576 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[be7d1478-8cd7-45cd-bdfd-3e9cf591991b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.577 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap600edac6-21 in ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.581 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap600edac6-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.581 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5656dc6e-b084-48b0-880f-6a6e7e14dc0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.582 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[769b4c1b-0cbf-4efe-ba34-0a3df9fefe74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.596 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[fd61316c-8570-460e-90d1-ce6f37a9404d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.627 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd944b6-9370-47c0-a26b-8a369a1a099e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.672 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ba79f711-34f1-4867-938f-02e5804e3f4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 NetworkManager[55210]: <info>  [1764402065.6800] manager: (tap600edac6-20): new Veth device (/org/freedesktop/NetworkManager/Devices/284)
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.679 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f11cfefa-4f49-4b11-9b2e-68cb6c90f74a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 systemd-udevd[245283]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:41:05 np0005539504 nova_compute[187152]: 2025-11-29 07:41:05.687 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:41:05 np0005539504 nova_compute[187152]: 2025-11-29 07:41:05.695 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.714 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[b54c2e15-05da-4dfa-badc-d1394f47f59c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.718 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2c89be1d-0911-4e1b-b027-db2b88e7dad6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 NetworkManager[55210]: <info>  [1764402065.7412] device (tap600edac6-20): carrier: link connected
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.747 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[26cdb005-e5c2-4d66-b1e2-9db6e0523960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.766 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[77967824-a9b8-4b52-8a01-b136b320db23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap600edac6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:f6:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 752862, 'reachable_time': 15025, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245329, 'error': None, 'target': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.785 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[be6b1771-b61f-4f99-bca2-09bdfb7d2b5d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feac:f62b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 752862, 'tstamp': 752862}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245330, 'error': None, 'target': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.804 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0b72744e-ab99-4903-a5c6-c2ea1a51e9ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap600edac6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:f6:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 752862, 'reachable_time': 15025, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245331, 'error': None, 'target': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.842 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f2866c56-e4f1-40db-8ba9-067543bd2d66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.906 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d3474e0e-6e5b-4e38-912e-6d6d5a7d7888]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.907 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap600edac6-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.908 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.908 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap600edac6-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:05 np0005539504 NetworkManager[55210]: <info>  [1764402065.9112] manager: (tap600edac6-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Nov 29 02:41:05 np0005539504 kernel: tap600edac6-20: entered promiscuous mode
Nov 29 02:41:05 np0005539504 nova_compute[187152]: 2025-11-29 07:41:05.910 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:05 np0005539504 nova_compute[187152]: 2025-11-29 07:41:05.913 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.914 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap600edac6-20, col_values=(('external_ids', {'iface-id': '2459b7bb-f6d0-4520-a009-14c9d4a2b794'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:41:05Z|00646|binding|INFO|Releasing lport 2459b7bb-f6d0-4520-a009-14c9d4a2b794 from this chassis (sb_readonly=0)
Nov 29 02:41:05 np0005539504 nova_compute[187152]: 2025-11-29 07:41:05.935 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:05 np0005539504 nova_compute[187152]: 2025-11-29 07:41:05.953 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.954 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/600edac6-24aa-414f-b977-07c2890470f1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/600edac6-24aa-414f-b977-07c2890470f1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.955 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e17d5ec1-2b9a-4294-8137-fc53c5073462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.957 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-600edac6-24aa-414f-b977-07c2890470f1
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/600edac6-24aa-414f-b977-07c2890470f1.pid.haproxy
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 600edac6-24aa-414f-b977-07c2890470f1
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:41:05 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:05.957 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'env', 'PROCESS_TAG=haproxy-600edac6-24aa-414f-b977-07c2890470f1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/600edac6-24aa-414f-b977-07c2890470f1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:41:06 np0005539504 nova_compute[187152]: 2025-11-29 07:41:06.069 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:41:06 np0005539504 podman[245363]: 2025-11-29 07:41:06.324011427 +0000 UTC m=+0.055299434 container create 3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:41:06 np0005539504 systemd[1]: Started libpod-conmon-3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f.scope.
Nov 29 02:41:06 np0005539504 podman[245363]: 2025-11-29 07:41:06.291772462 +0000 UTC m=+0.023060489 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:41:06 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:41:06 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2cfba1bf143536a5018c5f4c5e2519be9e0036642477f233cae602f240298f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:41:06 np0005539504 podman[245363]: 2025-11-29 07:41:06.416783757 +0000 UTC m=+0.148071764 container init 3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 02:41:06 np0005539504 podman[245363]: 2025-11-29 07:41:06.422975423 +0000 UTC m=+0.154263430 container start 3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:41:06 np0005539504 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[245379]: [NOTICE]   (245383) : New worker (245385) forked
Nov 29 02:41:06 np0005539504 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[245379]: [NOTICE]   (245383) : Loading success.
Nov 29 02:41:06 np0005539504 nova_compute[187152]: 2025-11-29 07:41:06.837 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:07 np0005539504 nova_compute[187152]: 2025-11-29 07:41:07.481 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:09 np0005539504 nova_compute[187152]: 2025-11-29 07:41:09.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.488 187156 DEBUG nova.compute.manager [req-e8f24b60-e3f6-4a6a-91ae-95e94e6de486 req-8a801861-1ef6-4bb3-9d62-f830e5a536b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received event network-vif-plugged-da9acc55-51b1-44ba-b281-8871c07a7c33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.488 187156 DEBUG oslo_concurrency.lockutils [req-e8f24b60-e3f6-4a6a-91ae-95e94e6de486 req-8a801861-1ef6-4bb3-9d62-f830e5a536b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.489 187156 DEBUG oslo_concurrency.lockutils [req-e8f24b60-e3f6-4a6a-91ae-95e94e6de486 req-8a801861-1ef6-4bb3-9d62-f830e5a536b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.489 187156 DEBUG oslo_concurrency.lockutils [req-e8f24b60-e3f6-4a6a-91ae-95e94e6de486 req-8a801861-1ef6-4bb3-9d62-f830e5a536b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.489 187156 DEBUG nova.compute.manager [req-e8f24b60-e3f6-4a6a-91ae-95e94e6de486 req-8a801861-1ef6-4bb3-9d62-f830e5a536b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Processing event network-vif-plugged-da9acc55-51b1-44ba-b281-8871c07a7c33 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.490 187156 DEBUG nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.495 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402070.4950483, eddd5f9b-2b98-4c39-9b31-21ef5bfe464a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.495 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.497 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.500 187156 INFO nova.virt.libvirt.driver [-] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Instance spawned successfully.#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.500 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.548 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.550 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.551 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.552 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.552 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.552 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.553 187156 DEBUG nova.virt.libvirt.driver [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.559 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.621 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.729 187156 INFO nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Took 20.90 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:41:10 np0005539504 nova_compute[187152]: 2025-11-29 07:41:10.729 187156 DEBUG nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:41:11 np0005539504 nova_compute[187152]: 2025-11-29 07:41:11.280 187156 INFO nova.compute.manager [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Took 22.45 seconds to build instance.#033[00m
Nov 29 02:41:11 np0005539504 nova_compute[187152]: 2025-11-29 07:41:11.323 187156 DEBUG oslo_concurrency.lockutils [None req-2cce9fd1-e5c8-4b26-9aa4-712f44bce003 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:11 np0005539504 nova_compute[187152]: 2025-11-29 07:41:11.841 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:12 np0005539504 nova_compute[187152]: 2025-11-29 07:41:12.482 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:12 np0005539504 nova_compute[187152]: 2025-11-29 07:41:12.762 187156 DEBUG nova.compute.manager [req-a54ce6ff-d87d-419b-abc3-f571d50840d9 req-7eabe41e-1480-47ce-bf59-0afbe445d478 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received event network-vif-plugged-da9acc55-51b1-44ba-b281-8871c07a7c33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:41:12 np0005539504 nova_compute[187152]: 2025-11-29 07:41:12.763 187156 DEBUG oslo_concurrency.lockutils [req-a54ce6ff-d87d-419b-abc3-f571d50840d9 req-7eabe41e-1480-47ce-bf59-0afbe445d478 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:12 np0005539504 nova_compute[187152]: 2025-11-29 07:41:12.764 187156 DEBUG oslo_concurrency.lockutils [req-a54ce6ff-d87d-419b-abc3-f571d50840d9 req-7eabe41e-1480-47ce-bf59-0afbe445d478 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:12 np0005539504 nova_compute[187152]: 2025-11-29 07:41:12.764 187156 DEBUG oslo_concurrency.lockutils [req-a54ce6ff-d87d-419b-abc3-f571d50840d9 req-7eabe41e-1480-47ce-bf59-0afbe445d478 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:12 np0005539504 nova_compute[187152]: 2025-11-29 07:41:12.765 187156 DEBUG nova.compute.manager [req-a54ce6ff-d87d-419b-abc3-f571d50840d9 req-7eabe41e-1480-47ce-bf59-0afbe445d478 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] No waiting events found dispatching network-vif-plugged-da9acc55-51b1-44ba-b281-8871c07a7c33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:41:12 np0005539504 nova_compute[187152]: 2025-11-29 07:41:12.765 187156 WARNING nova.compute.manager [req-a54ce6ff-d87d-419b-abc3-f571d50840d9 req-7eabe41e-1480-47ce-bf59-0afbe445d478 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received unexpected event network-vif-plugged-da9acc55-51b1-44ba-b281-8871c07a7c33 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:41:13 np0005539504 nova_compute[187152]: 2025-11-29 07:41:13.513 187156 DEBUG nova.network.neutron [req-be4842a4-0bac-42e1-b200-dafcb4ee48fb req-f69160ae-9567-4f4f-9638-292f91a9dd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updated VIF entry in instance network info cache for port da9acc55-51b1-44ba-b281-8871c07a7c33. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:41:13 np0005539504 nova_compute[187152]: 2025-11-29 07:41:13.514 187156 DEBUG nova.network.neutron [req-be4842a4-0bac-42e1-b200-dafcb4ee48fb req-f69160ae-9567-4f4f-9638-292f91a9dd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updating instance_info_cache with network_info: [{"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:41:13 np0005539504 nova_compute[187152]: 2025-11-29 07:41:13.776 187156 DEBUG oslo_concurrency.lockutils [req-be4842a4-0bac-42e1-b200-dafcb4ee48fb req-f69160ae-9567-4f4f-9638-292f91a9dd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:41:13 np0005539504 nova_compute[187152]: 2025-11-29 07:41:13.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:14 np0005539504 podman[245396]: 2025-11-29 07:41:14.742832684 +0000 UTC m=+0.065727245 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:41:14 np0005539504 podman[245394]: 2025-11-29 07:41:14.743275606 +0000 UTC m=+0.066183117 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:41:14 np0005539504 podman[245395]: 2025-11-29 07:41:14.751100796 +0000 UTC m=+0.073991077 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, version=9.6, release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public)
Nov 29 02:41:16 np0005539504 nova_compute[187152]: 2025-11-29 07:41:16.844 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:17 np0005539504 nova_compute[187152]: 2025-11-29 07:41:17.484 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:17 np0005539504 nova_compute[187152]: 2025-11-29 07:41:17.676 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:17 np0005539504 NetworkManager[55210]: <info>  [1764402077.6777] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Nov 29 02:41:17 np0005539504 NetworkManager[55210]: <info>  [1764402077.6792] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Nov 29 02:41:17 np0005539504 nova_compute[187152]: 2025-11-29 07:41:17.820 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:41:17Z|00647|binding|INFO|Releasing lport 2459b7bb-f6d0-4520-a009-14c9d4a2b794 from this chassis (sb_readonly=0)
Nov 29 02:41:17 np0005539504 nova_compute[187152]: 2025-11-29 07:41:17.840 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:18 np0005539504 podman[245455]: 2025-11-29 07:41:18.730950786 +0000 UTC m=+0.056783495 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:41:18 np0005539504 podman[245456]: 2025-11-29 07:41:18.794285545 +0000 UTC m=+0.115416038 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:41:19 np0005539504 nova_compute[187152]: 2025-11-29 07:41:19.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:20 np0005539504 nova_compute[187152]: 2025-11-29 07:41:20.710 187156 DEBUG nova.compute.manager [req-2ff3bc54-eb61-4df8-a1aa-b8e48a64b83d req-2a28bf4b-419a-4fc7-a581-5f96931d39a9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received event network-changed-da9acc55-51b1-44ba-b281-8871c07a7c33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:41:20 np0005539504 nova_compute[187152]: 2025-11-29 07:41:20.710 187156 DEBUG nova.compute.manager [req-2ff3bc54-eb61-4df8-a1aa-b8e48a64b83d req-2a28bf4b-419a-4fc7-a581-5f96931d39a9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Refreshing instance network info cache due to event network-changed-da9acc55-51b1-44ba-b281-8871c07a7c33. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:41:20 np0005539504 nova_compute[187152]: 2025-11-29 07:41:20.711 187156 DEBUG oslo_concurrency.lockutils [req-2ff3bc54-eb61-4df8-a1aa-b8e48a64b83d req-2a28bf4b-419a-4fc7-a581-5f96931d39a9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:41:20 np0005539504 nova_compute[187152]: 2025-11-29 07:41:20.711 187156 DEBUG oslo_concurrency.lockutils [req-2ff3bc54-eb61-4df8-a1aa-b8e48a64b83d req-2a28bf4b-419a-4fc7-a581-5f96931d39a9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:41:20 np0005539504 nova_compute[187152]: 2025-11-29 07:41:20.711 187156 DEBUG nova.network.neutron [req-2ff3bc54-eb61-4df8-a1aa-b8e48a64b83d req-2a28bf4b-419a-4fc7-a581-5f96931d39a9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Refreshing network info cache for port da9acc55-51b1-44ba-b281-8871c07a7c33 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:41:21 np0005539504 nova_compute[187152]: 2025-11-29 07:41:21.847 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:22 np0005539504 nova_compute[187152]: 2025-11-29 07:41:22.487 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:23.478 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:23.480 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:23.481 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:41:25Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:27:aa:27 10.100.0.6
Nov 29 02:41:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:41:25Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:aa:27 10.100.0.6
Nov 29 02:41:25 np0005539504 nova_compute[187152]: 2025-11-29 07:41:25.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:25 np0005539504 nova_compute[187152]: 2025-11-29 07:41:25.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:41:26 np0005539504 nova_compute[187152]: 2025-11-29 07:41:26.778 187156 DEBUG nova.network.neutron [req-2ff3bc54-eb61-4df8-a1aa-b8e48a64b83d req-2a28bf4b-419a-4fc7-a581-5f96931d39a9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updated VIF entry in instance network info cache for port da9acc55-51b1-44ba-b281-8871c07a7c33. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:41:26 np0005539504 nova_compute[187152]: 2025-11-29 07:41:26.780 187156 DEBUG nova.network.neutron [req-2ff3bc54-eb61-4df8-a1aa-b8e48a64b83d req-2a28bf4b-419a-4fc7-a581-5f96931d39a9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updating instance_info_cache with network_info: [{"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:41:26 np0005539504 nova_compute[187152]: 2025-11-29 07:41:26.850 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:26 np0005539504 nova_compute[187152]: 2025-11-29 07:41:26.874 187156 DEBUG oslo_concurrency.lockutils [req-2ff3bc54-eb61-4df8-a1aa-b8e48a64b83d req-2a28bf4b-419a-4fc7-a581-5f96931d39a9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:41:27 np0005539504 nova_compute[187152]: 2025-11-29 07:41:27.490 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:27 np0005539504 podman[245520]: 2025-11-29 07:41:27.738428278 +0000 UTC m=+0.078549239 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:41:27 np0005539504 nova_compute[187152]: 2025-11-29 07:41:27.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:28 np0005539504 nova_compute[187152]: 2025-11-29 07:41:28.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:28 np0005539504 nova_compute[187152]: 2025-11-29 07:41:28.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:41:28 np0005539504 nova_compute[187152]: 2025-11-29 07:41:28.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:41:30 np0005539504 nova_compute[187152]: 2025-11-29 07:41:30.466 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:41:30 np0005539504 nova_compute[187152]: 2025-11-29 07:41:30.467 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:41:30 np0005539504 nova_compute[187152]: 2025-11-29 07:41:30.467 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:41:30 np0005539504 nova_compute[187152]: 2025-11-29 07:41:30.467 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid eddd5f9b-2b98-4c39-9b31-21ef5bfe464a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:41:31 np0005539504 nova_compute[187152]: 2025-11-29 07:41:31.853 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:32 np0005539504 nova_compute[187152]: 2025-11-29 07:41:32.492 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:33 np0005539504 podman[245540]: 2025-11-29 07:41:33.726035014 +0000 UTC m=+0.067614904 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:41:36 np0005539504 nova_compute[187152]: 2025-11-29 07:41:36.856 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:37 np0005539504 nova_compute[187152]: 2025-11-29 07:41:37.495 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.597 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updating instance_info_cache with network_info: [{"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.645 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.645 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.646 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.646 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.646 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.679 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.680 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.681 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.681 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.763 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.835 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.836 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:38 np0005539504 nova_compute[187152]: 2025-11-29 07:41:38.897 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.049 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.051 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5522MB free_disk=72.97785949707031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.051 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.051 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.206 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance eddd5f9b-2b98-4c39-9b31-21ef5bfe464a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.206 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.207 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.346 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.363 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.470 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:41:39 np0005539504 nova_compute[187152]: 2025-11-29 07:41:39.471 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:40 np0005539504 nova_compute[187152]: 2025-11-29 07:41:40.762 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:41 np0005539504 nova_compute[187152]: 2025-11-29 07:41:41.859 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:42 np0005539504 nova_compute[187152]: 2025-11-29 07:41:42.496 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:45 np0005539504 podman[245569]: 2025-11-29 07:41:45.710604372 +0000 UTC m=+0.045016739 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:41:45 np0005539504 podman[245568]: 2025-11-29 07:41:45.724206737 +0000 UTC m=+0.061454670 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Nov 29 02:41:45 np0005539504 podman[245567]: 2025-11-29 07:41:45.740150404 +0000 UTC m=+0.080434549 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:41:46 np0005539504 nova_compute[187152]: 2025-11-29 07:41:46.862 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:47 np0005539504 nova_compute[187152]: 2025-11-29 07:41:47.498 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:49 np0005539504 podman[245626]: 2025-11-29 07:41:49.727931277 +0000 UTC m=+0.061056080 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:41:49 np0005539504 podman[245627]: 2025-11-29 07:41:49.78844422 +0000 UTC m=+0.111125432 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:41:51 np0005539504 nova_compute[187152]: 2025-11-29 07:41:51.864 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:52 np0005539504 nova_compute[187152]: 2025-11-29 07:41:52.500 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:56.350 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:41:56 np0005539504 nova_compute[187152]: 2025-11-29 07:41:56.351 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:41:56.352 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:41:56 np0005539504 nova_compute[187152]: 2025-11-29 07:41:56.896 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:57 np0005539504 nova_compute[187152]: 2025-11-29 07:41:57.503 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:58 np0005539504 podman[245679]: 2025-11-29 07:41:58.723636451 +0000 UTC m=+0.067056620 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 02:42:00 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:00.355 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:01 np0005539504 nova_compute[187152]: 2025-11-29 07:42:01.899 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:02 np0005539504 nova_compute[187152]: 2025-11-29 07:42:02.505 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:04 np0005539504 podman[245701]: 2025-11-29 07:42:04.711313451 +0000 UTC m=+0.056814625 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 02:42:06 np0005539504 nova_compute[187152]: 2025-11-29 07:42:06.901 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:07 np0005539504 nova_compute[187152]: 2025-11-29 07:42:07.557 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:10 np0005539504 nova_compute[187152]: 2025-11-29 07:42:10.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:11 np0005539504 nova_compute[187152]: 2025-11-29 07:42:11.904 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:12 np0005539504 nova_compute[187152]: 2025-11-29 07:42:12.561 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:15 np0005539504 nova_compute[187152]: 2025-11-29 07:42:15.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:16 np0005539504 podman[245723]: 2025-11-29 07:42:16.71006822 +0000 UTC m=+0.052883630 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:42:16 np0005539504 podman[245724]: 2025-11-29 07:42:16.726353537 +0000 UTC m=+0.063706031 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter)
Nov 29 02:42:16 np0005539504 podman[245725]: 2025-11-29 07:42:16.742874979 +0000 UTC m=+0.076274396 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:42:16 np0005539504 nova_compute[187152]: 2025-11-29 07:42:16.906 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:17 np0005539504 nova_compute[187152]: 2025-11-29 07:42:17.564 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:19 np0005539504 nova_compute[187152]: 2025-11-29 07:42:19.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:20 np0005539504 podman[245786]: 2025-11-29 07:42:20.714620813 +0000 UTC m=+0.054102133 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:42:20 np0005539504 podman[245787]: 2025-11-29 07:42:20.759401464 +0000 UTC m=+0.091929508 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller)
Nov 29 02:42:21 np0005539504 nova_compute[187152]: 2025-11-29 07:42:21.909 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:22 np0005539504 nova_compute[187152]: 2025-11-29 07:42:22.567 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:23.480 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:23.480 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:23.481 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:25 np0005539504 nova_compute[187152]: 2025-11-29 07:42:25.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:25 np0005539504 nova_compute[187152]: 2025-11-29 07:42:25.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:42:26 np0005539504 nova_compute[187152]: 2025-11-29 07:42:26.912 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:27 np0005539504 nova_compute[187152]: 2025-11-29 07:42:27.570 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:27 np0005539504 nova_compute[187152]: 2025-11-29 07:42:27.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:29 np0005539504 podman[245851]: 2025-11-29 07:42:29.733803988 +0000 UTC m=+0.075414624 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.license=GPLv2)
Nov 29 02:42:29 np0005539504 nova_compute[187152]: 2025-11-29 07:42:29.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:29 np0005539504 nova_compute[187152]: 2025-11-29 07:42:29.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:42:29 np0005539504 nova_compute[187152]: 2025-11-29 07:42:29.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:42:30 np0005539504 nova_compute[187152]: 2025-11-29 07:42:30.528 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:42:30 np0005539504 nova_compute[187152]: 2025-11-29 07:42:30.529 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:42:30 np0005539504 nova_compute[187152]: 2025-11-29 07:42:30.529 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:42:30 np0005539504 nova_compute[187152]: 2025-11-29 07:42:30.529 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid eddd5f9b-2b98-4c39-9b31-21ef5bfe464a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:42:31 np0005539504 nova_compute[187152]: 2025-11-29 07:42:31.915 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:32 np0005539504 nova_compute[187152]: 2025-11-29 07:42:32.573 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.572 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updating instance_info_cache with network_info: [{"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.595 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.595 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.596 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.596 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.621 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.622 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.622 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.622 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.687 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.779 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.781 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.844 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.994 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.995 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5552MB free_disk=72.97831726074219GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.995 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:33 np0005539504 nova_compute[187152]: 2025-11-29 07:42:33.996 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.093 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance eddd5f9b-2b98-4c39-9b31-21ef5bfe464a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.094 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.094 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.142 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.159 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.161 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.161 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.857 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.858 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.875 187156 DEBUG nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.969 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.970 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.987 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:42:34 np0005539504 nova_compute[187152]: 2025-11-29 07:42:34.988 187156 INFO nova.compute.claims [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.138 187156 DEBUG nova.compute.provider_tree [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.154 187156 DEBUG nova.scheduler.client.report [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.179 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.180 187156 DEBUG nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.246 187156 DEBUG nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.247 187156 DEBUG nova.network.neutron [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.264 187156 INFO nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.281 187156 DEBUG nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.394 187156 DEBUG nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.396 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.396 187156 INFO nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Creating image(s)#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.397 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.398 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.398 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.412 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.474 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.476 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.477 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.492 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.553 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.555 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.630 187156 DEBUG nova.policy [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.634 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk 1073741824" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.635 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.636 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.699 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.701 187156 DEBUG nova.virt.disk.api [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.701 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:35 np0005539504 podman[245886]: 2025-11-29 07:42:35.730194652 +0000 UTC m=+0.062550008 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.776 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.778 187156 DEBUG nova.virt.disk.api [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.778 187156 DEBUG nova.objects.instance [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.796 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.797 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Ensure instance console log exists: /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.797 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.798 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:35 np0005539504 nova_compute[187152]: 2025-11-29 07:42:35.798 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:36 np0005539504 nova_compute[187152]: 2025-11-29 07:42:36.918 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:37 np0005539504 nova_compute[187152]: 2025-11-29 07:42:37.575 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:38 np0005539504 nova_compute[187152]: 2025-11-29 07:42:38.348 187156 DEBUG nova.network.neutron [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Successfully created port: c7326683-ae94-4050-a985-768563f895f1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:42:38 np0005539504 nova_compute[187152]: 2025-11-29 07:42:38.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:39 np0005539504 nova_compute[187152]: 2025-11-29 07:42:39.376 187156 DEBUG nova.network.neutron [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Successfully updated port: c7326683-ae94-4050-a985-768563f895f1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:42:39 np0005539504 nova_compute[187152]: 2025-11-29 07:42:39.403 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:42:39 np0005539504 nova_compute[187152]: 2025-11-29 07:42:39.405 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:42:39 np0005539504 nova_compute[187152]: 2025-11-29 07:42:39.405 187156 DEBUG nova.network.neutron [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:42:39 np0005539504 nova_compute[187152]: 2025-11-29 07:42:39.506 187156 DEBUG nova.compute.manager [req-7f55527f-cf43-4c12-91f2-668014d874bc req-58375152-22be-4ea5-ba3c-822be50e9582 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Received event network-changed-c7326683-ae94-4050-a985-768563f895f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:39 np0005539504 nova_compute[187152]: 2025-11-29 07:42:39.507 187156 DEBUG nova.compute.manager [req-7f55527f-cf43-4c12-91f2-668014d874bc req-58375152-22be-4ea5-ba3c-822be50e9582 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Refreshing instance network info cache due to event network-changed-c7326683-ae94-4050-a985-768563f895f1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:42:39 np0005539504 nova_compute[187152]: 2025-11-29 07:42:39.507 187156 DEBUG oslo_concurrency.lockutils [req-7f55527f-cf43-4c12-91f2-668014d874bc req-58375152-22be-4ea5-ba3c-822be50e9582 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:42:40 np0005539504 nova_compute[187152]: 2025-11-29 07:42:40.570 187156 DEBUG nova.network.neutron [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:42:40 np0005539504 nova_compute[187152]: 2025-11-29 07:42:40.955 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:41 np0005539504 nova_compute[187152]: 2025-11-29 07:42:41.921 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:42 np0005539504 nova_compute[187152]: 2025-11-29 07:42:42.636 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.559 187156 DEBUG nova.network.neutron [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Updating instance_info_cache with network_info: [{"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.884 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.884 187156 DEBUG nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Instance network_info: |[{"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.885 187156 DEBUG oslo_concurrency.lockutils [req-7f55527f-cf43-4c12-91f2-668014d874bc req-58375152-22be-4ea5-ba3c-822be50e9582 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.885 187156 DEBUG nova.network.neutron [req-7f55527f-cf43-4c12-91f2-668014d874bc req-58375152-22be-4ea5-ba3c-822be50e9582 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Refreshing network info cache for port c7326683-ae94-4050-a985-768563f895f1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.888 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Start _get_guest_xml network_info=[{"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.895 187156 WARNING nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.900 187156 DEBUG nova.virt.libvirt.host [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.900 187156 DEBUG nova.virt.libvirt.host [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.903 187156 DEBUG nova.virt.libvirt.host [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.904 187156 DEBUG nova.virt.libvirt.host [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.905 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.906 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.906 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.906 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.906 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.907 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.907 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.907 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.907 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.908 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.908 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.908 187156 DEBUG nova.virt.hardware [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.913 187156 DEBUG nova.virt.libvirt.vif [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:42:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1763279634',display_name='tempest-TestNetworkBasicOps-server-1763279634',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1763279634',id=161,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEkkbeQt70U16h7WfJ18TFBeq23WmjPxHMSDFLMv8iAiFoBGfuH5LpSW4Nyg4wJhsjkwbMVp/u5en890AVwFbKQPpx/7ju0KeJTr8VaGLG+ZBnyWrLll3sXzbcJEr2bkfQ==',key_name='tempest-TestNetworkBasicOps-2141533484',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-qni2oxwv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:42:35Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=5b1eba03-8be0-4f33-a4d6-8c0751ddd10b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.914 187156 DEBUG nova.network.os_vif_util [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.915 187156 DEBUG nova.network.os_vif_util [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c8:e4,bridge_name='br-int',has_traffic_filtering=True,id=c7326683-ae94-4050-a985-768563f895f1,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7326683-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.916 187156 DEBUG nova.objects.instance [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:42:44 np0005539504 nova_compute[187152]: 2025-11-29 07:42:44.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.004 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <uuid>5b1eba03-8be0-4f33-a4d6-8c0751ddd10b</uuid>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <name>instance-000000a1</name>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkBasicOps-server-1763279634</nova:name>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:42:44</nova:creationTime>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:        <nova:port uuid="c7326683-ae94-4050-a985-768563f895f1">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.39" ipVersion="4"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <entry name="serial">5b1eba03-8be0-4f33-a4d6-8c0751ddd10b</entry>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <entry name="uuid">5b1eba03-8be0-4f33-a4d6-8c0751ddd10b</entry>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.config"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:b2:c8:e4"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <target dev="tapc7326683-ae"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/console.log" append="off"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:42:45 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:42:45 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:42:45 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:42:45 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.005 187156 DEBUG nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Preparing to wait for external event network-vif-plugged-c7326683-ae94-4050-a985-768563f895f1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.006 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.006 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.007 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.007 187156 DEBUG nova.virt.libvirt.vif [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:42:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1763279634',display_name='tempest-TestNetworkBasicOps-server-1763279634',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1763279634',id=161,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEkkbeQt70U16h7WfJ18TFBeq23WmjPxHMSDFLMv8iAiFoBGfuH5LpSW4Nyg4wJhsjkwbMVp/u5en890AVwFbKQPpx/7ju0KeJTr8VaGLG+ZBnyWrLll3sXzbcJEr2bkfQ==',key_name='tempest-TestNetworkBasicOps-2141533484',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-qni2oxwv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:42:35Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=5b1eba03-8be0-4f33-a4d6-8c0751ddd10b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.008 187156 DEBUG nova.network.os_vif_util [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.008 187156 DEBUG nova.network.os_vif_util [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c8:e4,bridge_name='br-int',has_traffic_filtering=True,id=c7326683-ae94-4050-a985-768563f895f1,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7326683-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.009 187156 DEBUG os_vif [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c8:e4,bridge_name='br-int',has_traffic_filtering=True,id=c7326683-ae94-4050-a985-768563f895f1,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7326683-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.010 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.010 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.010 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.015 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.016 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc7326683-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.016 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc7326683-ae, col_values=(('external_ids', {'iface-id': 'c7326683-ae94-4050-a985-768563f895f1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:c8:e4', 'vm-uuid': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.018 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:45 np0005539504 NetworkManager[55210]: <info>  [1764402165.0209] manager: (tapc7326683-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.021 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.030 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.032 187156 INFO os_vif [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c8:e4,bridge_name='br-int',has_traffic_filtering=True,id=c7326683-ae94-4050-a985-768563f895f1,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7326683-ae')#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.128 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.129 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.129 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:b2:c8:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.130 187156 INFO nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Using config drive#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.710 187156 INFO nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Creating config drive at /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.config#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.716 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjk2qnwe3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.846 187156 DEBUG oslo_concurrency.processutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjk2qnwe3" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:45 np0005539504 kernel: tapc7326683-ae: entered promiscuous mode
Nov 29 02:42:45 np0005539504 NetworkManager[55210]: <info>  [1764402165.9325] manager: (tapc7326683-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Nov 29 02:42:45 np0005539504 ovn_controller[95182]: 2025-11-29T07:42:45Z|00648|binding|INFO|Claiming lport c7326683-ae94-4050-a985-768563f895f1 for this chassis.
Nov 29 02:42:45 np0005539504 ovn_controller[95182]: 2025-11-29T07:42:45Z|00649|binding|INFO|c7326683-ae94-4050-a985-768563f895f1: Claiming fa:16:3e:b2:c8:e4 10.100.0.39
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.936 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:45 np0005539504 systemd-udevd[245936]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.973 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:45 np0005539504 ovn_controller[95182]: 2025-11-29T07:42:45Z|00650|binding|INFO|Setting lport c7326683-ae94-4050-a985-768563f895f1 ovn-installed in OVS
Nov 29 02:42:45 np0005539504 nova_compute[187152]: 2025-11-29 07:42:45.979 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:45 np0005539504 systemd-machined[153423]: New machine qemu-84-instance-000000a1.
Nov 29 02:42:45 np0005539504 NetworkManager[55210]: <info>  [1764402165.9907] device (tapc7326683-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:42:45 np0005539504 NetworkManager[55210]: <info>  [1764402165.9916] device (tapc7326683-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:42:46 np0005539504 systemd[1]: Started Virtual Machine qemu-84-instance-000000a1.
Nov 29 02:42:46 np0005539504 ovn_controller[95182]: 2025-11-29T07:42:46Z|00651|binding|INFO|Setting lport c7326683-ae94-4050-a985-768563f895f1 up in Southbound
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.118 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:c8:e4 10.100.0.39'], port_security=['fa:16:3e:b2:c8:e4 10.100.0.39'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.39/28', 'neutron:device_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a4871f03-7c6b-4c30-9411-b8f496fb8659', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79da6174-0485-4e06-8898-c13055f8ac79, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=c7326683-ae94-4050-a985-768563f895f1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.120 104164 INFO neutron.agent.ovn.metadata.agent [-] Port c7326683-ae94-4050-a985-768563f895f1 in datapath 6f1be974-bcaa-4b93-ab01-8adab0060f10 bound to our chassis#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.122 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6f1be974-bcaa-4b93-ab01-8adab0060f10#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.136 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[08ab1169-de3f-4912-9201-3f8b60d62064]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.138 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6f1be974-b1 in ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.141 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6f1be974-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.141 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[27b972b7-e108-49cb-b918-dc004dd270e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.144 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[99f50994-8b09-4435-8458-c18be021838f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.158 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[456986bb-78e8-4546-90a7-2d3f8aa3ca2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.176 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dd6f3e5f-9184-41e5-945f-80df747b295a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.218 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[01d37bab-55ae-4ea8-b143-2d4d8542d8c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 NetworkManager[55210]: <info>  [1764402166.2295] manager: (tap6f1be974-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.228 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[96eabf90-6110-4473-a4b0-c86f766d5163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.269 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[43e60ab0-b16c-484f-839a-9a4d0322c022]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.276 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[eb3dc717-5aa1-4c2d-ba96-ffcc38204a21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 NetworkManager[55210]: <info>  [1764402166.3052] device (tap6f1be974-b0): carrier: link connected
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.311 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[30f061ea-ecda-4fef-a032-5e28154233ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 nova_compute[187152]: 2025-11-29 07:42:46.317 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402166.3162177, 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:42:46 np0005539504 nova_compute[187152]: 2025-11-29 07:42:46.318 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] VM Started (Lifecycle Event)#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.328 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e119444e-c8dd-48f6-9c37-ab71d16c4dbe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6f1be974-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:49:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 762918, 'reachable_time': 31206, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245980, 'error': None, 'target': 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.347 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2a74de-2244-4e17-9b19-ea57ad8ef160]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:49ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 762918, 'tstamp': 762918}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245981, 'error': None, 'target': 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.367 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[eb3c6dd4-9742-4c1a-a771-fa8105270b37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6f1be974-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:49:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 762918, 'reachable_time': 31206, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245982, 'error': None, 'target': 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.393 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[215bc53c-1252-48d1-ad16-9769c1108848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.446 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a8e1f2-31fa-4c7b-bda0-4d8491a42f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.447 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f1be974-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.448 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.448 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6f1be974-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:46 np0005539504 kernel: tap6f1be974-b0: entered promiscuous mode
Nov 29 02:42:46 np0005539504 NetworkManager[55210]: <info>  [1764402166.4513] manager: (tap6f1be974-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Nov 29 02:42:46 np0005539504 nova_compute[187152]: 2025-11-29 07:42:46.451 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.463 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6f1be974-b0, col_values=(('external_ids', {'iface-id': '5727c765-4dbb-4890-b58e-a90c8d5f55f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:46 np0005539504 ovn_controller[95182]: 2025-11-29T07:42:46Z|00652|binding|INFO|Releasing lport 5727c765-4dbb-4890-b58e-a90c8d5f55f2 from this chassis (sb_readonly=1)
Nov 29 02:42:46 np0005539504 nova_compute[187152]: 2025-11-29 07:42:46.465 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:46 np0005539504 nova_compute[187152]: 2025-11-29 07:42:46.467 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.468 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6f1be974-bcaa-4b93-ab01-8adab0060f10.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6f1be974-bcaa-4b93-ab01-8adab0060f10.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.469 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc891d9-c9e7-4842-b983-c44957b5e667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.470 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-6f1be974-bcaa-4b93-ab01-8adab0060f10
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/6f1be974-bcaa-4b93-ab01-8adab0060f10.pid.haproxy
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 6f1be974-bcaa-4b93-ab01-8adab0060f10
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:42:46 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:46.471 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'env', 'PROCESS_TAG=haproxy-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6f1be974-bcaa-4b93-ab01-8adab0060f10.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:42:46 np0005539504 nova_compute[187152]: 2025-11-29 07:42:46.478 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:46 np0005539504 podman[246015]: 2025-11-29 07:42:46.830943287 +0000 UTC m=+0.027261622 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:42:46 np0005539504 nova_compute[187152]: 2025-11-29 07:42:46.969 187156 DEBUG nova.network.neutron [req-7f55527f-cf43-4c12-91f2-668014d874bc req-58375152-22be-4ea5-ba3c-822be50e9582 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Updated VIF entry in instance network info cache for port c7326683-ae94-4050-a985-768563f895f1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:42:46 np0005539504 nova_compute[187152]: 2025-11-29 07:42:46.969 187156 DEBUG nova.network.neutron [req-7f55527f-cf43-4c12-91f2-668014d874bc req-58375152-22be-4ea5-ba3c-822be50e9582 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Updating instance_info_cache with network_info: [{"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:42:47 np0005539504 nova_compute[187152]: 2025-11-29 07:42:47.230 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:42:47 np0005539504 nova_compute[187152]: 2025-11-29 07:42:47.236 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402166.3171968, 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:42:47 np0005539504 nova_compute[187152]: 2025-11-29 07:42:47.236 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:42:47 np0005539504 nova_compute[187152]: 2025-11-29 07:42:47.638 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:47 np0005539504 nova_compute[187152]: 2025-11-29 07:42:47.662 187156 DEBUG oslo_concurrency.lockutils [req-7f55527f-cf43-4c12-91f2-668014d874bc req-58375152-22be-4ea5-ba3c-822be50e9582 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:42:47 np0005539504 podman[246015]: 2025-11-29 07:42:47.887912026 +0000 UTC m=+1.084230341 container create d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 02:42:47 np0005539504 nova_compute[187152]: 2025-11-29 07:42:47.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:47 np0005539504 nova_compute[187152]: 2025-11-29 07:42:47.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:47.990 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'name': 'tempest-TestNetworkBasicOps-server-1763279634', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a1', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': 'ec8b80be17a14d1caf666636283749d0', 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'hostId': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:47.995 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'name': 'tempest-TestGettingAddress-server-1725294125', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0111c22b4b954ea586ca20d91ed3970f', 'user_id': '31ac7b05b012433b89143dc9f259644a', 'hostId': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:47.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:47.999 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b / tapc7326683-ae inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:42:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:47.999 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.002 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for eddd5f9b-2b98-4c39-9b31-21ef5bfe464a / tapda9acc55-51 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.002 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7c51386-c0a7-49d1-9b14-8fe9aa09e2ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:47.996686', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '002693b2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': 'd5281bfdbd7d9d785f35351a4f1bf494bd42bf5ab37a3743538ad9c2118d1568'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:47.996686', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '0027014e-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': '71c3998d2e3d059cc4d6fa26c5adcd8bd95743061784d33395e6fa6352a00af7'}]}, 'timestamp': '2025-11-29 07:42:48.003173', '_unique_id': 'cf88ae8a625b4c3ebba1a8a19fcb82e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.005 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.007 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.007 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'daf9ec73-1da8-4652-8ef0-742f31b08796', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:48.007555', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '0027b9f4-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': '7bf6312b7065b5ec72293af155efbc7eace7c2b2e9b22d90ef2b09156c6905c2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:48.007555', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '0027c656-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': '564c13456ef97233b31fbe78f0304ab65a2eaea51009acf277d9b1990c1c7579'}]}, 'timestamp': '2025-11-29 07:42:48.008172', '_unique_id': '6a444d2f25604499a66a60c815305f59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.009 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.026 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.045 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/cpu volume: 12590000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d4f1748-1429-4e29-95ef-cf8d47b8a669', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'timestamp': '2025-11-29T07:42:48.009964', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '002ab212-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.961368611, 'message_signature': '3d3912df0f797aa30dd9c675e8c11900dd34db96b275b2a4fc5a858fa385efc3'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12590000000, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'timestamp': '2025-11-29T07:42:48.009964', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '002d8f0a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.979855447, 'message_signature': '09f6ba5c2be2d7474f38e87dcce619be3d2585a44bad6ac0d2be46e3bab563c0'}]}, 'timestamp': '2025-11-29 07:42:48.046258', '_unique_id': '900a836c171f47bfb211dcae1594e28e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.048 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.079 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.080 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.105 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.read.bytes volume: 30042624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.106 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5148984f-9892-4550-a463-a4ae74abfad9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-vda', 'timestamp': '2025-11-29T07:42:48.049016', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0032d456-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '5ddf2e077a144cddb91e5fbbc9dd6c7b0495e75da25ab3f7b3821728f965b96b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-sda', 'timestamp': '2025-11-29T07:42:48.049016', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0032f134-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '4abcab4a7aa967f598c6477c3c0e757511e4d4bffd580cad69c38fddd881bf13'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30042624, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-vda', 'timestamp': '2025-11-29T07:42:48.049016', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0036bbc0-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': '3f0fd44ef24ef7abe71d8cef4e0886bba33082a940c54e6620af8bb20c923fef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-sda', 'timestamp': '2025-11-29T07:42:48.049016', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0036cab6-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': '691741824bd4b77730fc9e2ebe0fc4e11decdc6598fdca0c6cbf2727609a5a9c'}]}, 'timestamp': '2025-11-29 07:42:48.106602', '_unique_id': '3cb44fec3d4a42ed809bd7c5ac0eaeba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.109 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.109 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78d8f572-6863-41fb-9de0-25907acd3de7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:48.109096', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '00373b18-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': 'e2892dc44d99494718a6113263add655fbed70a2f233ab5b2b754b1e0e5b429f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:48.109096', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '00374766-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': 'd41b69d81662f127cd01a3ffd8f5f53666d9c1d08a0f7839ff54015cfe52eb03'}]}, 'timestamp': '2025-11-29 07:42:48.109790', '_unique_id': '954d35f60b2f4794b9c90596e8ae915b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.111 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.111 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.111 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.write.requests volume: 334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.111 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3a519ca-e332-4a38-84d4-72e5538aa5e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-vda', 'timestamp': '2025-11-29T07:42:48.111057', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003784ce-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '5b81faa8d88f930e1dabad6199d08b26d099456dd41d595716547c9468191acc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': 
None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-sda', 'timestamp': '2025-11-29T07:42:48.111057', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003791da-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '1992b81b5ee6091f3ecf319f6b1acf7526f88403bc9c3116eeac65c164331e56'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 334, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-vda', 'timestamp': '2025-11-29T07:42:48.111057', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00379acc-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': 'aa22a590fc212603a5759e4738ad6a5b32b99a4ddd4cbd389228e2bf8c3c6a0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-sda', 'timestamp': '2025-11-29T07:42:48.111057', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0037a378-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': '651973c97531976fe486ace82290f04aad98fc0ed37b8d5fdf7908f5f99de2cc'}]}, 'timestamp': '2025-11-29 07:42:48.112124', '_unique_id': 'fd13b1e13d8d4438ab8221dcb6ffdef2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.122 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.123 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.130 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.131 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f6979ea-1066-48b6-9d0e-5bcdbb5c817e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-vda', 'timestamp': '2025-11-29T07:42:48.113452', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00394d72-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.04818219, 'message_signature': 'dbb18ab69c200177bd3ec9e68591de0b92d16ebb060f73b77fa3b921fac0795e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-sda', 'timestamp': '2025-11-29T07:42:48.113452', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00395a1a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.04818219, 'message_signature': '713649d651bdcfe227b2e8b8e668f379bca8b8067acfc5551c6a7776e3aede24'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-vda', 'timestamp': '2025-11-29T07:42:48.113452', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003a8b1a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.058073625, 'message_signature': '4cd95d2c57f4bf7c6f8913569e4097d3c0156ba7a760fbb19550238e02619698'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-sda', 'timestamp': '2025-11-29T07:42:48.113452', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003a9538-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.058073625, 'message_signature': '64b0a5d77cc36974d4738109b6b4b7b4f3e4b6c46e98337bc44fd83409783bfc'}]}, 'timestamp': '2025-11-29 07:42:48.131459', '_unique_id': '081130c64d5748f79f6dda92d38c60bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.133 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.133 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1763279634>, <NovaLikeServer: tempest-TestGettingAddress-server-1725294125>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1763279634>, <NovaLikeServer: tempest-TestGettingAddress-server-1725294125>]
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.134 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.134 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.134 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.read.latency volume: 196182513 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.134 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.read.latency volume: 19847545 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc388aa1-5e1e-40c0-98ab-8d3aa8b30b01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-vda', 'timestamp': '2025-11-29T07:42:48.134246', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003b0da6-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '37700cc8f28e221474309e49a99111e511ed67659e64d2dcdac8b392a5a6cea9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 
'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-sda', 'timestamp': '2025-11-29T07:42:48.134246', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003b171a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': 'dd8e80f7d5b2afb97f43867c853be314114252db77a822b778260086a01cc646'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 196182513, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-vda', 'timestamp': '2025-11-29T07:42:48.134246', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003b1f08-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': 'cf7edeee7032b523ddd47cd2574789612f4cd30b1b3837ed300793078a140b64'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19847545, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-sda', 'timestamp': '2025-11-29T07:42:48.134246', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003b2688-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': '2b07594637f3e2a8665a4872e4d6318e0bd82510141948c1da090178b92d6392'}]}, 'timestamp': '2025-11-29 07:42:48.135120', '_unique_id': '385dc429b81c43339df448e4cb956986'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.136 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.136 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b: ceilometer.compute.pollsters.NoVolumeException
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.136 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/memory.usage volume: 42.45703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a733ac45-ab3b-44a1-b2bf-243980e81aa6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.45703125, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'timestamp': '2025-11-29T07:42:48.136360', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '003b66a2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.979855447, 'message_signature': '253b55496743772be9c2ffcc90b98fe80507f7078c638f2ae7efe91d44334421'}]}, 'timestamp': '2025-11-29 07:42:48.136773', '_unique_id': '00471514b3b04880910ce1094dcefadc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.138 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.138 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.138 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.write.latency volume: 39883658901 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.138 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27d54442-96a2-4397-b639-1599e54bb8af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-vda', 'timestamp': '2025-11-29T07:42:48.138030', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003ba07c-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': 'bc90869ca86d78131c6f549a40bad0ae30be7e8a7d81080b30d42523562598c5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 
'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-sda', 'timestamp': '2025-11-29T07:42:48.138030', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003ba8ec-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '91c1d2e556a2142883333edbd9439cea780fb758bf81566cd00f6e906119134d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39883658901, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-vda', 'timestamp': '2025-11-29T07:42:48.138030', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003bb206-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': '78733801ee8aac501594a9ae6569d95d974f4a9aadbfc5be4b6ddcd0f1c318ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-sda', 'timestamp': '2025-11-29T07:42:48.138030', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003bb99a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': 'a4d56d5b14072274eaa73d277e925f8b6f1464b1b2dc708ee5fba3a86c47d06a'}]}, 'timestamp': '2025-11-29 07:42:48.138887', '_unique_id': 'd04cbafb8c1c4ed99a9b5f5d07583b61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.140 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.140 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.outgoing.bytes volume: 30188 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '787fad63-5209-4f07-b6e0-543bb17cf2d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:48.140059', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '003bf004-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': '73a21e56f42873ffbbacaeb08d0777aebace55e9c399cc5d2136ec89b21303fb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30188, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:48.140059', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '003bfbda-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': '2221fdcb04c3f013373ad7a12361ffb8d504257bf4dab04667725f2325fca258'}]}, 'timestamp': '2025-11-29 07:42:48.140609', '_unique_id': 'f1f956b6d6d84b30a5efdd6e1035ef4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.141 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.142 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.142 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.write.bytes volume: 73121792 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.142 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4a018d0-03b4-490b-b364-0ea26a1ccbc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-vda', 'timestamp': '2025-11-29T07:42:48.141805', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003c33f2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '0a5782d38c71f30adf59a586f4663b2cf6bc995e978bcab158b8a58fe4594d5f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-sda', 'timestamp': '2025-11-29T07:42:48.141805', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003c3bb8-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '2a4139ab502ad60c894c4fe9f2c14add83d4d70ccb6d374e543b28f8d8722a3d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73121792, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-vda', 'timestamp': '2025-11-29T07:42:48.141805', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003c4342-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': '246a6999aaff83feca74798f327ff7b92e3597527bada9f2d42f66743036a4f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-sda', 'timestamp': '2025-11-29T07:42:48.141805', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003c4bbc-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': '64d8037b0fa45635e11a23e2566aaeef68dfc337ef0ca7665f7876e086530cd3'}]}, 'timestamp': '2025-11-29 07:42:48.142642', '_unique_id': 'cf0d162c326a406a982633b41020584b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.143 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.144 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.144 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.144 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f689f6c-8c48-4560-a3a4-746dd117fe91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-vda', 'timestamp': '2025-11-29T07:42:48.143851', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003c83c0-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.04818219, 'message_signature': '1f237c8afe56032ba22bf5c7dcf29b530e52ed964a05b3f0a04cc6fac1860c50'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-sda', 'timestamp': '2025-11-29T07:42:48.143851', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003c8b68-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.04818219, 'message_signature': '4d75d541f377a67100c2cc68e3d377b6288d8b62595e3e503749463ab31f610a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-vda', 'timestamp': '2025-11-29T07:42:48.143851', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003c92f2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.058073625, 'message_signature': 'eb7a2faff47057e234c3e0433339b2f83cbcc553441823bf03843d68f2ff9846'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-sda', 'timestamp': '2025-11-29T07:42:48.143851', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003c9b62-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.058073625, 'message_signature': '45792b87d4ac313744655d411c74458ddefa73b7b916e285895e335af1a106dc'}]}, 'timestamp': '2025-11-29 07:42:48.144696', '_unique_id': '597d631c564e4f6f81a5b88ec927ca93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.145 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1763279634>, <NovaLikeServer: tempest-TestGettingAddress-server-1725294125>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1763279634>, <NovaLikeServer: tempest-TestGettingAddress-server-1725294125>]
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.146 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.146 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f110c3b6-8b9a-4645-86a8-55b45aa8eb4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:48.146164', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '003cde06-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': '30d574eb67ea02b3c274fb4bfe56c0cae5e5f8b7aa76f8133c86b8defaa55394'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:48.146164', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '003ce73e-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': '1c88cf3828ca03be6f054822da4353b3444cd0c32e444ca309201206060fe18a'}]}, 'timestamp': '2025-11-29 07:42:48.146617', '_unique_id': '21e93e0afa3044e9a820dc9d17bbb569'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.147 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3392f811-aba4-4e7f-a03d-13fb2031c860', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:48.147746', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '003d1d12-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': 'd977710b616f3d7dd17249945e2ebcd06ed8097f4b4afd057bf7616fb39f76ef'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:48.147746', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '003d2546-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': 'cc97d34bc75f8f79c577ccf1fed75fc8d274f5c5edcde1a59acf89ede89ccd63'}]}, 'timestamp': '2025-11-29 07:42:48.148205', '_unique_id': '2eb0a3ccaf8c4929b5cd1d2b8b86b81b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.149 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.149 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.149 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.read.requests volume: 1077 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.149 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b70f2717-1fb8-4eec-b684-d82e58bdc425', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-vda', 'timestamp': '2025-11-29T07:42:48.149324', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003d5a5c-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '04eb31a63ec7eb13c08d52e699d5d2553ee98ed010c663f29b3683dfbed1d2e1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 
'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-sda', 'timestamp': '2025-11-29T07:42:48.149324', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003d6240-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.983725061, 'message_signature': '71a09ae4fdc69b6fade0ca917ed2b7e6b5729c1fc2c51b81e9c8805f42b60dfc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1077, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-vda', 'timestamp': '2025-11-29T07:42:48.149324', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003d6b1e-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': '787ea15bd3400d3d983e8bcf8ab613be249af9f4108aa3b011fb1ac9bb73ca18'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-sda', 'timestamp': '2025-11-29T07:42:48.149324', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003d73a2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.016217712, 'message_signature': 'e78b3bb09bae13ad2c5e0c3921eb7f3438b47b3cec6509b2ba7e878e0fded63e'}]}, 'timestamp': '2025-11-29 07:42:48.150204', '_unique_id': '2195f23662634770a758af585474274b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.151 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.151 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1763279634>, <NovaLikeServer: tempest-TestGettingAddress-server-1725294125>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1763279634>, <NovaLikeServer: tempest-TestGettingAddress-server-1725294125>]
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.151 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.151 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1763279634>, <NovaLikeServer: tempest-TestGettingAddress-server-1725294125>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1763279634>, <NovaLikeServer: tempest-TestGettingAddress-server-1725294125>]
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.152 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.152 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.incoming.bytes volume: 31747 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63e1bdc9-e6b3-4261-98f9-ed2c7cbdd3b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:48.152068', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '003dc500-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': '218e3edd7459ad2fa659138f444ab174875a5e265e560f4af23cbb88da572809'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31747, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:48.152068', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '003dceb0-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': 'a5e67d8afefca89f7889e7ede1da49560d7646fe2cc8c8a9a3e2ad334089d580'}]}, 'timestamp': '2025-11-29 07:42:48.152550', '_unique_id': 'b861817c68784e6180a1564ed3f8dccd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.153 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.incoming.packets volume: 186 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b925f65-77be-45b2-84b0-b6e20db66093', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:48.153729', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '003e06a0-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': '8ef7e652d2a4a14555aa35eebc9bc14a6052db7b0dabb325d8f2459349943ca4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 186, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:48.153729', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '003e0efc-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': '99060193f4075e935d0ddb72c0d68e842d0c7e72f43793f3979d933293416229'}]}, 'timestamp': '2025-11-29 07:42:48.154202', '_unique_id': '25d552669cfc4c4a9765e50dbcf33258'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.155 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.155 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.155 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.155 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16b75457-cdd2-4a50-bb6f-486762f0f8ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-vda', 'timestamp': '2025-11-29T07:42:48.155354', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003e4660-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.04818219, 'message_signature': 'b333ee56e2797933c22238b674633cced0319d021539f570b9def7ac5802380a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-sda', 'timestamp': '2025-11-29T07:42:48.155354', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'instance-000000a1', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003e4e12-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.04818219, 'message_signature': '25372a0cdb43e90dea6e2a3bb672f5290b9d9841428185a107408b68e0420e72'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-vda', 'timestamp': '2025-11-29T07:42:48.155354', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003e5560-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.058073625, 'message_signature': '19fc915ea130fc4e31025886d2fedf6c97b40a354e53dcc8d7fd31e73a5adc87'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-sda', 'timestamp': '2025-11-29T07:42:48.155354', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'instance-0000009e', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003e5cc2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7631.058073625, 'message_signature': 'b35211a0d2fc7a6ac8d3616e9ef6d7248ff9c163555724a29b467a809cc1a35c'}]}, 'timestamp': '2025-11-29 07:42:48.156174', '_unique_id': 'b976eaf5f53b4bfc98d5f9a684d5f718'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.157 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.157 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b15fecb-b18a-4556-922b-bca535f1658b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:48.157483', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '003e9890-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': '6f832aceddc9f3feab7ff7dc3788c208b499a22b6f18e920df5d8db0b4dfa241'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:48.157483', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '003ea1f0-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': 'ab235cf04900b0dda608569ac9b62210456f7c9ea2f38d7a40219efeecb047ec'}]}, 'timestamp': '2025-11-29 07:42:48.157951', '_unique_id': 'e2bddd9885814b249319a49e2bdf8d5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.159 12 DEBUG ceilometer.compute.pollsters [-] 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.159 12 DEBUG ceilometer.compute.pollsters [-] eddd5f9b-2b98-4c39-9b31-21ef5bfe464a/network.outgoing.packets volume: 189 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5872b80f-d13e-4ea5-ba56-5b993a882fbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a1-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-tapc7326683-ae', 'timestamp': '2025-11-29T07:42:48.159149', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1763279634', 'name': 'tapc7326683-ae', 'instance_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c8:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7326683-ae'}, 'message_id': '003ed9fe-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.931394996, 'message_signature': '2c36d6be948920d6e090ebfad9e8ea06c60a9d7da3d75c677f611c316c21b9bf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 189, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009e-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-tapda9acc55-51', 'timestamp': '2025-11-29T07:42:48.159149', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1725294125', 'name': 'tapda9acc55-51', 'instance_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:aa:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapda9acc55-51'}, 'message_id': '003ee53e-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7630.935194058, 'message_signature': 'fdc9d9be8b79206c4f720900157e32cc68f585e2b8f6e233099a7bc64038ea63'}]}, 'timestamp': '2025-11-29 07:42:48.159677', '_unique_id': 'febe57aa6f8b4273aa113b45591b8dd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:42:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539504 systemd[1]: Started libpod-conmon-d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2.scope.
Nov 29 02:42:48 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:42:48 np0005539504 podman[246029]: 2025-11-29 07:42:48.485675364 +0000 UTC m=+0.822644553 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:42:48 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d6972f5f8833ca0206e892d7de71faa01fa6db77517474d4ecdd3ca57f93ffa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:42:48 np0005539504 podman[246030]: 2025-11-29 07:42:48.489067935 +0000 UTC m=+0.826901337 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:42:48 np0005539504 podman[246031]: 2025-11-29 07:42:48.552565988 +0000 UTC m=+0.881542152 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 02:42:49 np0005539504 nova_compute[187152]: 2025-11-29 07:42:49.112 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:42:49 np0005539504 nova_compute[187152]: 2025-11-29 07:42:49.120 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:42:49 np0005539504 podman[246015]: 2025-11-29 07:42:49.133810833 +0000 UTC m=+2.330129238 container init d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 02:42:49 np0005539504 podman[246015]: 2025-11-29 07:42:49.145719473 +0000 UTC m=+2.342037808 container start d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:42:49 np0005539504 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[246070]: [NOTICE]   (246096) : New worker (246098) forked
Nov 29 02:42:49 np0005539504 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[246070]: [NOTICE]   (246096) : Loading success.
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.018 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.319 187156 DEBUG nova.compute.manager [req-b2d4acab-5b70-437c-9889-b7b15ee1123b req-4a2c0879-6523-4e59-9bff-f730597166fb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Received event network-vif-plugged-c7326683-ae94-4050-a985-768563f895f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.319 187156 DEBUG oslo_concurrency.lockutils [req-b2d4acab-5b70-437c-9889-b7b15ee1123b req-4a2c0879-6523-4e59-9bff-f730597166fb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.320 187156 DEBUG oslo_concurrency.lockutils [req-b2d4acab-5b70-437c-9889-b7b15ee1123b req-4a2c0879-6523-4e59-9bff-f730597166fb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.320 187156 DEBUG oslo_concurrency.lockutils [req-b2d4acab-5b70-437c-9889-b7b15ee1123b req-4a2c0879-6523-4e59-9bff-f730597166fb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.320 187156 DEBUG nova.compute.manager [req-b2d4acab-5b70-437c-9889-b7b15ee1123b req-4a2c0879-6523-4e59-9bff-f730597166fb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Processing event network-vif-plugged-c7326683-ae94-4050-a985-768563f895f1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.321 187156 DEBUG nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.326 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.332 187156 INFO nova.virt.libvirt.driver [-] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Instance spawned successfully.#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.333 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.427 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.427 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402170.325987, 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.428 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.889 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.890 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.890 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.891 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.891 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:42:50 np0005539504 nova_compute[187152]: 2025-11-29 07:42:50.892 187156 DEBUG nova.virt.libvirt.driver [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:42:51 np0005539504 nova_compute[187152]: 2025-11-29 07:42:51.416 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:42:51 np0005539504 nova_compute[187152]: 2025-11-29 07:42:51.421 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:42:51 np0005539504 nova_compute[187152]: 2025-11-29 07:42:51.521 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:42:51 np0005539504 nova_compute[187152]: 2025-11-29 07:42:51.593 187156 INFO nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Took 16.20 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:42:51 np0005539504 nova_compute[187152]: 2025-11-29 07:42:51.594 187156 DEBUG nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:42:51 np0005539504 podman[246107]: 2025-11-29 07:42:51.723764522 +0000 UTC m=+0.065631752 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:42:51 np0005539504 podman[246108]: 2025-11-29 07:42:51.784887172 +0000 UTC m=+0.118870881 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 29 02:42:52 np0005539504 nova_compute[187152]: 2025-11-29 07:42:52.088 187156 INFO nova.compute.manager [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Took 17.15 seconds to build instance.#033[00m
Nov 29 02:42:52 np0005539504 nova_compute[187152]: 2025-11-29 07:42:52.127 187156 DEBUG oslo_concurrency.lockutils [None req-65f3fcf3-76f3-4298-8d06-ea479aad2c0d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:52 np0005539504 nova_compute[187152]: 2025-11-29 07:42:52.401 187156 DEBUG nova.compute.manager [req-b244e392-54cf-4338-88ba-a18b1e7b159e req-4462ccbf-c93d-4497-8324-1bbb501ba570 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Received event network-vif-plugged-c7326683-ae94-4050-a985-768563f895f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:52 np0005539504 nova_compute[187152]: 2025-11-29 07:42:52.401 187156 DEBUG oslo_concurrency.lockutils [req-b244e392-54cf-4338-88ba-a18b1e7b159e req-4462ccbf-c93d-4497-8324-1bbb501ba570 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:52 np0005539504 nova_compute[187152]: 2025-11-29 07:42:52.402 187156 DEBUG oslo_concurrency.lockutils [req-b244e392-54cf-4338-88ba-a18b1e7b159e req-4462ccbf-c93d-4497-8324-1bbb501ba570 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:52 np0005539504 nova_compute[187152]: 2025-11-29 07:42:52.402 187156 DEBUG oslo_concurrency.lockutils [req-b244e392-54cf-4338-88ba-a18b1e7b159e req-4462ccbf-c93d-4497-8324-1bbb501ba570 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:52 np0005539504 nova_compute[187152]: 2025-11-29 07:42:52.402 187156 DEBUG nova.compute.manager [req-b244e392-54cf-4338-88ba-a18b1e7b159e req-4462ccbf-c93d-4497-8324-1bbb501ba570 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] No waiting events found dispatching network-vif-plugged-c7326683-ae94-4050-a985-768563f895f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:42:52 np0005539504 nova_compute[187152]: 2025-11-29 07:42:52.402 187156 WARNING nova.compute.manager [req-b244e392-54cf-4338-88ba-a18b1e7b159e req-4462ccbf-c93d-4497-8324-1bbb501ba570 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Received unexpected event network-vif-plugged-c7326683-ae94-4050-a985-768563f895f1 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:42:52 np0005539504 nova_compute[187152]: 2025-11-29 07:42:52.640 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.705 187156 DEBUG oslo_concurrency.lockutils [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.706 187156 DEBUG oslo_concurrency.lockutils [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.707 187156 DEBUG oslo_concurrency.lockutils [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.708 187156 DEBUG oslo_concurrency.lockutils [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.708 187156 DEBUG oslo_concurrency.lockutils [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.728 187156 INFO nova.compute.manager [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Terminating instance#033[00m
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.743 187156 DEBUG nova.compute.manager [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:42:54 np0005539504 kernel: tapda9acc55-51 (unregistering): left promiscuous mode
Nov 29 02:42:54 np0005539504 NetworkManager[55210]: <info>  [1764402174.7743] device (tapda9acc55-51): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:42:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:42:54Z|00653|binding|INFO|Releasing lport da9acc55-51b1-44ba-b281-8871c07a7c33 from this chassis (sb_readonly=0)
Nov 29 02:42:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:42:54Z|00654|binding|INFO|Setting lport da9acc55-51b1-44ba-b281-8871c07a7c33 down in Southbound
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.793 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:42:54Z|00655|binding|INFO|Removing iface tapda9acc55-51 ovn-installed in OVS
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.799 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:54.807 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:aa:27 10.100.0.6 2001:db8:0:1:f816:3eff:fe27:aa27 2001:db8::f816:3eff:fe27:aa27'], port_security=['fa:16:3e:27:aa:27 10.100.0.6 2001:db8:0:1:f816:3eff:fe27:aa27 2001:db8::f816:3eff:fe27:aa27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8:0:1:f816:3eff:fe27:aa27/64 2001:db8::f816:3eff:fe27:aa27/64', 'neutron:device_id': 'eddd5f9b-2b98-4c39-9b31-21ef5bfe464a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e2af89ad-a80e-4dc1-aa45-ab6ce3534b4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=da9acc55-51b1-44ba-b281-8871c07a7c33) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:42:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:54.809 104164 INFO neutron.agent.ovn.metadata.agent [-] Port da9acc55-51b1-44ba-b281-8871c07a7c33 in datapath 600edac6-24aa-414f-b977-07c2890470f1 unbound from our chassis#033[00m
Nov 29 02:42:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:54.811 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 600edac6-24aa-414f-b977-07c2890470f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:42:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:54.815 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3c12d1ad-8115-46b3-bcd1-925ee3d3bd48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:54 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:54.816 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 namespace which is not needed anymore#033[00m
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.818 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:54 np0005539504 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d0000009e.scope: Deactivated successfully.
Nov 29 02:42:54 np0005539504 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d0000009e.scope: Consumed 18.014s CPU time.
Nov 29 02:42:54 np0005539504 systemd-machined[153423]: Machine qemu-83-instance-0000009e terminated.
Nov 29 02:42:54 np0005539504 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[245379]: [NOTICE]   (245383) : haproxy version is 2.8.14-c23fe91
Nov 29 02:42:54 np0005539504 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[245379]: [NOTICE]   (245383) : path to executable is /usr/sbin/haproxy
Nov 29 02:42:54 np0005539504 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[245379]: [WARNING]  (245383) : Exiting Master process...
Nov 29 02:42:54 np0005539504 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[245379]: [WARNING]  (245383) : Exiting Master process...
Nov 29 02:42:54 np0005539504 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[245379]: [ALERT]    (245383) : Current worker (245385) exited with code 143 (Terminated)
Nov 29 02:42:54 np0005539504 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[245379]: [WARNING]  (245383) : All workers exited. Exiting... (0)
Nov 29 02:42:54 np0005539504 systemd[1]: libpod-3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f.scope: Deactivated successfully.
Nov 29 02:42:54 np0005539504 podman[246178]: 2025-11-29 07:42:54.966273598 +0000 UTC m=+0.049764465 container died 3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.968 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:54 np0005539504 nova_compute[187152]: 2025-11-29 07:42:54.974 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:54 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f-userdata-shm.mount: Deactivated successfully.
Nov 29 02:42:54 np0005539504 systemd[1]: var-lib-containers-storage-overlay-b2cfba1bf143536a5018c5f4c5e2519be9e0036642477f233cae602f240298f4-merged.mount: Deactivated successfully.
Nov 29 02:42:55 np0005539504 podman[246178]: 2025-11-29 07:42:55.007934976 +0000 UTC m=+0.091425843 container cleanup 3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:42:55 np0005539504 systemd[1]: libpod-conmon-3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f.scope: Deactivated successfully.
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.020 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.033 187156 INFO nova.virt.libvirt.driver [-] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Instance destroyed successfully.#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.034 187156 DEBUG nova.objects.instance [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid eddd5f9b-2b98-4c39-9b31-21ef5bfe464a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.047 187156 DEBUG nova.virt.libvirt.vif [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:40:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1725294125',display_name='tempest-TestGettingAddress-server-1725294125',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1725294125',id=158,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD3d0ZsvCuIHNmZM7lmf14lwcU9LYA+YzS+DsUoU/RUNt3FNYs43WlwoA0reTsUUFEVQa4lagWavf3wAARjW0IrVdX6QLhMZ1dtoKB8yeTuH2S9PjazhePg7oe9bdCIqjQ==',key_name='tempest-TestGettingAddress-1639164498',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:41:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-7wudwqxz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:41:10Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=eddd5f9b-2b98-4c39-9b31-21ef5bfe464a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.047 187156 DEBUG nova.network.os_vif_util [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.048 187156 DEBUG nova.network.os_vif_util [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:aa:27,bridge_name='br-int',has_traffic_filtering=True,id=da9acc55-51b1-44ba-b281-8871c07a7c33,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda9acc55-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.049 187156 DEBUG os_vif [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:aa:27,bridge_name='br-int',has_traffic_filtering=True,id=da9acc55-51b1-44ba-b281-8871c07a7c33,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda9acc55-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.052 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.052 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda9acc55-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.058 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.062 187156 INFO os_vif [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:aa:27,bridge_name='br-int',has_traffic_filtering=True,id=da9acc55-51b1-44ba-b281-8871c07a7c33,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda9acc55-51')#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.063 187156 INFO nova.virt.libvirt.driver [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Deleting instance files /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a_del#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.064 187156 INFO nova.virt.libvirt.driver [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Deletion of /var/lib/nova/instances/eddd5f9b-2b98-4c39-9b31-21ef5bfe464a_del complete#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.071 187156 DEBUG nova.compute.manager [req-1637319e-0f45-4647-ab20-f3f2723af62d req-7120ebb1-61ef-4028-81ee-ada72beb9d70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received event network-changed-da9acc55-51b1-44ba-b281-8871c07a7c33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.071 187156 DEBUG nova.compute.manager [req-1637319e-0f45-4647-ab20-f3f2723af62d req-7120ebb1-61ef-4028-81ee-ada72beb9d70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Refreshing instance network info cache due to event network-changed-da9acc55-51b1-44ba-b281-8871c07a7c33. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.071 187156 DEBUG oslo_concurrency.lockutils [req-1637319e-0f45-4647-ab20-f3f2723af62d req-7120ebb1-61ef-4028-81ee-ada72beb9d70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.072 187156 DEBUG oslo_concurrency.lockutils [req-1637319e-0f45-4647-ab20-f3f2723af62d req-7120ebb1-61ef-4028-81ee-ada72beb9d70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.073 187156 DEBUG nova.network.neutron [req-1637319e-0f45-4647-ab20-f3f2723af62d req-7120ebb1-61ef-4028-81ee-ada72beb9d70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Refreshing network info cache for port da9acc55-51b1-44ba-b281-8871c07a7c33 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:42:55 np0005539504 podman[246220]: 2025-11-29 07:42:55.097307704 +0000 UTC m=+0.061220193 container remove 3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:42:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:55.104 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba171d3-7283-40d6-9bfa-1049ec145504]: (4, ('Sat Nov 29 07:42:54 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 (3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f)\n3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f\nSat Nov 29 07:42:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 (3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f)\n3e88b342395206388aec13fddda399a883363ed1bee750690fec8f285231a99f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:55.106 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7b691154-f20c-47e8-9776-e44fb7754159]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:55.107 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap600edac6-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.110 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:55 np0005539504 kernel: tap600edac6-20: left promiscuous mode
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.123 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:55.127 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[82946cd0-0413-4612-a23e-169dd25b0a37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.133 187156 INFO nova.compute.manager [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.134 187156 DEBUG oslo.service.loopingcall [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.134 187156 DEBUG nova.compute.manager [-] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:42:55 np0005539504 nova_compute[187152]: 2025-11-29 07:42:55.135 187156 DEBUG nova.network.neutron [-] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:42:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:55.144 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3bbfb4-e074-4ee2-bfee-aae78e19c14b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:55.146 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5312d2-92ab-4d02-a593-94878afa9d7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:55.167 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb1c321-f441-4ae1-8a17-63a46aff9545]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 752854, 'reachable_time': 39204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246236, 'error': None, 'target': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:55.170 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:42:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:42:55.171 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f3403f-efd1-4dde-b85a-1126132e8969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:55 np0005539504 systemd[1]: run-netns-ovnmeta\x2d600edac6\x2d24aa\x2d414f\x2db977\x2d07c2890470f1.mount: Deactivated successfully.
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.486 187156 DEBUG nova.compute.manager [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received event network-vif-unplugged-da9acc55-51b1-44ba-b281-8871c07a7c33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.487 187156 DEBUG oslo_concurrency.lockutils [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.487 187156 DEBUG oslo_concurrency.lockutils [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.488 187156 DEBUG oslo_concurrency.lockutils [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.488 187156 DEBUG nova.compute.manager [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] No waiting events found dispatching network-vif-unplugged-da9acc55-51b1-44ba-b281-8871c07a7c33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.489 187156 DEBUG nova.compute.manager [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received event network-vif-unplugged-da9acc55-51b1-44ba-b281-8871c07a7c33 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.490 187156 DEBUG nova.compute.manager [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received event network-vif-plugged-da9acc55-51b1-44ba-b281-8871c07a7c33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.490 187156 DEBUG oslo_concurrency.lockutils [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.491 187156 DEBUG oslo_concurrency.lockutils [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.491 187156 DEBUG oslo_concurrency.lockutils [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.492 187156 DEBUG nova.compute.manager [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] No waiting events found dispatching network-vif-plugged-da9acc55-51b1-44ba-b281-8871c07a7c33 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:42:56 np0005539504 nova_compute[187152]: 2025-11-29 07:42:56.492 187156 WARNING nova.compute.manager [req-9ea914f0-6229-41ab-8ef8-48bf1929d8f9 req-759a9164-431e-43ad-bf43-cf1bcc0b8a11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received unexpected event network-vif-plugged-da9acc55-51b1-44ba-b281-8871c07a7c33 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:42:57 np0005539504 nova_compute[187152]: 2025-11-29 07:42:57.642 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:57 np0005539504 nova_compute[187152]: 2025-11-29 07:42:57.981 187156 DEBUG nova.network.neutron [-] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:42:58 np0005539504 nova_compute[187152]: 2025-11-29 07:42:58.171 187156 DEBUG nova.compute.manager [req-d2edaa7e-52c3-4150-9e71-c2edd065f9f5 req-fc2798fe-7afa-4189-84cf-c3b02a91b1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Received event network-vif-deleted-da9acc55-51b1-44ba-b281-8871c07a7c33 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:58 np0005539504 nova_compute[187152]: 2025-11-29 07:42:58.172 187156 INFO nova.compute.manager [req-d2edaa7e-52c3-4150-9e71-c2edd065f9f5 req-fc2798fe-7afa-4189-84cf-c3b02a91b1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Neutron deleted interface da9acc55-51b1-44ba-b281-8871c07a7c33; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:42:58 np0005539504 nova_compute[187152]: 2025-11-29 07:42:58.173 187156 DEBUG nova.network.neutron [req-d2edaa7e-52c3-4150-9e71-c2edd065f9f5 req-fc2798fe-7afa-4189-84cf-c3b02a91b1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:42:58 np0005539504 nova_compute[187152]: 2025-11-29 07:42:58.175 187156 INFO nova.compute.manager [-] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Took 3.04 seconds to deallocate network for instance.#033[00m
Nov 29 02:42:58 np0005539504 nova_compute[187152]: 2025-11-29 07:42:58.237 187156 DEBUG nova.compute.manager [req-d2edaa7e-52c3-4150-9e71-c2edd065f9f5 req-fc2798fe-7afa-4189-84cf-c3b02a91b1b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Detach interface failed, port_id=da9acc55-51b1-44ba-b281-8871c07a7c33, reason: Instance eddd5f9b-2b98-4c39-9b31-21ef5bfe464a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:42:58 np0005539504 nova_compute[187152]: 2025-11-29 07:42:58.766 187156 DEBUG oslo_concurrency.lockutils [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:58 np0005539504 nova_compute[187152]: 2025-11-29 07:42:58.767 187156 DEBUG oslo_concurrency.lockutils [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:58 np0005539504 nova_compute[187152]: 2025-11-29 07:42:58.834 187156 DEBUG nova.compute.provider_tree [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:42:58 np0005539504 nova_compute[187152]: 2025-11-29 07:42:58.924 187156 DEBUG nova.scheduler.client.report [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:42:59 np0005539504 nova_compute[187152]: 2025-11-29 07:42:59.100 187156 DEBUG oslo_concurrency.lockutils [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:59 np0005539504 nova_compute[187152]: 2025-11-29 07:42:59.182 187156 INFO nova.scheduler.client.report [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance eddd5f9b-2b98-4c39-9b31-21ef5bfe464a#033[00m
Nov 29 02:42:59 np0005539504 nova_compute[187152]: 2025-11-29 07:42:59.615 187156 DEBUG nova.network.neutron [req-1637319e-0f45-4647-ab20-f3f2723af62d req-7120ebb1-61ef-4028-81ee-ada72beb9d70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updated VIF entry in instance network info cache for port da9acc55-51b1-44ba-b281-8871c07a7c33. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:42:59 np0005539504 nova_compute[187152]: 2025-11-29 07:42:59.616 187156 DEBUG nova.network.neutron [req-1637319e-0f45-4647-ab20-f3f2723af62d req-7120ebb1-61ef-4028-81ee-ada72beb9d70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Updating instance_info_cache with network_info: [{"id": "da9acc55-51b1-44ba-b281-8871c07a7c33", "address": "fa:16:3e:27:aa:27", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:aa27", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda9acc55-51", "ovs_interfaceid": "da9acc55-51b1-44ba-b281-8871c07a7c33", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:43:00 np0005539504 nova_compute[187152]: 2025-11-29 07:43:00.058 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:00 np0005539504 podman[246237]: 2025-11-29 07:43:00.753103429 +0000 UTC m=+0.087532049 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:43:01 np0005539504 nova_compute[187152]: 2025-11-29 07:43:01.862 187156 DEBUG oslo_concurrency.lockutils [req-1637319e-0f45-4647-ab20-f3f2723af62d req-7120ebb1-61ef-4028-81ee-ada72beb9d70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:43:01 np0005539504 nova_compute[187152]: 2025-11-29 07:43:01.868 187156 DEBUG oslo_concurrency.lockutils [None req-7e1591fc-25e3-4227-aceb-9b191add530b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "eddd5f9b-2b98-4c39-9b31-21ef5bfe464a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:43:02 np0005539504 nova_compute[187152]: 2025-11-29 07:43:02.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:05 np0005539504 nova_compute[187152]: 2025-11-29 07:43:05.061 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:43:05Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b2:c8:e4 10.100.0.39
Nov 29 02:43:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:43:05Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b2:c8:e4 10.100.0.39
Nov 29 02:43:06 np0005539504 podman[246283]: 2025-11-29 07:43:06.749851682 +0000 UTC m=+0.083237824 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:43:07 np0005539504 nova_compute[187152]: 2025-11-29 07:43:07.646 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:07.796 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:43:07 np0005539504 nova_compute[187152]: 2025-11-29 07:43:07.797 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:07 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:07.798 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:43:10 np0005539504 nova_compute[187152]: 2025-11-29 07:43:10.032 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402175.030515, eddd5f9b-2b98-4c39-9b31-21ef5bfe464a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:43:10 np0005539504 nova_compute[187152]: 2025-11-29 07:43:10.033 187156 INFO nova.compute.manager [-] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:43:10 np0005539504 nova_compute[187152]: 2025-11-29 07:43:10.057 187156 DEBUG nova.compute.manager [None req-e04341d7-950e-42f7-83ad-0489b519b536 - - - - - -] [instance: eddd5f9b-2b98-4c39-9b31-21ef5bfe464a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:43:10 np0005539504 nova_compute[187152]: 2025-11-29 07:43:10.066 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:10 np0005539504 ovn_controller[95182]: 2025-11-29T07:43:10Z|00656|binding|INFO|Releasing lport 5727c765-4dbb-4890-b58e-a90c8d5f55f2 from this chassis (sb_readonly=0)
Nov 29 02:43:11 np0005539504 nova_compute[187152]: 2025-11-29 07:43:11.003 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:11 np0005539504 ovn_controller[95182]: 2025-11-29T07:43:11Z|00657|binding|INFO|Releasing lport 5727c765-4dbb-4890-b58e-a90c8d5f55f2 from this chassis (sb_readonly=0)
Nov 29 02:43:11 np0005539504 nova_compute[187152]: 2025-11-29 07:43:11.207 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:11 np0005539504 nova_compute[187152]: 2025-11-29 07:43:11.327 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:11 np0005539504 nova_compute[187152]: 2025-11-29 07:43:11.327 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:43:11 np0005539504 nova_compute[187152]: 2025-11-29 07:43:11.344 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:43:12 np0005539504 nova_compute[187152]: 2025-11-29 07:43:12.647 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:12.801 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:43:12 np0005539504 nova_compute[187152]: 2025-11-29 07:43:12.954 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:14 np0005539504 podman[201359]: time="2025-11-29T07:43:14Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 29 02:43:14 np0005539504 podman[201359]: @ - - [29/Nov/2025:07:43:14 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 24107 "" "Go-http-client/1.1"
Nov 29 02:43:15 np0005539504 nova_compute[187152]: 2025-11-29 07:43:15.069 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:15 np0005539504 nova_compute[187152]: 2025-11-29 07:43:15.076 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:15 np0005539504 nova_compute[187152]: 2025-11-29 07:43:15.956 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:17 np0005539504 nova_compute[187152]: 2025-11-29 07:43:17.650 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:18 np0005539504 podman[246311]: 2025-11-29 07:43:18.732533027 +0000 UTC m=+0.065224650 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:43:18 np0005539504 podman[246312]: 2025-11-29 07:43:18.742259008 +0000 UTC m=+0.066425393 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 29 02:43:18 np0005539504 podman[246310]: 2025-11-29 07:43:18.755456663 +0000 UTC m=+0.092525173 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:43:20 np0005539504 nova_compute[187152]: 2025-11-29 07:43:20.074 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:20 np0005539504 nova_compute[187152]: 2025-11-29 07:43:20.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:22 np0005539504 nova_compute[187152]: 2025-11-29 07:43:22.670 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:22 np0005539504 podman[246375]: 2025-11-29 07:43:22.72366055 +0000 UTC m=+0.049063017 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:43:22 np0005539504 podman[246376]: 2025-11-29 07:43:22.796243857 +0000 UTC m=+0.114936725 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:43:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:23.481 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:43:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:23.481 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:43:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:23.482 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:43:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:24.697 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2 2001:db8::f816:3eff:fec8:31c8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec8:31c8/64', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4e85a268-4b8a-4015-a903-2252d696f8f5) old=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:43:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:24.698 104164 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4e85a268-4b8a-4015-a903-2252d696f8f5 in datapath e23e9510-a780-4254-b7f0-36040139e7db updated#033[00m
Nov 29 02:43:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:24.699 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e23e9510-a780-4254-b7f0-36040139e7db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:43:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:24.835 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50b1b2b7-29e4-4470-b468-a9ca4718d358]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:43:25 np0005539504 nova_compute[187152]: 2025-11-29 07:43:25.079 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:26 np0005539504 nova_compute[187152]: 2025-11-29 07:43:26.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:26 np0005539504 nova_compute[187152]: 2025-11-29 07:43:26.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:43:27 np0005539504 nova_compute[187152]: 2025-11-29 07:43:27.672 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:28.957 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2 2001:db8:0:1:f816:3eff:fec8:31c8 2001:db8::f816:3eff:fec8:31c8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fec8:31c8/64 2001:db8::f816:3eff:fec8:31c8/64', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4e85a268-4b8a-4015-a903-2252d696f8f5) old=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2 2001:db8::f816:3eff:fec8:31c8'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec8:31c8/64', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:43:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:28.958 104164 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4e85a268-4b8a-4015-a903-2252d696f8f5 in datapath e23e9510-a780-4254-b7f0-36040139e7db updated#033[00m
Nov 29 02:43:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:28.960 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e23e9510-a780-4254-b7f0-36040139e7db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:43:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:43:28.961 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7c9cf9-0c74-4bb2-b54a-ea642bf35d5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:43:29 np0005539504 nova_compute[187152]: 2025-11-29 07:43:29.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:29 np0005539504 nova_compute[187152]: 2025-11-29 07:43:29.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:29 np0005539504 nova_compute[187152]: 2025-11-29 07:43:29.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:43:29 np0005539504 nova_compute[187152]: 2025-11-29 07:43:29.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:43:29 np0005539504 nova_compute[187152]: 2025-11-29 07:43:29.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:43:29 np0005539504 nova_compute[187152]: 2025-11-29 07:43:29.962 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.029 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.083 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.089 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.090 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.150 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.286 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.287 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5543MB free_disk=72.97756958007812GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.288 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.288 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.349 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.350 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.350 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.389 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.408 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.427 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:43:30 np0005539504 nova_compute[187152]: 2025-11-29 07:43:30.427 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:43:31 np0005539504 nova_compute[187152]: 2025-11-29 07:43:31.428 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:31 np0005539504 nova_compute[187152]: 2025-11-29 07:43:31.429 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:43:31 np0005539504 nova_compute[187152]: 2025-11-29 07:43:31.429 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:43:31 np0005539504 nova_compute[187152]: 2025-11-29 07:43:31.665 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:43:31 np0005539504 nova_compute[187152]: 2025-11-29 07:43:31.665 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:43:31 np0005539504 nova_compute[187152]: 2025-11-29 07:43:31.665 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:43:31 np0005539504 nova_compute[187152]: 2025-11-29 07:43:31.665 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:43:31 np0005539504 podman[246434]: 2025-11-29 07:43:31.74158787 +0000 UTC m=+0.085763572 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:43:32 np0005539504 nova_compute[187152]: 2025-11-29 07:43:32.675 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:34 np0005539504 nova_compute[187152]: 2025-11-29 07:43:34.059 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Updating instance_info_cache with network_info: [{"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:43:34 np0005539504 nova_compute[187152]: 2025-11-29 07:43:34.108 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:43:34 np0005539504 nova_compute[187152]: 2025-11-29 07:43:34.109 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:43:34 np0005539504 nova_compute[187152]: 2025-11-29 07:43:34.109 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:35 np0005539504 nova_compute[187152]: 2025-11-29 07:43:35.088 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:37 np0005539504 nova_compute[187152]: 2025-11-29 07:43:37.676 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:37 np0005539504 podman[246455]: 2025-11-29 07:43:37.746664386 +0000 UTC m=+0.082207936 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 02:43:40 np0005539504 nova_compute[187152]: 2025-11-29 07:43:40.092 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:41 np0005539504 nova_compute[187152]: 2025-11-29 07:43:41.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:42 np0005539504 nova_compute[187152]: 2025-11-29 07:43:42.678 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:45 np0005539504 nova_compute[187152]: 2025-11-29 07:43:45.096 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:47 np0005539504 nova_compute[187152]: 2025-11-29 07:43:47.680 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:49 np0005539504 podman[246477]: 2025-11-29 07:43:49.802608798 +0000 UTC m=+0.136141143 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:43:49 np0005539504 podman[246478]: 2025-11-29 07:43:49.806409461 +0000 UTC m=+0.140412349 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:43:49 np0005539504 podman[246479]: 2025-11-29 07:43:49.810961753 +0000 UTC m=+0.134823108 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 29 02:43:50 np0005539504 nova_compute[187152]: 2025-11-29 07:43:50.101 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:51 np0005539504 nova_compute[187152]: 2025-11-29 07:43:51.052 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:51 np0005539504 nova_compute[187152]: 2025-11-29 07:43:51.078 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Triggering sync for uuid 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:43:51 np0005539504 nova_compute[187152]: 2025-11-29 07:43:51.079 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:43:51 np0005539504 nova_compute[187152]: 2025-11-29 07:43:51.080 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:43:51 np0005539504 nova_compute[187152]: 2025-11-29 07:43:51.102 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:43:52 np0005539504 nova_compute[187152]: 2025-11-29 07:43:52.685 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:53 np0005539504 podman[246542]: 2025-11-29 07:43:53.722944701 +0000 UTC m=+0.056611300 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:43:53 np0005539504 podman[246543]: 2025-11-29 07:43:53.80524901 +0000 UTC m=+0.130781091 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:43:55 np0005539504 nova_compute[187152]: 2025-11-29 07:43:55.165 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:57 np0005539504 nova_compute[187152]: 2025-11-29 07:43:57.735 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:00 np0005539504 nova_compute[187152]: 2025-11-29 07:44:00.178 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:02 np0005539504 podman[246593]: 2025-11-29 07:44:02.716251341 +0000 UTC m=+0.059905979 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:44:02 np0005539504 nova_compute[187152]: 2025-11-29 07:44:02.739 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.818 187156 DEBUG oslo_concurrency.lockutils [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.818 187156 DEBUG oslo_concurrency.lockutils [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.819 187156 DEBUG oslo_concurrency.lockutils [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.819 187156 DEBUG oslo_concurrency.lockutils [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.820 187156 DEBUG oslo_concurrency.lockutils [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.837 187156 INFO nova.compute.manager [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Terminating instance#033[00m
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.850 187156 DEBUG nova.compute.manager [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:44:03 np0005539504 kernel: tapc7326683-ae (unregistering): left promiscuous mode
Nov 29 02:44:03 np0005539504 NetworkManager[55210]: <info>  [1764402243.8801] device (tapc7326683-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.885 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:03Z|00658|binding|INFO|Releasing lport c7326683-ae94-4050-a985-768563f895f1 from this chassis (sb_readonly=0)
Nov 29 02:44:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:03Z|00659|binding|INFO|Setting lport c7326683-ae94-4050-a985-768563f895f1 down in Southbound
Nov 29 02:44:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:03Z|00660|binding|INFO|Removing iface tapc7326683-ae ovn-installed in OVS
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.889 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:03.895 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:c8:e4 10.100.0.39'], port_security=['fa:16:3e:b2:c8:e4 10.100.0.39'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.39/28', 'neutron:device_id': '5b1eba03-8be0-4f33-a4d6-8c0751ddd10b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a4871f03-7c6b-4c30-9411-b8f496fb8659', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79da6174-0485-4e06-8898-c13055f8ac79, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=c7326683-ae94-4050-a985-768563f895f1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:44:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:03.897 104164 INFO neutron.agent.ovn.metadata.agent [-] Port c7326683-ae94-4050-a985-768563f895f1 in datapath 6f1be974-bcaa-4b93-ab01-8adab0060f10 unbound from our chassis#033[00m
Nov 29 02:44:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:03.899 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6f1be974-bcaa-4b93-ab01-8adab0060f10, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:44:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:03.900 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1b074ea5-10e4-48cd-a8e0-afd82e151192]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:03.901 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 namespace which is not needed anymore#033[00m
Nov 29 02:44:03 np0005539504 nova_compute[187152]: 2025-11-29 07:44:03.910 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:03 np0005539504 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Nov 29 02:44:03 np0005539504 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000a1.scope: Consumed 15.641s CPU time.
Nov 29 02:44:03 np0005539504 systemd-machined[153423]: Machine qemu-84-instance-000000a1 terminated.
Nov 29 02:44:04 np0005539504 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[246070]: [NOTICE]   (246096) : haproxy version is 2.8.14-c23fe91
Nov 29 02:44:04 np0005539504 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[246070]: [NOTICE]   (246096) : path to executable is /usr/sbin/haproxy
Nov 29 02:44:04 np0005539504 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[246070]: [WARNING]  (246096) : Exiting Master process...
Nov 29 02:44:04 np0005539504 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[246070]: [ALERT]    (246096) : Current worker (246098) exited with code 143 (Terminated)
Nov 29 02:44:04 np0005539504 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[246070]: [WARNING]  (246096) : All workers exited. Exiting... (0)
Nov 29 02:44:04 np0005539504 systemd[1]: libpod-d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2.scope: Deactivated successfully.
Nov 29 02:44:04 np0005539504 podman[246638]: 2025-11-29 07:44:04.036958805 +0000 UTC m=+0.044656449 container died d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:44:04 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2-userdata-shm.mount: Deactivated successfully.
Nov 29 02:44:04 np0005539504 systemd[1]: var-lib-containers-storage-overlay-5d6972f5f8833ca0206e892d7de71faa01fa6db77517474d4ecdd3ca57f93ffa-merged.mount: Deactivated successfully.
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.076 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:04 np0005539504 podman[246638]: 2025-11-29 07:44:04.079192228 +0000 UTC m=+0.086889872 container cleanup d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.081 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:04 np0005539504 systemd[1]: libpod-conmon-d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2.scope: Deactivated successfully.
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.118 187156 INFO nova.virt.libvirt.driver [-] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Instance destroyed successfully.#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.120 187156 DEBUG nova.objects.instance [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.134 187156 DEBUG nova.virt.libvirt.vif [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:42:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1763279634',display_name='tempest-TestNetworkBasicOps-server-1763279634',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1763279634',id=161,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEkkbeQt70U16h7WfJ18TFBeq23WmjPxHMSDFLMv8iAiFoBGfuH5LpSW4Nyg4wJhsjkwbMVp/u5en890AVwFbKQPpx/7ju0KeJTr8VaGLG+ZBnyWrLll3sXzbcJEr2bkfQ==',key_name='tempest-TestNetworkBasicOps-2141533484',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:42:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-qni2oxwv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:42:51Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=5b1eba03-8be0-4f33-a4d6-8c0751ddd10b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.135 187156 DEBUG nova.network.os_vif_util [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "c7326683-ae94-4050-a985-768563f895f1", "address": "fa:16:3e:b2:c8:e4", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7326683-ae", "ovs_interfaceid": "c7326683-ae94-4050-a985-768563f895f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.136 187156 DEBUG nova.network.os_vif_util [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b2:c8:e4,bridge_name='br-int',has_traffic_filtering=True,id=c7326683-ae94-4050-a985-768563f895f1,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7326683-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.136 187156 DEBUG os_vif [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b2:c8:e4,bridge_name='br-int',has_traffic_filtering=True,id=c7326683-ae94-4050-a985-768563f895f1,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7326683-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.138 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.139 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc7326683-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.140 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.142 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.145 187156 INFO os_vif [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b2:c8:e4,bridge_name='br-int',has_traffic_filtering=True,id=c7326683-ae94-4050-a985-768563f895f1,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7326683-ae')#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.145 187156 INFO nova.virt.libvirt.driver [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Deleting instance files /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b_del#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.146 187156 INFO nova.virt.libvirt.driver [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Deletion of /var/lib/nova/instances/5b1eba03-8be0-4f33-a4d6-8c0751ddd10b_del complete#033[00m
Nov 29 02:44:04 np0005539504 podman[246676]: 2025-11-29 07:44:04.147772148 +0000 UTC m=+0.046028426 container remove d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:44:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:04.152 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[26b7d940-41db-4191-a2f3-a4c029c51ed7]: (4, ('Sat Nov 29 07:44:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 (d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2)\nd1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2\nSat Nov 29 07:44:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 (d1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2)\nd1cf5bd92cbc57facddf31d268ecd94d6e8afb1fa3dc80bf2617bc742256d3f2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:04.154 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[610f09dd-29c9-444e-b3e6-209978a4c6a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:04.154 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f1be974-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.156 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:04 np0005539504 kernel: tap6f1be974-b0: left promiscuous mode
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.168 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:04.171 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[96fe4b38-9b88-44c2-b422-47ce46b15a0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:04.190 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dbded3ae-ff2a-4643-b5a1-97a4ddccd44e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:04.191 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b5932a9b-600e-4ee0-b66e-ba65a112d331]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:04.210 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[aa6860dc-e361-44f5-b4c5-e1397a17e044]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 762909, 'reachable_time': 23462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246700, 'error': None, 'target': 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.215 187156 INFO nova.compute.manager [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:44:04 np0005539504 systemd[1]: run-netns-ovnmeta\x2d6f1be974\x2dbcaa\x2d4b93\x2dab01\x2d8adab0060f10.mount: Deactivated successfully.
Nov 29 02:44:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:04.214 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.215 187156 DEBUG oslo.service.loopingcall [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:44:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:04.215 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[de5386ae-1583-4dad-ab45-f0a30c880c97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.216 187156 DEBUG nova.compute.manager [-] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:44:04 np0005539504 nova_compute[187152]: 2025-11-29 07:44:04.216 187156 DEBUG nova.network.neutron [-] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:44:05 np0005539504 nova_compute[187152]: 2025-11-29 07:44:05.132 187156 DEBUG nova.compute.manager [req-d22d040a-04a6-4ed8-8c60-becc9a4e103e req-37f40f97-72c3-4c5e-ab8c-22fced792cbc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Received event network-vif-unplugged-c7326683-ae94-4050-a985-768563f895f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:05 np0005539504 nova_compute[187152]: 2025-11-29 07:44:05.133 187156 DEBUG oslo_concurrency.lockutils [req-d22d040a-04a6-4ed8-8c60-becc9a4e103e req-37f40f97-72c3-4c5e-ab8c-22fced792cbc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:05 np0005539504 nova_compute[187152]: 2025-11-29 07:44:05.133 187156 DEBUG oslo_concurrency.lockutils [req-d22d040a-04a6-4ed8-8c60-becc9a4e103e req-37f40f97-72c3-4c5e-ab8c-22fced792cbc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:05 np0005539504 nova_compute[187152]: 2025-11-29 07:44:05.134 187156 DEBUG oslo_concurrency.lockutils [req-d22d040a-04a6-4ed8-8c60-becc9a4e103e req-37f40f97-72c3-4c5e-ab8c-22fced792cbc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:05 np0005539504 nova_compute[187152]: 2025-11-29 07:44:05.134 187156 DEBUG nova.compute.manager [req-d22d040a-04a6-4ed8-8c60-becc9a4e103e req-37f40f97-72c3-4c5e-ab8c-22fced792cbc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] No waiting events found dispatching network-vif-unplugged-c7326683-ae94-4050-a985-768563f895f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:44:05 np0005539504 nova_compute[187152]: 2025-11-29 07:44:05.134 187156 DEBUG nova.compute.manager [req-d22d040a-04a6-4ed8-8c60-becc9a4e103e req-37f40f97-72c3-4c5e-ab8c-22fced792cbc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Received event network-vif-unplugged-c7326683-ae94-4050-a985-768563f895f1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:44:07 np0005539504 nova_compute[187152]: 2025-11-29 07:44:07.591 187156 DEBUG nova.compute.manager [req-9e856679-45ef-4eed-8807-c90515eb15b3 req-bf7ed9af-054e-4c77-979a-d40b2cecea34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Received event network-vif-plugged-c7326683-ae94-4050-a985-768563f895f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:07 np0005539504 nova_compute[187152]: 2025-11-29 07:44:07.592 187156 DEBUG oslo_concurrency.lockutils [req-9e856679-45ef-4eed-8807-c90515eb15b3 req-bf7ed9af-054e-4c77-979a-d40b2cecea34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:07 np0005539504 nova_compute[187152]: 2025-11-29 07:44:07.592 187156 DEBUG oslo_concurrency.lockutils [req-9e856679-45ef-4eed-8807-c90515eb15b3 req-bf7ed9af-054e-4c77-979a-d40b2cecea34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:07 np0005539504 nova_compute[187152]: 2025-11-29 07:44:07.593 187156 DEBUG oslo_concurrency.lockutils [req-9e856679-45ef-4eed-8807-c90515eb15b3 req-bf7ed9af-054e-4c77-979a-d40b2cecea34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:07 np0005539504 nova_compute[187152]: 2025-11-29 07:44:07.593 187156 DEBUG nova.compute.manager [req-9e856679-45ef-4eed-8807-c90515eb15b3 req-bf7ed9af-054e-4c77-979a-d40b2cecea34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] No waiting events found dispatching network-vif-plugged-c7326683-ae94-4050-a985-768563f895f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:44:07 np0005539504 nova_compute[187152]: 2025-11-29 07:44:07.593 187156 WARNING nova.compute.manager [req-9e856679-45ef-4eed-8807-c90515eb15b3 req-bf7ed9af-054e-4c77-979a-d40b2cecea34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Received unexpected event network-vif-plugged-c7326683-ae94-4050-a985-768563f895f1 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:44:07 np0005539504 nova_compute[187152]: 2025-11-29 07:44:07.743 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:08 np0005539504 nova_compute[187152]: 2025-11-29 07:44:08.590 187156 DEBUG nova.network.neutron [-] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:08 np0005539504 podman[246707]: 2025-11-29 07:44:08.736527105 +0000 UTC m=+0.065269233 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 02:44:08 np0005539504 nova_compute[187152]: 2025-11-29 07:44:08.955 187156 DEBUG nova.compute.manager [req-cf0322cb-431e-445f-9dec-8825f1710720 req-ab171954-db51-49cb-84ff-3b122479528a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Received event network-vif-deleted-c7326683-ae94-4050-a985-768563f895f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:08 np0005539504 nova_compute[187152]: 2025-11-29 07:44:08.956 187156 INFO nova.compute.manager [req-cf0322cb-431e-445f-9dec-8825f1710720 req-ab171954-db51-49cb-84ff-3b122479528a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Neutron deleted interface c7326683-ae94-4050-a985-768563f895f1; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:44:08 np0005539504 nova_compute[187152]: 2025-11-29 07:44:08.956 187156 DEBUG nova.network.neutron [req-cf0322cb-431e-445f-9dec-8825f1710720 req-ab171954-db51-49cb-84ff-3b122479528a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:08 np0005539504 nova_compute[187152]: 2025-11-29 07:44:08.964 187156 INFO nova.compute.manager [-] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Took 4.75 seconds to deallocate network for instance.#033[00m
Nov 29 02:44:08 np0005539504 nova_compute[187152]: 2025-11-29 07:44:08.980 187156 DEBUG nova.compute.manager [req-cf0322cb-431e-445f-9dec-8825f1710720 req-ab171954-db51-49cb-84ff-3b122479528a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Detach interface failed, port_id=c7326683-ae94-4050-a985-768563f895f1, reason: Instance 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:44:09 np0005539504 nova_compute[187152]: 2025-11-29 07:44:09.142 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:09 np0005539504 nova_compute[187152]: 2025-11-29 07:44:09.157 187156 DEBUG oslo_concurrency.lockutils [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:09 np0005539504 nova_compute[187152]: 2025-11-29 07:44:09.158 187156 DEBUG oslo_concurrency.lockutils [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:09 np0005539504 nova_compute[187152]: 2025-11-29 07:44:09.221 187156 DEBUG nova.compute.provider_tree [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:44:09 np0005539504 nova_compute[187152]: 2025-11-29 07:44:09.236 187156 DEBUG nova.scheduler.client.report [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:44:09 np0005539504 nova_compute[187152]: 2025-11-29 07:44:09.264 187156 DEBUG oslo_concurrency.lockutils [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:09 np0005539504 nova_compute[187152]: 2025-11-29 07:44:09.293 187156 INFO nova.scheduler.client.report [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b#033[00m
Nov 29 02:44:09 np0005539504 nova_compute[187152]: 2025-11-29 07:44:09.379 187156 DEBUG oslo_concurrency.lockutils [None req-bc313a9f-51dc-410d-81ec-4158d6601217 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "5b1eba03-8be0-4f33-a4d6-8c0751ddd10b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.467 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.468 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.489 187156 DEBUG nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.669 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.669 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.677 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.678 187156 INFO nova.compute.claims [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.793 187156 DEBUG nova.compute.provider_tree [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.808 187156 DEBUG nova.scheduler.client.report [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.829 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.830 187156 DEBUG nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.893 187156 DEBUG nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.894 187156 DEBUG nova.network.neutron [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.925 187156 INFO nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:44:11 np0005539504 nova_compute[187152]: 2025-11-29 07:44:11.947 187156 DEBUG nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.064 187156 DEBUG nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.066 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.066 187156 INFO nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Creating image(s)#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.067 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.068 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.069 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.091 187156 DEBUG nova.policy [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.096 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.160 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.162 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.163 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.186 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.261 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.263 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.303 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.305 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.305 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.368 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.369 187156 DEBUG nova.virt.disk.api [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.369 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.428 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.429 187156 DEBUG nova.virt.disk.api [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.429 187156 DEBUG nova.objects.instance [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid b81a6289-ddd4-4126-9a4c-5d697ddb4da1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.452 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.453 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Ensure instance console log exists: /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.453 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.454 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.454 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.784 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:12.852 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.853 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:12 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:12.854 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:44:12 np0005539504 nova_compute[187152]: 2025-11-29 07:44:12.977 187156 DEBUG nova.network.neutron [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Successfully created port: 341d1042-c21e-4fe5-aacf-9dcf7484f194 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:44:13 np0005539504 nova_compute[187152]: 2025-11-29 07:44:13.110 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539504 nova_compute[187152]: 2025-11-29 07:44:13.964 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:14 np0005539504 nova_compute[187152]: 2025-11-29 07:44:14.145 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:14 np0005539504 nova_compute[187152]: 2025-11-29 07:44:14.297 187156 DEBUG nova.network.neutron [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Successfully updated port: 341d1042-c21e-4fe5-aacf-9dcf7484f194 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:44:14 np0005539504 nova_compute[187152]: 2025-11-29 07:44:14.328 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:44:14 np0005539504 nova_compute[187152]: 2025-11-29 07:44:14.329 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:44:14 np0005539504 nova_compute[187152]: 2025-11-29 07:44:14.329 187156 DEBUG nova.network.neutron [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:44:14 np0005539504 nova_compute[187152]: 2025-11-29 07:44:14.452 187156 DEBUG nova.compute.manager [req-68579151-6b06-4c11-b162-1cd85e902fa5 req-e5d47bd5-4dac-4781-b5b4-8b622fc192cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received event network-changed-341d1042-c21e-4fe5-aacf-9dcf7484f194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:14 np0005539504 nova_compute[187152]: 2025-11-29 07:44:14.452 187156 DEBUG nova.compute.manager [req-68579151-6b06-4c11-b162-1cd85e902fa5 req-e5d47bd5-4dac-4781-b5b4-8b622fc192cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Refreshing instance network info cache due to event network-changed-341d1042-c21e-4fe5-aacf-9dcf7484f194. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:44:14 np0005539504 nova_compute[187152]: 2025-11-29 07:44:14.452 187156 DEBUG oslo_concurrency.lockutils [req-68579151-6b06-4c11-b162-1cd85e902fa5 req-e5d47bd5-4dac-4781-b5b4-8b622fc192cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:44:14 np0005539504 nova_compute[187152]: 2025-11-29 07:44:14.536 187156 DEBUG nova.network.neutron [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:44:15 np0005539504 nova_compute[187152]: 2025-11-29 07:44:15.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.626 187156 DEBUG nova.network.neutron [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updating instance_info_cache with network_info: [{"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.645 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.646 187156 DEBUG nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Instance network_info: |[{"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.647 187156 DEBUG oslo_concurrency.lockutils [req-68579151-6b06-4c11-b162-1cd85e902fa5 req-e5d47bd5-4dac-4781-b5b4-8b622fc192cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.647 187156 DEBUG nova.network.neutron [req-68579151-6b06-4c11-b162-1cd85e902fa5 req-e5d47bd5-4dac-4781-b5b4-8b622fc192cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Refreshing network info cache for port 341d1042-c21e-4fe5-aacf-9dcf7484f194 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.654 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Start _get_guest_xml network_info=[{"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.661 187156 WARNING nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.671 187156 DEBUG nova.virt.libvirt.host [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.672 187156 DEBUG nova.virt.libvirt.host [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.676 187156 DEBUG nova.virt.libvirt.host [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.676 187156 DEBUG nova.virt.libvirt.host [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.678 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.679 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.680 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.680 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.680 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.680 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.681 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.681 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.681 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.681 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.682 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.682 187156 DEBUG nova.virt.hardware [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.687 187156 DEBUG nova.virt.libvirt.vif [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:44:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1683587647',display_name='tempest-TestGettingAddress-server-1683587647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1683587647',id=163,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCUa75YDe0UFuF3zC2EDX2ybM5J+nbJmodk4MsTI0LH8wgrY+SnJ+ndJio6A3wgT3MOCb4Huw6Ay1X+CWth0nVP7co5Y+kbzpcJTZjopF6Z8gsZ88jWOxRb0FFz1vDfcNA==',key_name='tempest-TestGettingAddress-1094641829',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-2h59n9hz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:44:11Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=b81a6289-ddd4-4126-9a4c-5d697ddb4da1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.688 187156 DEBUG nova.network.os_vif_util [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.689 187156 DEBUG nova.network.os_vif_util [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:ac:c3,bridge_name='br-int',has_traffic_filtering=True,id=341d1042-c21e-4fe5-aacf-9dcf7484f194,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap341d1042-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.690 187156 DEBUG nova.objects.instance [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid b81a6289-ddd4-4126-9a4c-5d697ddb4da1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.707 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <uuid>b81a6289-ddd4-4126-9a4c-5d697ddb4da1</uuid>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <name>instance-000000a3</name>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestGettingAddress-server-1683587647</nova:name>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:44:16</nova:creationTime>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:        <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:        <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:        <nova:port uuid="341d1042-c21e-4fe5-aacf-9dcf7484f194">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fef4:acc3" ipVersion="6"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fef4:acc3" ipVersion="6"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <entry name="serial">b81a6289-ddd4-4126-9a4c-5d697ddb4da1</entry>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <entry name="uuid">b81a6289-ddd4-4126-9a4c-5d697ddb4da1</entry>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk.config"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:f4:ac:c3"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <target dev="tap341d1042-c2"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/console.log" append="off"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:44:16 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:44:16 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:44:16 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:44:16 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.709 187156 DEBUG nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Preparing to wait for external event network-vif-plugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.709 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.709 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.710 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.711 187156 DEBUG nova.virt.libvirt.vif [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:44:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1683587647',display_name='tempest-TestGettingAddress-server-1683587647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1683587647',id=163,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCUa75YDe0UFuF3zC2EDX2ybM5J+nbJmodk4MsTI0LH8wgrY+SnJ+ndJio6A3wgT3MOCb4Huw6Ay1X+CWth0nVP7co5Y+kbzpcJTZjopF6Z8gsZ88jWOxRb0FFz1vDfcNA==',key_name='tempest-TestGettingAddress-1094641829',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-2h59n9hz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:44:11Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=b81a6289-ddd4-4126-9a4c-5d697ddb4da1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.711 187156 DEBUG nova.network.os_vif_util [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.712 187156 DEBUG nova.network.os_vif_util [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:ac:c3,bridge_name='br-int',has_traffic_filtering=True,id=341d1042-c21e-4fe5-aacf-9dcf7484f194,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap341d1042-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.712 187156 DEBUG os_vif [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:ac:c3,bridge_name='br-int',has_traffic_filtering=True,id=341d1042-c21e-4fe5-aacf-9dcf7484f194,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap341d1042-c2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.713 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.713 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.714 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.717 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.717 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap341d1042-c2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.718 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap341d1042-c2, col_values=(('external_ids', {'iface-id': '341d1042-c21e-4fe5-aacf-9dcf7484f194', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:ac:c3', 'vm-uuid': 'b81a6289-ddd4-4126-9a4c-5d697ddb4da1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.720 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:16 np0005539504 NetworkManager[55210]: <info>  [1764402256.7217] manager: (tap341d1042-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.723 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.730 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.732 187156 INFO os_vif [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:ac:c3,bridge_name='br-int',has_traffic_filtering=True,id=341d1042-c21e-4fe5-aacf-9dcf7484f194,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap341d1042-c2')#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.795 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.796 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.796 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:f4:ac:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:44:16 np0005539504 nova_compute[187152]: 2025-11-29 07:44:16.797 187156 INFO nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Using config drive#033[00m
Nov 29 02:44:17 np0005539504 nova_compute[187152]: 2025-11-29 07:44:17.617 187156 INFO nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Creating config drive at /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk.config#033[00m
Nov 29 02:44:17 np0005539504 nova_compute[187152]: 2025-11-29 07:44:17.625 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp84t545dy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:44:17 np0005539504 nova_compute[187152]: 2025-11-29 07:44:17.759 187156 DEBUG oslo_concurrency.processutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp84t545dy" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:44:17 np0005539504 nova_compute[187152]: 2025-11-29 07:44:17.829 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:17 np0005539504 kernel: tap341d1042-c2: entered promiscuous mode
Nov 29 02:44:17 np0005539504 NetworkManager[55210]: <info>  [1764402257.8463] manager: (tap341d1042-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/293)
Nov 29 02:44:17 np0005539504 nova_compute[187152]: 2025-11-29 07:44:17.848 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:17Z|00661|binding|INFO|Claiming lport 341d1042-c21e-4fe5-aacf-9dcf7484f194 for this chassis.
Nov 29 02:44:17 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:17Z|00662|binding|INFO|341d1042-c21e-4fe5-aacf-9dcf7484f194: Claiming fa:16:3e:f4:ac:c3 10.100.0.5 2001:db8:0:1:f816:3eff:fef4:acc3 2001:db8::f816:3eff:fef4:acc3
Nov 29 02:44:17 np0005539504 nova_compute[187152]: 2025-11-29 07:44:17.854 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:17 np0005539504 nova_compute[187152]: 2025-11-29 07:44:17.861 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:17 np0005539504 nova_compute[187152]: 2025-11-29 07:44:17.866 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:17 np0005539504 NetworkManager[55210]: <info>  [1764402257.8678] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Nov 29 02:44:17 np0005539504 NetworkManager[55210]: <info>  [1764402257.8685] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.870 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:ac:c3 10.100.0.5 2001:db8:0:1:f816:3eff:fef4:acc3 2001:db8::f816:3eff:fef4:acc3'], port_security=['fa:16:3e:f4:ac:c3 10.100.0.5 2001:db8:0:1:f816:3eff:fef4:acc3 2001:db8::f816:3eff:fef4:acc3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fef4:acc3/64 2001:db8::f816:3eff:fef4:acc3/64', 'neutron:device_id': 'b81a6289-ddd4-4126-9a4c-5d697ddb4da1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fcfda89f-6716-48ad-9493-dabb00233aaf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=341d1042-c21e-4fe5-aacf-9dcf7484f194) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.872 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 341d1042-c21e-4fe5-aacf-9dcf7484f194 in datapath e23e9510-a780-4254-b7f0-36040139e7db bound to our chassis#033[00m
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.873 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e23e9510-a780-4254-b7f0-36040139e7db#033[00m
Nov 29 02:44:17 np0005539504 systemd-udevd[246764]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.887 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ef26d17b-0218-4507-b55f-d51af421acf7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.888 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape23e9510-a1 in ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.891 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape23e9510-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.891 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0f808722-3231-4ff2-8e30-9cdbe98824a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.893 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[38102097-c68a-44dc-94a7-6f0421634a25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:17 np0005539504 NetworkManager[55210]: <info>  [1764402257.8988] device (tap341d1042-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:44:17 np0005539504 NetworkManager[55210]: <info>  [1764402257.8996] device (tap341d1042-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:44:17 np0005539504 systemd-machined[153423]: New machine qemu-85-instance-000000a3.
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.908 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[8b096829-1816-4d10-b8aa-ace9574b0f36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:17 np0005539504 systemd[1]: Started Virtual Machine qemu-85-instance-000000a3.
Nov 29 02:44:17 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:17.936 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[529a4226-15b7-4483-91a6-6fa513784499]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.001 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e6361f-328d-40dc-8ecc-e297599b0507]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 NetworkManager[55210]: <info>  [1764402258.0226] manager: (tape23e9510-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/296)
Nov 29 02:44:18 np0005539504 systemd-udevd[246769]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.023 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[552c101d-4d60-4d5f-8d3e-1527be3d1bf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.023 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.052 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:18 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:18Z|00663|binding|INFO|Setting lport 341d1042-c21e-4fe5-aacf-9dcf7484f194 ovn-installed in OVS
Nov 29 02:44:18 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:18Z|00664|binding|INFO|Setting lport 341d1042-c21e-4fe5-aacf-9dcf7484f194 up in Southbound
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.061 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.064 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[5702fb19-5631-4e66-b793-9b65a4bf68c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.068 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[861c7063-1656-4a4e-8d14-2662eff36911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 NetworkManager[55210]: <info>  [1764402258.0977] device (tape23e9510-a0): carrier: link connected
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.104 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f28c8147-6407-400a-ade2-78561e4f2449]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.123 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[18973432-e546-4e82-996a-e6f108a5d9e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape23e9510-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:31:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772097, 'reachable_time': 37203, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246800, 'error': None, 'target': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.143 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1e5b90b1-d8a9-42fc-9cf9-349b909e36eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec8:31c8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 772097, 'tstamp': 772097}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246801, 'error': None, 'target': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.159 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c0dd3d-4724-40c9-a807-11e7f4e53aa9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape23e9510-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:31:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772097, 'reachable_time': 37203, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246802, 'error': None, 'target': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.199 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[68344e90-0b3e-4053-b408-d37d141dcfc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.269 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[faa5a393-a3a1-432d-bcee-ca6a1e03b486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.270 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape23e9510-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.271 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.271 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape23e9510-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.273 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:18 np0005539504 kernel: tape23e9510-a0: entered promiscuous mode
Nov 29 02:44:18 np0005539504 NetworkManager[55210]: <info>  [1764402258.2746] manager: (tape23e9510-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.276 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape23e9510-a0, col_values=(('external_ids', {'iface-id': '4e85a268-4b8a-4015-a903-2252d696f8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:18 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:18Z|00665|binding|INFO|Releasing lport 4e85a268-4b8a-4015-a903-2252d696f8f5 from this chassis (sb_readonly=0)
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.279 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e23e9510-a780-4254-b7f0-36040139e7db.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e23e9510-a780-4254-b7f0-36040139e7db.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.280 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a477167b-bf2e-477d-8ee3-bfcf57011ec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.281 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-e23e9510-a780-4254-b7f0-36040139e7db
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/e23e9510-a780-4254-b7f0-36040139e7db.pid.haproxy
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID e23e9510-a780-4254-b7f0-36040139e7db
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:44:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:18.283 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'env', 'PROCESS_TAG=haproxy-e23e9510-a780-4254-b7f0-36040139e7db', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e23e9510-a780-4254-b7f0-36040139e7db.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.298 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.551 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402258.5500698, b81a6289-ddd4-4126-9a4c-5d697ddb4da1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.551 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] VM Started (Lifecycle Event)#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.578 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.584 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402258.550245, b81a6289-ddd4-4126-9a4c-5d697ddb4da1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.584 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.609 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.614 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.634 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:44:18 np0005539504 podman[246841]: 2025-11-29 07:44:18.721308336 +0000 UTC m=+0.072159438 container create 4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.755 187156 DEBUG nova.compute.manager [req-259b3dab-cd25-45c9-bc0b-77b35ddb8ab6 req-4649c7c2-654e-4011-bddd-831d3574160c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received event network-vif-plugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.757 187156 DEBUG oslo_concurrency.lockutils [req-259b3dab-cd25-45c9-bc0b-77b35ddb8ab6 req-4649c7c2-654e-4011-bddd-831d3574160c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.757 187156 DEBUG oslo_concurrency.lockutils [req-259b3dab-cd25-45c9-bc0b-77b35ddb8ab6 req-4649c7c2-654e-4011-bddd-831d3574160c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.757 187156 DEBUG oslo_concurrency.lockutils [req-259b3dab-cd25-45c9-bc0b-77b35ddb8ab6 req-4649c7c2-654e-4011-bddd-831d3574160c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.758 187156 DEBUG nova.compute.manager [req-259b3dab-cd25-45c9-bc0b-77b35ddb8ab6 req-4649c7c2-654e-4011-bddd-831d3574160c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Processing event network-vif-plugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.758 187156 DEBUG nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.764 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402258.7637784, b81a6289-ddd4-4126-9a4c-5d697ddb4da1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.764 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:44:18 np0005539504 systemd[1]: Started libpod-conmon-4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa.scope.
Nov 29 02:44:18 np0005539504 podman[246841]: 2025-11-29 07:44:18.674082308 +0000 UTC m=+0.024933440 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.768 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.777 187156 INFO nova.virt.libvirt.driver [-] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Instance spawned successfully.#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.778 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:44:18 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.794 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.797 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:44:18 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b394cc93cb4e3ff1a50799f27bb4f603960f2b46d256128182ae51773a9088b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.809 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.810 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.810 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.811 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.811 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.812 187156 DEBUG nova.virt.libvirt.driver [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:44:18 np0005539504 podman[246841]: 2025-11-29 07:44:18.818709549 +0000 UTC m=+0.169560671 container init 4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 02:44:18 np0005539504 podman[246841]: 2025-11-29 07:44:18.825543392 +0000 UTC m=+0.176394494 container start 4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 29 02:44:18 np0005539504 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[246856]: [NOTICE]   (246860) : New worker (246862) forked
Nov 29 02:44:18 np0005539504 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[246856]: [NOTICE]   (246860) : Loading success.
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.858 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.956 187156 INFO nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Took 6.89 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:44:18 np0005539504 nova_compute[187152]: 2025-11-29 07:44:18.957 187156 DEBUG nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:44:19 np0005539504 nova_compute[187152]: 2025-11-29 07:44:19.095 187156 INFO nova.compute.manager [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Took 7.53 seconds to build instance.#033[00m
Nov 29 02:44:19 np0005539504 nova_compute[187152]: 2025-11-29 07:44:19.115 187156 DEBUG oslo_concurrency.lockutils [None req-8e9d8390-4250-48c8-8f88-cc57240df770 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:19 np0005539504 nova_compute[187152]: 2025-11-29 07:44:19.116 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402244.1158988, 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:44:19 np0005539504 nova_compute[187152]: 2025-11-29 07:44:19.117 187156 INFO nova.compute.manager [-] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:44:19 np0005539504 nova_compute[187152]: 2025-11-29 07:44:19.138 187156 DEBUG nova.compute.manager [None req-ab5c5157-c81d-4c02-8470-3fa0be6bb678 - - - - - -] [instance: 5b1eba03-8be0-4f33-a4d6-8c0751ddd10b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:44:19 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:19.857 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:19 np0005539504 nova_compute[187152]: 2025-11-29 07:44:19.973 187156 DEBUG nova.network.neutron [req-68579151-6b06-4c11-b162-1cd85e902fa5 req-e5d47bd5-4dac-4781-b5b4-8b622fc192cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updated VIF entry in instance network info cache for port 341d1042-c21e-4fe5-aacf-9dcf7484f194. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:44:19 np0005539504 nova_compute[187152]: 2025-11-29 07:44:19.974 187156 DEBUG nova.network.neutron [req-68579151-6b06-4c11-b162-1cd85e902fa5 req-e5d47bd5-4dac-4781-b5b4-8b622fc192cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updating instance_info_cache with network_info: [{"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:19 np0005539504 nova_compute[187152]: 2025-11-29 07:44:19.997 187156 DEBUG oslo_concurrency.lockutils [req-68579151-6b06-4c11-b162-1cd85e902fa5 req-e5d47bd5-4dac-4781-b5b4-8b622fc192cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:44:20 np0005539504 podman[246871]: 2025-11-29 07:44:20.735822315 +0000 UTC m=+0.074308284 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:44:20 np0005539504 podman[246872]: 2025-11-29 07:44:20.736301098 +0000 UTC m=+0.073349709 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, release=1755695350, version=9.6)
Nov 29 02:44:20 np0005539504 podman[246873]: 2025-11-29 07:44:20.760753574 +0000 UTC m=+0.091440025 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 29 02:44:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:20Z|00666|binding|INFO|Releasing lport 4e85a268-4b8a-4015-a903-2252d696f8f5 from this chassis (sb_readonly=0)
Nov 29 02:44:20 np0005539504 nova_compute[187152]: 2025-11-29 07:44:20.790 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:20 np0005539504 nova_compute[187152]: 2025-11-29 07:44:20.934 187156 DEBUG nova.compute.manager [req-f770ab90-5ca2-46c2-bbb5-2754cd3eb69f req-9de786e4-0c74-4098-ad9e-83de12fe31c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received event network-vif-plugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:20 np0005539504 nova_compute[187152]: 2025-11-29 07:44:20.934 187156 DEBUG oslo_concurrency.lockutils [req-f770ab90-5ca2-46c2-bbb5-2754cd3eb69f req-9de786e4-0c74-4098-ad9e-83de12fe31c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:20 np0005539504 nova_compute[187152]: 2025-11-29 07:44:20.935 187156 DEBUG oslo_concurrency.lockutils [req-f770ab90-5ca2-46c2-bbb5-2754cd3eb69f req-9de786e4-0c74-4098-ad9e-83de12fe31c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:20 np0005539504 nova_compute[187152]: 2025-11-29 07:44:20.935 187156 DEBUG oslo_concurrency.lockutils [req-f770ab90-5ca2-46c2-bbb5-2754cd3eb69f req-9de786e4-0c74-4098-ad9e-83de12fe31c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:20 np0005539504 nova_compute[187152]: 2025-11-29 07:44:20.935 187156 DEBUG nova.compute.manager [req-f770ab90-5ca2-46c2-bbb5-2754cd3eb69f req-9de786e4-0c74-4098-ad9e-83de12fe31c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] No waiting events found dispatching network-vif-plugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:44:20 np0005539504 nova_compute[187152]: 2025-11-29 07:44:20.935 187156 WARNING nova.compute.manager [req-f770ab90-5ca2-46c2-bbb5-2754cd3eb69f req-9de786e4-0c74-4098-ad9e-83de12fe31c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received unexpected event network-vif-plugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:44:21 np0005539504 nova_compute[187152]: 2025-11-29 07:44:21.721 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:21 np0005539504 nova_compute[187152]: 2025-11-29 07:44:21.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:22 np0005539504 nova_compute[187152]: 2025-11-29 07:44:22.833 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:23.482 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:23.483 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:23.484 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:24 np0005539504 podman[246930]: 2025-11-29 07:44:24.723755932 +0000 UTC m=+0.064456610 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:44:24 np0005539504 podman[246931]: 2025-11-29 07:44:24.772679235 +0000 UTC m=+0.110548427 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:44:26 np0005539504 nova_compute[187152]: 2025-11-29 07:44:26.724 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:26 np0005539504 nova_compute[187152]: 2025-11-29 07:44:26.731 187156 DEBUG nova.compute.manager [req-c8a167bf-1626-4a6c-8e58-d547c8918065 req-96c3a46f-fad2-4235-9edd-dcf0afa0f0f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received event network-changed-341d1042-c21e-4fe5-aacf-9dcf7484f194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:26 np0005539504 nova_compute[187152]: 2025-11-29 07:44:26.731 187156 DEBUG nova.compute.manager [req-c8a167bf-1626-4a6c-8e58-d547c8918065 req-96c3a46f-fad2-4235-9edd-dcf0afa0f0f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Refreshing instance network info cache due to event network-changed-341d1042-c21e-4fe5-aacf-9dcf7484f194. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:44:26 np0005539504 nova_compute[187152]: 2025-11-29 07:44:26.732 187156 DEBUG oslo_concurrency.lockutils [req-c8a167bf-1626-4a6c-8e58-d547c8918065 req-96c3a46f-fad2-4235-9edd-dcf0afa0f0f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:44:26 np0005539504 nova_compute[187152]: 2025-11-29 07:44:26.732 187156 DEBUG oslo_concurrency.lockutils [req-c8a167bf-1626-4a6c-8e58-d547c8918065 req-96c3a46f-fad2-4235-9edd-dcf0afa0f0f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:44:26 np0005539504 nova_compute[187152]: 2025-11-29 07:44:26.732 187156 DEBUG nova.network.neutron [req-c8a167bf-1626-4a6c-8e58-d547c8918065 req-96c3a46f-fad2-4235-9edd-dcf0afa0f0f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Refreshing network info cache for port 341d1042-c21e-4fe5-aacf-9dcf7484f194 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:44:27 np0005539504 nova_compute[187152]: 2025-11-29 07:44:27.869 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:27 np0005539504 nova_compute[187152]: 2025-11-29 07:44:27.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:27 np0005539504 nova_compute[187152]: 2025-11-29 07:44:27.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:44:30 np0005539504 nova_compute[187152]: 2025-11-29 07:44:30.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:31 np0005539504 nova_compute[187152]: 2025-11-29 07:44:31.728 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:32 np0005539504 nova_compute[187152]: 2025-11-29 07:44:32.871 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:33 np0005539504 podman[246998]: 2025-11-29 07:44:33.737488222 +0000 UTC m=+0.073602687 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:44:34 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:34Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:ac:c3 10.100.0.5
Nov 29 02:44:34 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:34Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:ac:c3 10.100.0.5
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.551 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.552 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.552 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.552 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.644 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.715 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.716 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.781 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.949 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.950 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5536MB free_disk=72.97868728637695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.950 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:34 np0005539504 nova_compute[187152]: 2025-11-29 07:44:34.951 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:35 np0005539504 nova_compute[187152]: 2025-11-29 07:44:35.080 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance b81a6289-ddd4-4126-9a4c-5d697ddb4da1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:44:35 np0005539504 nova_compute[187152]: 2025-11-29 07:44:35.081 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:44:35 np0005539504 nova_compute[187152]: 2025-11-29 07:44:35.081 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:44:35 np0005539504 nova_compute[187152]: 2025-11-29 07:44:35.130 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:44:35 np0005539504 nova_compute[187152]: 2025-11-29 07:44:35.148 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:44:35 np0005539504 nova_compute[187152]: 2025-11-29 07:44:35.172 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:44:35 np0005539504 nova_compute[187152]: 2025-11-29 07:44:35.173 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:36 np0005539504 nova_compute[187152]: 2025-11-29 07:44:36.174 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:36 np0005539504 nova_compute[187152]: 2025-11-29 07:44:36.175 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:44:36 np0005539504 nova_compute[187152]: 2025-11-29 07:44:36.175 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:44:36 np0005539504 nova_compute[187152]: 2025-11-29 07:44:36.637 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:44:36 np0005539504 nova_compute[187152]: 2025-11-29 07:44:36.731 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:37 np0005539504 nova_compute[187152]: 2025-11-29 07:44:37.872 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:39 np0005539504 nova_compute[187152]: 2025-11-29 07:44:39.666 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:39 np0005539504 podman[247026]: 2025-11-29 07:44:39.720354002 +0000 UTC m=+0.063408063 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_managed=true)
Nov 29 02:44:39 np0005539504 nova_compute[187152]: 2025-11-29 07:44:39.959 187156 DEBUG nova.network.neutron [req-c8a167bf-1626-4a6c-8e58-d547c8918065 req-96c3a46f-fad2-4235-9edd-dcf0afa0f0f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updated VIF entry in instance network info cache for port 341d1042-c21e-4fe5-aacf-9dcf7484f194. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:44:39 np0005539504 nova_compute[187152]: 2025-11-29 07:44:39.960 187156 DEBUG nova.network.neutron [req-c8a167bf-1626-4a6c-8e58-d547c8918065 req-96c3a46f-fad2-4235-9edd-dcf0afa0f0f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updating instance_info_cache with network_info: [{"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:39 np0005539504 nova_compute[187152]: 2025-11-29 07:44:39.984 187156 DEBUG oslo_concurrency.lockutils [req-c8a167bf-1626-4a6c-8e58-d547c8918065 req-96c3a46f-fad2-4235-9edd-dcf0afa0f0f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:44:39 np0005539504 nova_compute[187152]: 2025-11-29 07:44:39.985 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:44:39 np0005539504 nova_compute[187152]: 2025-11-29 07:44:39.985 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:44:39 np0005539504 nova_compute[187152]: 2025-11-29 07:44:39.986 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid b81a6289-ddd4-4126-9a4c-5d697ddb4da1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:44:41 np0005539504 nova_compute[187152]: 2025-11-29 07:44:41.735 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:42 np0005539504 nova_compute[187152]: 2025-11-29 07:44:42.874 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:43 np0005539504 nova_compute[187152]: 2025-11-29 07:44:43.665 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updating instance_info_cache with network_info: [{"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:43 np0005539504 nova_compute[187152]: 2025-11-29 07:44:43.689 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:44:43 np0005539504 nova_compute[187152]: 2025-11-29 07:44:43.689 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:44:43 np0005539504 nova_compute[187152]: 2025-11-29 07:44:43.690 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:43 np0005539504 nova_compute[187152]: 2025-11-29 07:44:43.690 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:43 np0005539504 nova_compute[187152]: 2025-11-29 07:44:43.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.386 187156 DEBUG nova.compute.manager [req-a431d6ea-f852-464b-bed3-39f101845862 req-6fea7b09-3291-41cc-a08f-05882085b394 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received event network-changed-341d1042-c21e-4fe5-aacf-9dcf7484f194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.387 187156 DEBUG nova.compute.manager [req-a431d6ea-f852-464b-bed3-39f101845862 req-6fea7b09-3291-41cc-a08f-05882085b394 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Refreshing instance network info cache due to event network-changed-341d1042-c21e-4fe5-aacf-9dcf7484f194. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.387 187156 DEBUG oslo_concurrency.lockutils [req-a431d6ea-f852-464b-bed3-39f101845862 req-6fea7b09-3291-41cc-a08f-05882085b394 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.387 187156 DEBUG oslo_concurrency.lockutils [req-a431d6ea-f852-464b-bed3-39f101845862 req-6fea7b09-3291-41cc-a08f-05882085b394 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.387 187156 DEBUG nova.network.neutron [req-a431d6ea-f852-464b-bed3-39f101845862 req-6fea7b09-3291-41cc-a08f-05882085b394 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Refreshing network info cache for port 341d1042-c21e-4fe5-aacf-9dcf7484f194 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.467 187156 DEBUG oslo_concurrency.lockutils [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.468 187156 DEBUG oslo_concurrency.lockutils [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.468 187156 DEBUG oslo_concurrency.lockutils [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.468 187156 DEBUG oslo_concurrency.lockutils [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.469 187156 DEBUG oslo_concurrency.lockutils [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.483 187156 INFO nova.compute.manager [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Terminating instance#033[00m
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.493 187156 DEBUG nova.compute.manager [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:44:44 np0005539504 kernel: tap341d1042-c2 (unregistering): left promiscuous mode
Nov 29 02:44:44 np0005539504 NetworkManager[55210]: <info>  [1764402284.5173] device (tap341d1042-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.564 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:44Z|00667|binding|INFO|Releasing lport 341d1042-c21e-4fe5-aacf-9dcf7484f194 from this chassis (sb_readonly=0)
Nov 29 02:44:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:44Z|00668|binding|INFO|Setting lport 341d1042-c21e-4fe5-aacf-9dcf7484f194 down in Southbound
Nov 29 02:44:44 np0005539504 ovn_controller[95182]: 2025-11-29T07:44:44Z|00669|binding|INFO|Removing iface tap341d1042-c2 ovn-installed in OVS
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.566 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.573 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:ac:c3 10.100.0.5 2001:db8:0:1:f816:3eff:fef4:acc3 2001:db8::f816:3eff:fef4:acc3'], port_security=['fa:16:3e:f4:ac:c3 10.100.0.5 2001:db8:0:1:f816:3eff:fef4:acc3 2001:db8::f816:3eff:fef4:acc3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fef4:acc3/64 2001:db8::f816:3eff:fef4:acc3/64', 'neutron:device_id': 'b81a6289-ddd4-4126-9a4c-5d697ddb4da1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fcfda89f-6716-48ad-9493-dabb00233aaf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=341d1042-c21e-4fe5-aacf-9dcf7484f194) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.575 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 341d1042-c21e-4fe5-aacf-9dcf7484f194 in datapath e23e9510-a780-4254-b7f0-36040139e7db unbound from our chassis
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.576 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e23e9510-a780-4254-b7f0-36040139e7db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.578 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ae958852-ee0b-40b7-a4b8-6750f78d3fae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.579 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db namespace which is not needed anymore
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.586 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:44 np0005539504 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000a3.scope: Deactivated successfully.
Nov 29 02:44:44 np0005539504 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000a3.scope: Consumed 14.469s CPU time.
Nov 29 02:44:44 np0005539504 systemd-machined[153423]: Machine qemu-85-instance-000000a3 terminated.
Nov 29 02:44:44 np0005539504 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[246856]: [NOTICE]   (246860) : haproxy version is 2.8.14-c23fe91
Nov 29 02:44:44 np0005539504 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[246856]: [NOTICE]   (246860) : path to executable is /usr/sbin/haproxy
Nov 29 02:44:44 np0005539504 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[246856]: [WARNING]  (246860) : Exiting Master process...
Nov 29 02:44:44 np0005539504 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[246856]: [WARNING]  (246860) : Exiting Master process...
Nov 29 02:44:44 np0005539504 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[246856]: [ALERT]    (246860) : Current worker (246862) exited with code 143 (Terminated)
Nov 29 02:44:44 np0005539504 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[246856]: [WARNING]  (246860) : All workers exited. Exiting... (0)
Nov 29 02:44:44 np0005539504 systemd[1]: libpod-4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa.scope: Deactivated successfully.
Nov 29 02:44:44 np0005539504 podman[247069]: 2025-11-29 07:44:44.709758568 +0000 UTC m=+0.040403666 container died 4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 02:44:44 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa-userdata-shm.mount: Deactivated successfully.
Nov 29 02:44:44 np0005539504 systemd[1]: var-lib-containers-storage-overlay-7b394cc93cb4e3ff1a50799f27bb4f603960f2b46d256128182ae51773a9088b-merged.mount: Deactivated successfully.
Nov 29 02:44:44 np0005539504 podman[247069]: 2025-11-29 07:44:44.747346836 +0000 UTC m=+0.077991934 container cleanup 4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.761 187156 INFO nova.virt.libvirt.driver [-] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Instance destroyed successfully.
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.762 187156 DEBUG nova.objects.instance [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid b81a6289-ddd4-4126-9a4c-5d697ddb4da1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:44:44 np0005539504 systemd[1]: libpod-conmon-4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa.scope: Deactivated successfully.
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.774 187156 DEBUG nova.virt.libvirt.vif [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:44:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1683587647',display_name='tempest-TestGettingAddress-server-1683587647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1683587647',id=163,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCUa75YDe0UFuF3zC2EDX2ybM5J+nbJmodk4MsTI0LH8wgrY+SnJ+ndJio6A3wgT3MOCb4Huw6Ay1X+CWth0nVP7co5Y+kbzpcJTZjopF6Z8gsZ88jWOxRb0FFz1vDfcNA==',key_name='tempest-TestGettingAddress-1094641829',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:44:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-2h59n9hz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:44:19Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=b81a6289-ddd4-4126-9a4c-5d697ddb4da1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.774 187156 DEBUG nova.network.os_vif_util [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.775 187156 DEBUG nova.network.os_vif_util [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:ac:c3,bridge_name='br-int',has_traffic_filtering=True,id=341d1042-c21e-4fe5-aacf-9dcf7484f194,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap341d1042-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.775 187156 DEBUG os_vif [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:ac:c3,bridge_name='br-int',has_traffic_filtering=True,id=341d1042-c21e-4fe5-aacf-9dcf7484f194,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap341d1042-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.778 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.779 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap341d1042-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.780 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.782 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.786 187156 INFO os_vif [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:ac:c3,bridge_name='br-int',has_traffic_filtering=True,id=341d1042-c21e-4fe5-aacf-9dcf7484f194,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap341d1042-c2')
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.787 187156 INFO nova.virt.libvirt.driver [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Deleting instance files /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1_del
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.787 187156 INFO nova.virt.libvirt.driver [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Deletion of /var/lib/nova/instances/b81a6289-ddd4-4126-9a4c-5d697ddb4da1_del complete
Nov 29 02:44:44 np0005539504 podman[247113]: 2025-11-29 07:44:44.808779935 +0000 UTC m=+0.041868015 container remove 4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.814 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[09acb810-d8da-48c6-bfdf-bc6a590dd5bd]: (4, ('Sat Nov 29 07:44:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db (4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa)\n4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa\nSat Nov 29 07:44:44 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db (4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa)\n4a262355ee9a3fc5801ebd5fa32589cd624b1224f0cadf1d7ab60063f8f9a1fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.815 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[50773280-957c-4464-84b8-b9b4175c637a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.817 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape23e9510-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.819 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:44 np0005539504 kernel: tape23e9510-a0: left promiscuous mode
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.831 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.835 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae43289-7b35-4964-b3ff-be9f19cf7d13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.849 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[aea3454d-dab1-499d-b3a8-122d9cdfbac5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.850 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[171f2a7b-3473-4514-a11d-98e6b2f22270]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.867 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2bfa0e25-6e9d-4c7f-812d-e9d7c2efa55f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772086, 'reachable_time': 33198, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247129, 'error': None, 'target': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.870 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 02:44:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:44:44.870 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[84ffd440-9e29-43a8-94b9-4dce23b5e8e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:44:44 np0005539504 systemd[1]: run-netns-ovnmeta\x2de23e9510\x2da780\x2d4254\x2db7f0\x2d36040139e7db.mount: Deactivated successfully.
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.872 187156 INFO nova.compute.manager [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.873 187156 DEBUG oslo.service.loopingcall [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.873 187156 DEBUG nova.compute.manager [-] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 02:44:44 np0005539504 nova_compute[187152]: 2025-11-29 07:44:44.873 187156 DEBUG nova.network.neutron [-] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 02:44:45 np0005539504 nova_compute[187152]: 2025-11-29 07:44:45.771 187156 DEBUG nova.compute.manager [req-7a4b0839-d73b-49cb-9742-e4ab65ad3550 req-ac86836e-fd5e-4c11-a188-5d329e39231f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received event network-vif-unplugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:44:45 np0005539504 nova_compute[187152]: 2025-11-29 07:44:45.772 187156 DEBUG oslo_concurrency.lockutils [req-7a4b0839-d73b-49cb-9742-e4ab65ad3550 req-ac86836e-fd5e-4c11-a188-5d329e39231f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:44:45 np0005539504 nova_compute[187152]: 2025-11-29 07:44:45.772 187156 DEBUG oslo_concurrency.lockutils [req-7a4b0839-d73b-49cb-9742-e4ab65ad3550 req-ac86836e-fd5e-4c11-a188-5d329e39231f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:44:45 np0005539504 nova_compute[187152]: 2025-11-29 07:44:45.772 187156 DEBUG oslo_concurrency.lockutils [req-7a4b0839-d73b-49cb-9742-e4ab65ad3550 req-ac86836e-fd5e-4c11-a188-5d329e39231f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:44:45 np0005539504 nova_compute[187152]: 2025-11-29 07:44:45.772 187156 DEBUG nova.compute.manager [req-7a4b0839-d73b-49cb-9742-e4ab65ad3550 req-ac86836e-fd5e-4c11-a188-5d329e39231f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] No waiting events found dispatching network-vif-unplugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:44:45 np0005539504 nova_compute[187152]: 2025-11-29 07:44:45.773 187156 DEBUG nova.compute.manager [req-7a4b0839-d73b-49cb-9742-e4ab65ad3550 req-ac86836e-fd5e-4c11-a188-5d329e39231f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received event network-vif-unplugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.201 187156 DEBUG nova.network.neutron [-] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.232 187156 INFO nova.compute.manager [-] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Took 1.36 seconds to deallocate network for instance.
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.321 187156 DEBUG oslo_concurrency.lockutils [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.322 187156 DEBUG oslo_concurrency.lockutils [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.381 187156 DEBUG nova.compute.provider_tree [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.395 187156 DEBUG nova.scheduler.client.report [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.416 187156 DEBUG oslo_concurrency.lockutils [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.443 187156 INFO nova.scheduler.client.report [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance b81a6289-ddd4-4126-9a4c-5d697ddb4da1#033[00m
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.507 187156 DEBUG nova.compute.manager [req-c4f19b77-7134-409f-a143-39a19a6b8cbb req-295ba9eb-e692-4303-9aed-f3b032f5462d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received event network-vif-deleted-341d1042-c21e-4fe5-aacf-9dcf7484f194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.554 187156 DEBUG oslo_concurrency.lockutils [None req-50e02eee-ebee-4e0c-9c37-f13b0176bd2b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.719 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:46 np0005539504 nova_compute[187152]: 2025-11-29 07:44:46.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:47 np0005539504 nova_compute[187152]: 2025-11-29 07:44:47.646 187156 DEBUG nova.network.neutron [req-a431d6ea-f852-464b-bed3-39f101845862 req-6fea7b09-3291-41cc-a08f-05882085b394 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updated VIF entry in instance network info cache for port 341d1042-c21e-4fe5-aacf-9dcf7484f194. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:44:47 np0005539504 nova_compute[187152]: 2025-11-29 07:44:47.646 187156 DEBUG nova.network.neutron [req-a431d6ea-f852-464b-bed3-39f101845862 req-6fea7b09-3291-41cc-a08f-05882085b394 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Updating instance_info_cache with network_info: [{"id": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "address": "fa:16:3e:f4:ac:c3", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fef4:acc3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap341d1042-c2", "ovs_interfaceid": "341d1042-c21e-4fe5-aacf-9dcf7484f194", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:47 np0005539504 nova_compute[187152]: 2025-11-29 07:44:47.673 187156 DEBUG oslo_concurrency.lockutils [req-a431d6ea-f852-464b-bed3-39f101845862 req-6fea7b09-3291-41cc-a08f-05882085b394 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b81a6289-ddd4-4126-9a4c-5d697ddb4da1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:44:47 np0005539504 nova_compute[187152]: 2025-11-29 07:44:47.876 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:44:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539504 nova_compute[187152]: 2025-11-29 07:44:48.392 187156 DEBUG nova.compute.manager [req-0614698e-41b2-404b-9c06-3395f4aaf74e req-27ef615d-7ba5-4ed0-945c-9966769b3388 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received event network-vif-plugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:48 np0005539504 nova_compute[187152]: 2025-11-29 07:44:48.392 187156 DEBUG oslo_concurrency.lockutils [req-0614698e-41b2-404b-9c06-3395f4aaf74e req-27ef615d-7ba5-4ed0-945c-9966769b3388 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:48 np0005539504 nova_compute[187152]: 2025-11-29 07:44:48.393 187156 DEBUG oslo_concurrency.lockutils [req-0614698e-41b2-404b-9c06-3395f4aaf74e req-27ef615d-7ba5-4ed0-945c-9966769b3388 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:48 np0005539504 nova_compute[187152]: 2025-11-29 07:44:48.393 187156 DEBUG oslo_concurrency.lockutils [req-0614698e-41b2-404b-9c06-3395f4aaf74e req-27ef615d-7ba5-4ed0-945c-9966769b3388 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b81a6289-ddd4-4126-9a4c-5d697ddb4da1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:48 np0005539504 nova_compute[187152]: 2025-11-29 07:44:48.393 187156 DEBUG nova.compute.manager [req-0614698e-41b2-404b-9c06-3395f4aaf74e req-27ef615d-7ba5-4ed0-945c-9966769b3388 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] No waiting events found dispatching network-vif-plugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:44:48 np0005539504 nova_compute[187152]: 2025-11-29 07:44:48.393 187156 WARNING nova.compute.manager [req-0614698e-41b2-404b-9c06-3395f4aaf74e req-27ef615d-7ba5-4ed0-945c-9966769b3388 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Received unexpected event network-vif-plugged-341d1042-c21e-4fe5-aacf-9dcf7484f194 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:44:49 np0005539504 nova_compute[187152]: 2025-11-29 07:44:49.783 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:51 np0005539504 podman[247131]: 2025-11-29 07:44:51.712516612 +0000 UTC m=+0.056111316 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:44:51 np0005539504 podman[247132]: 2025-11-29 07:44:51.71878859 +0000 UTC m=+0.062237180 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:44:51 np0005539504 podman[247133]: 2025-11-29 07:44:51.745799464 +0000 UTC m=+0.085706780 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:44:52 np0005539504 nova_compute[187152]: 2025-11-29 07:44:52.878 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:54 np0005539504 nova_compute[187152]: 2025-11-29 07:44:54.786 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:55 np0005539504 podman[247192]: 2025-11-29 07:44:55.706302445 +0000 UTC m=+0.054457552 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:44:56 np0005539504 podman[247215]: 2025-11-29 07:44:56.764660471 +0000 UTC m=+0.101779872 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 02:44:57 np0005539504 nova_compute[187152]: 2025-11-29 07:44:57.879 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:59 np0005539504 nova_compute[187152]: 2025-11-29 07:44:59.503 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:59 np0005539504 nova_compute[187152]: 2025-11-29 07:44:59.678 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:59 np0005539504 nova_compute[187152]: 2025-11-29 07:44:59.762 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402284.7604446, b81a6289-ddd4-4126-9a4c-5d697ddb4da1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:44:59 np0005539504 nova_compute[187152]: 2025-11-29 07:44:59.762 187156 INFO nova.compute.manager [-] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:44:59 np0005539504 nova_compute[187152]: 2025-11-29 07:44:59.788 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:59 np0005539504 nova_compute[187152]: 2025-11-29 07:44:59.795 187156 DEBUG nova.compute.manager [None req-33af0137-8923-4fbe-8a29-a3eeab751153 - - - - - -] [instance: b81a6289-ddd4-4126-9a4c-5d697ddb4da1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:45:02 np0005539504 nova_compute[187152]: 2025-11-29 07:45:02.881 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:04 np0005539504 podman[247244]: 2025-11-29 07:45:04.733444793 +0000 UTC m=+0.076834232 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 29 02:45:04 np0005539504 nova_compute[187152]: 2025-11-29 07:45:04.792 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:07 np0005539504 nova_compute[187152]: 2025-11-29 07:45:07.883 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:09 np0005539504 nova_compute[187152]: 2025-11-29 07:45:09.795 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:10 np0005539504 podman[247266]: 2025-11-29 07:45:10.132232173 +0000 UTC m=+0.088965448 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:45:12 np0005539504 nova_compute[187152]: 2025-11-29 07:45:12.885 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:13 np0005539504 nova_compute[187152]: 2025-11-29 07:45:13.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:14.442 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:45:14 np0005539504 nova_compute[187152]: 2025-11-29 07:45:14.443 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:14.444 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:45:14 np0005539504 nova_compute[187152]: 2025-11-29 07:45:14.798 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:16 np0005539504 nova_compute[187152]: 2025-11-29 07:45:16.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:17 np0005539504 nova_compute[187152]: 2025-11-29 07:45:17.887 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:18.445 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:19 np0005539504 nova_compute[187152]: 2025-11-29 07:45:19.801 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:22 np0005539504 podman[247289]: 2025-11-29 07:45:22.705805031 +0000 UTC m=+0.051320427 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:45:22 np0005539504 podman[247291]: 2025-11-29 07:45:22.710308693 +0000 UTC m=+0.048294027 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:45:22 np0005539504 podman[247290]: 2025-11-29 07:45:22.739890856 +0000 UTC m=+0.081228880 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:45:22 np0005539504 nova_compute[187152]: 2025-11-29 07:45:22.890 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:23.483 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:23.483 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:23.483 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:23 np0005539504 nova_compute[187152]: 2025-11-29 07:45:23.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:24 np0005539504 nova_compute[187152]: 2025-11-29 07:45:24.805 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:26 np0005539504 podman[247353]: 2025-11-29 07:45:26.747366457 +0000 UTC m=+0.091323871 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:45:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:27.101 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:bd:9f 10.100.0.2 2001:db8::f816:3eff:fee1:bd9f'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee1:bd9f/64', 'neutron:device_id': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=045a9acc-370f-460b-b7b5-7c57bd647b8b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=27074c74-d81e-4dc1-9e05-b59b6b9a0624) old=Port_Binding(mac=['fa:16:3e:e1:bd:9f 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:45:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:27.103 104164 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 27074c74-d81e-4dc1-9e05-b59b6b9a0624 in datapath 7b412a37-c227-42ad-9fca-23287613486a updated#033[00m
Nov 29 02:45:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:27.105 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b412a37-c227-42ad-9fca-23287613486a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:45:27 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:45:27.106 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[860b9d1d-85f3-4057-8962-bac600a05923]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:27 np0005539504 podman[247378]: 2025-11-29 07:45:27.754513098 +0000 UTC m=+0.092014239 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 02:45:27 np0005539504 nova_compute[187152]: 2025-11-29 07:45:27.892 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:27 np0005539504 nova_compute[187152]: 2025-11-29 07:45:27.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:27 np0005539504 nova_compute[187152]: 2025-11-29 07:45:27.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:45:29 np0005539504 nova_compute[187152]: 2025-11-29 07:45:29.810 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:31 np0005539504 nova_compute[187152]: 2025-11-29 07:45:31.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:32 np0005539504 nova_compute[187152]: 2025-11-29 07:45:32.894 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.535 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.536 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.536 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.537 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.684 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.686 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5715MB free_disk=73.0064582824707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.686 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.686 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.813 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.849 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:45:34 np0005539504 nova_compute[187152]: 2025-11-29 07:45:34.850 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:45:35 np0005539504 nova_compute[187152]: 2025-11-29 07:45:35.290 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:45:35 np0005539504 nova_compute[187152]: 2025-11-29 07:45:35.604 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:45:35 np0005539504 nova_compute[187152]: 2025-11-29 07:45:35.696 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:45:35 np0005539504 nova_compute[187152]: 2025-11-29 07:45:35.696 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:35 np0005539504 podman[247407]: 2025-11-29 07:45:35.7210177 +0000 UTC m=+0.063835984 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:45:36 np0005539504 nova_compute[187152]: 2025-11-29 07:45:36.698 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:36 np0005539504 nova_compute[187152]: 2025-11-29 07:45:36.698 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:45:36 np0005539504 nova_compute[187152]: 2025-11-29 07:45:36.698 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:45:37 np0005539504 nova_compute[187152]: 2025-11-29 07:45:37.896 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:38 np0005539504 nova_compute[187152]: 2025-11-29 07:45:38.863 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:45:38 np0005539504 nova_compute[187152]: 2025-11-29 07:45:38.863 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:38 np0005539504 nova_compute[187152]: 2025-11-29 07:45:38.863 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:39 np0005539504 nova_compute[187152]: 2025-11-29 07:45:39.814 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.457 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "1252d970-730d-4caf-8395-8124fb080e82" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.458 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.502 187156 DEBUG nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.637 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.637 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.645 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.645 187156 INFO nova.compute.claims [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:45:40 np0005539504 podman[247427]: 2025-11-29 07:45:40.723316732 +0000 UTC m=+0.057241927 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.860 187156 DEBUG nova.compute.provider_tree [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.881 187156 DEBUG nova.scheduler.client.report [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.918 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:40 np0005539504 nova_compute[187152]: 2025-11-29 07:45:40.919 187156 DEBUG nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:45:42 np0005539504 nova_compute[187152]: 2025-11-29 07:45:42.897 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:44 np0005539504 nova_compute[187152]: 2025-11-29 07:45:44.818 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:44 np0005539504 nova_compute[187152]: 2025-11-29 07:45:44.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:45 np0005539504 nova_compute[187152]: 2025-11-29 07:45:45.952 187156 DEBUG nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:45:45 np0005539504 nova_compute[187152]: 2025-11-29 07:45:45.953 187156 DEBUG nova.network.neutron [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:45:47 np0005539504 nova_compute[187152]: 2025-11-29 07:45:47.898 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:49 np0005539504 nova_compute[187152]: 2025-11-29 07:45:49.822 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:51 np0005539504 nova_compute[187152]: 2025-11-29 07:45:51.343 187156 INFO nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:45:51 np0005539504 nova_compute[187152]: 2025-11-29 07:45:51.473 187156 DEBUG nova.policy [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:45:51 np0005539504 nova_compute[187152]: 2025-11-29 07:45:51.763 187156 DEBUG nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:45:52 np0005539504 nova_compute[187152]: 2025-11-29 07:45:52.900 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:53 np0005539504 podman[247455]: 2025-11-29 07:45:53.710626612 +0000 UTC m=+0.058178421 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-type=git)
Nov 29 02:45:53 np0005539504 podman[247454]: 2025-11-29 07:45:53.728275326 +0000 UTC m=+0.078760563 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:45:53 np0005539504 podman[247456]: 2025-11-29 07:45:53.737235036 +0000 UTC m=+0.079328098 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 29 02:45:54 np0005539504 nova_compute[187152]: 2025-11-29 07:45:54.826 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:55 np0005539504 ovn_controller[95182]: 2025-11-29T07:45:55Z|00670|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Nov 29 02:45:57 np0005539504 podman[247518]: 2025-11-29 07:45:57.718142365 +0000 UTC m=+0.059827076 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:45:57 np0005539504 nova_compute[187152]: 2025-11-29 07:45:57.901 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:58 np0005539504 podman[247542]: 2025-11-29 07:45:58.759036502 +0000 UTC m=+0.095725909 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:45:59 np0005539504 nova_compute[187152]: 2025-11-29 07:45:59.830 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.721 187156 DEBUG nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.722 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.723 187156 INFO nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Creating image(s)#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.724 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.724 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.725 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.739 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.811 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.812 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.813 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.824 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.886 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.888 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.933 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.935 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.936 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.997 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.998 187156 DEBUG nova.virt.disk.api [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:46:00 np0005539504 nova_compute[187152]: 2025-11-29 07:46:00.999 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:01 np0005539504 nova_compute[187152]: 2025-11-29 07:46:01.056 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:01 np0005539504 nova_compute[187152]: 2025-11-29 07:46:01.057 187156 DEBUG nova.virt.disk.api [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:46:01 np0005539504 nova_compute[187152]: 2025-11-29 07:46:01.057 187156 DEBUG nova.objects.instance [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 1252d970-730d-4caf-8395-8124fb080e82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:46:02 np0005539504 nova_compute[187152]: 2025-11-29 07:46:02.902 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:03.067 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:46:03 np0005539504 nova_compute[187152]: 2025-11-29 07:46:03.067 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:03 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:03.069 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:46:03 np0005539504 nova_compute[187152]: 2025-11-29 07:46:03.110 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:46:03 np0005539504 nova_compute[187152]: 2025-11-29 07:46:03.110 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Ensure instance console log exists: /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:46:03 np0005539504 nova_compute[187152]: 2025-11-29 07:46:03.111 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:03 np0005539504 nova_compute[187152]: 2025-11-29 07:46:03.111 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:03 np0005539504 nova_compute[187152]: 2025-11-29 07:46:03.112 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:03 np0005539504 nova_compute[187152]: 2025-11-29 07:46:03.679 187156 DEBUG nova.network.neutron [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Successfully created port: 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:46:04 np0005539504 nova_compute[187152]: 2025-11-29 07:46:04.833 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:06 np0005539504 podman[247583]: 2025-11-29 07:46:06.724777742 +0000 UTC m=+0.060039032 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:46:07 np0005539504 nova_compute[187152]: 2025-11-29 07:46:07.686 187156 DEBUG nova.network.neutron [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Successfully updated port: 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:46:07 np0005539504 nova_compute[187152]: 2025-11-29 07:46:07.904 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:09 np0005539504 nova_compute[187152]: 2025-11-29 07:46:09.837 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:10 np0005539504 nova_compute[187152]: 2025-11-29 07:46:10.934 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:46:10 np0005539504 nova_compute[187152]: 2025-11-29 07:46:10.934 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:46:10 np0005539504 nova_compute[187152]: 2025-11-29 07:46:10.934 187156 DEBUG nova.network.neutron [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:46:11 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:11.072 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:11 np0005539504 nova_compute[187152]: 2025-11-29 07:46:11.420 187156 DEBUG nova.compute.manager [req-abc4cf06-b57b-42ed-a62b-f331e969e3d0 req-431de9fc-dddc-4448-8e3a-b21efa5636b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received event network-changed-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:46:11 np0005539504 nova_compute[187152]: 2025-11-29 07:46:11.420 187156 DEBUG nova.compute.manager [req-abc4cf06-b57b-42ed-a62b-f331e969e3d0 req-431de9fc-dddc-4448-8e3a-b21efa5636b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Refreshing instance network info cache due to event network-changed-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:46:11 np0005539504 nova_compute[187152]: 2025-11-29 07:46:11.421 187156 DEBUG oslo_concurrency.lockutils [req-abc4cf06-b57b-42ed-a62b-f331e969e3d0 req-431de9fc-dddc-4448-8e3a-b21efa5636b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:46:11 np0005539504 podman[247605]: 2025-11-29 07:46:11.709326288 +0000 UTC m=+0.057972466 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:46:11 np0005539504 nova_compute[187152]: 2025-11-29 07:46:11.715 187156 DEBUG nova.network.neutron [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:46:12 np0005539504 nova_compute[187152]: 2025-11-29 07:46:12.906 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.939 187156 DEBUG nova.network.neutron [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updating instance_info_cache with network_info: [{"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.964 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.964 187156 DEBUG nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Instance network_info: |[{"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.965 187156 DEBUG oslo_concurrency.lockutils [req-abc4cf06-b57b-42ed-a62b-f331e969e3d0 req-431de9fc-dddc-4448-8e3a-b21efa5636b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.965 187156 DEBUG nova.network.neutron [req-abc4cf06-b57b-42ed-a62b-f331e969e3d0 req-431de9fc-dddc-4448-8e3a-b21efa5636b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Refreshing network info cache for port 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.969 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Start _get_guest_xml network_info=[{"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.974 187156 WARNING nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.984 187156 DEBUG nova.virt.libvirt.host [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.985 187156 DEBUG nova.virt.libvirt.host [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.993 187156 DEBUG nova.virt.libvirt.host [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.994 187156 DEBUG nova.virt.libvirt.host [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.995 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.995 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.995 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.996 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.996 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.996 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.996 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.996 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.997 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.997 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.997 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:46:13 np0005539504 nova_compute[187152]: 2025-11-29 07:46:13.998 187156 DEBUG nova.virt.hardware [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.002 187156 DEBUG nova.virt.libvirt.vif [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:45:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1708707331',display_name='tempest-TestGettingAddress-server-1708707331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1708707331',id=166,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRqMqB2m2OgDWFFhqrQomXOlqtsH3DkZX/q3f9H1IQ0ObpMW22Tv9hUlgFTK1dOU/3/nBcrYC6MrtFKRuaytFioBJb/QmB1UkysPgqPE38bvZ1GGYFs/tP1vPRobYVgdQ==',key_name='tempest-TestGettingAddress-277961909',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-t1tym0mr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:45:55Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=1252d970-730d-4caf-8395-8124fb080e82,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.002 187156 DEBUG nova.network.os_vif_util [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.003 187156 DEBUG nova.network.os_vif_util [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:e7:88,bridge_name='br-int',has_traffic_filtering=True,id=6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee81ad6-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.004 187156 DEBUG nova.objects.instance [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 1252d970-730d-4caf-8395-8124fb080e82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.275 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <uuid>1252d970-730d-4caf-8395-8124fb080e82</uuid>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <name>instance-000000a6</name>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestGettingAddress-server-1708707331</nova:name>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:46:13</nova:creationTime>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:        <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:        <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:        <nova:port uuid="6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe9a:e788" ipVersion="6"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <entry name="serial">1252d970-730d-4caf-8395-8124fb080e82</entry>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <entry name="uuid">1252d970-730d-4caf-8395-8124fb080e82</entry>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk.config"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:9a:e7:88"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <target dev="tap6ee81ad6-6e"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/console.log" append="off"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:46:14 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:46:14 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:46:14 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:46:14 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.277 187156 DEBUG nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Preparing to wait for external event network-vif-plugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.277 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "1252d970-730d-4caf-8395-8124fb080e82-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.277 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.278 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.278 187156 DEBUG nova.virt.libvirt.vif [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:45:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1708707331',display_name='tempest-TestGettingAddress-server-1708707331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1708707331',id=166,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRqMqB2m2OgDWFFhqrQomXOlqtsH3DkZX/q3f9H1IQ0ObpMW22Tv9hUlgFTK1dOU/3/nBcrYC6MrtFKRuaytFioBJb/QmB1UkysPgqPE38bvZ1GGYFs/tP1vPRobYVgdQ==',key_name='tempest-TestGettingAddress-277961909',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-t1tym0mr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:45:55Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=1252d970-730d-4caf-8395-8124fb080e82,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.279 187156 DEBUG nova.network.os_vif_util [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.280 187156 DEBUG nova.network.os_vif_util [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:e7:88,bridge_name='br-int',has_traffic_filtering=True,id=6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee81ad6-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.280 187156 DEBUG os_vif [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:e7:88,bridge_name='br-int',has_traffic_filtering=True,id=6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee81ad6-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.281 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.281 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.281 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.286 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.286 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ee81ad6-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.287 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6ee81ad6-6e, col_values=(('external_ids', {'iface-id': '6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:e7:88', 'vm-uuid': '1252d970-730d-4caf-8395-8124fb080e82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.333 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:14 np0005539504 NetworkManager[55210]: <info>  [1764402374.3345] manager: (tap6ee81ad6-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.335 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.340 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.341 187156 INFO os_vif [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:e7:88,bridge_name='br-int',has_traffic_filtering=True,id=6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee81ad6-6e')#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.413 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.413 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.414 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:9a:e7:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:46:14 np0005539504 nova_compute[187152]: 2025-11-29 07:46:14.414 187156 INFO nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Using config drive#033[00m
Nov 29 02:46:15 np0005539504 nova_compute[187152]: 2025-11-29 07:46:15.586 187156 INFO nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Creating config drive at /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk.config#033[00m
Nov 29 02:46:15 np0005539504 nova_compute[187152]: 2025-11-29 07:46:15.591 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgvpenqz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:15 np0005539504 nova_compute[187152]: 2025-11-29 07:46:15.719 187156 DEBUG oslo_concurrency.processutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgvpenqz" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:15 np0005539504 kernel: tap6ee81ad6-6e: entered promiscuous mode
Nov 29 02:46:15 np0005539504 NetworkManager[55210]: <info>  [1764402375.7947] manager: (tap6ee81ad6-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/299)
Nov 29 02:46:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:46:15Z|00671|binding|INFO|Claiming lport 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce for this chassis.
Nov 29 02:46:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:46:15Z|00672|binding|INFO|6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce: Claiming fa:16:3e:9a:e7:88 10.100.0.12 2001:db8::f816:3eff:fe9a:e788
Nov 29 02:46:15 np0005539504 nova_compute[187152]: 2025-11-29 07:46:15.796 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:15 np0005539504 nova_compute[187152]: 2025-11-29 07:46:15.798 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:15 np0005539504 nova_compute[187152]: 2025-11-29 07:46:15.803 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.814 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:e7:88 10.100.0.12 2001:db8::f816:3eff:fe9a:e788'], port_security=['fa:16:3e:9a:e7:88 10.100.0.12 2001:db8::f816:3eff:fe9a:e788'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8::f816:3eff:fe9a:e788/64', 'neutron:device_id': '1252d970-730d-4caf-8395-8124fb080e82', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0e7f9e0d-709d-40ea-bb38-80f3b9bd57d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=045a9acc-370f-460b-b7b5-7c57bd647b8b, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.816 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce in datapath 7b412a37-c227-42ad-9fca-23287613486a bound to our chassis#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.817 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b412a37-c227-42ad-9fca-23287613486a#033[00m
Nov 29 02:46:15 np0005539504 systemd-udevd[247647]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.830 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9080aafe-0861-45d0-b4f2-a3a9da4d380c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.831 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b412a37-c1 in ovnmeta-7b412a37-c227-42ad-9fca-23287613486a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.834 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b412a37-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.835 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[279258d7-7eaf-4f9e-8aa8-86243cae6a4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.835 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0a49a01d-9ca5-4722-90cc-86ddb1fe9a3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:15 np0005539504 systemd-machined[153423]: New machine qemu-86-instance-000000a6.
Nov 29 02:46:15 np0005539504 NetworkManager[55210]: <info>  [1764402375.8438] device (tap6ee81ad6-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:46:15 np0005539504 NetworkManager[55210]: <info>  [1764402375.8452] device (tap6ee81ad6-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.851 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[465ae754-2bdf-4912-9366-c633d5cdab74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:15 np0005539504 systemd[1]: Started Virtual Machine qemu-86-instance-000000a6.
Nov 29 02:46:15 np0005539504 nova_compute[187152]: 2025-11-29 07:46:15.875 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:46:15Z|00673|binding|INFO|Setting lport 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce ovn-installed in OVS
Nov 29 02:46:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:46:15Z|00674|binding|INFO|Setting lport 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce up in Southbound
Nov 29 02:46:15 np0005539504 nova_compute[187152]: 2025-11-29 07:46:15.879 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.882 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[be04e43c-c5b7-4f4f-b7fa-734eaca71190]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.920 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[6a425f34-f01c-46a0-8eba-b603fd422b75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.925 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2e1638-64bb-4492-b52b-a2fdd2df77b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:15 np0005539504 NetworkManager[55210]: <info>  [1764402375.9262] manager: (tap7b412a37-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/300)
Nov 29 02:46:15 np0005539504 systemd-udevd[247650]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.957 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[95a1b2d7-5c80-4b69-8e2b-407646e5791b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.961 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[2495c6df-0f65-454a-a7ab-e2d7d67cad67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:15 np0005539504 NetworkManager[55210]: <info>  [1764402375.9851] device (tap7b412a37-c0): carrier: link connected
Nov 29 02:46:15 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:15.990 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0d13cc-21f7-4d13-89ad-d78562e3c6ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.006 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[11bc4525-5e1d-46f7-aaf8-340cb3a5fbfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b412a37-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:bd:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 783886, 'reachable_time': 36112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247679, 'error': None, 'target': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.027 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[22e73380-b3f8-4770-a944-965f3e5306ba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee1:bd9f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 783886, 'tstamp': 783886}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247680, 'error': None, 'target': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.044 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f8e4e689-efde-4da4-83f0-69050cba88d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b412a37-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:bd:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 783886, 'reachable_time': 36112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247681, 'error': None, 'target': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.082 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[700ea3aa-dde1-45a0-934e-7b6f4029c645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.154 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b85b79b5-f2e3-401b-889a-667bda66dc9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.156 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b412a37-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.157 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.157 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b412a37-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:16 np0005539504 NetworkManager[55210]: <info>  [1764402376.1609] manager: (tap7b412a37-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Nov 29 02:46:16 np0005539504 kernel: tap7b412a37-c0: entered promiscuous mode
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.160 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.163 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b412a37-c0, col_values=(('external_ids', {'iface-id': '27074c74-d81e-4dc1-9e05-b59b6b9a0624'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.165 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:16 np0005539504 ovn_controller[95182]: 2025-11-29T07:46:16Z|00675|binding|INFO|Releasing lport 27074c74-d81e-4dc1-9e05-b59b6b9a0624 from this chassis (sb_readonly=0)
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.165 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.168 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b412a37-c227-42ad-9fca-23287613486a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b412a37-c227-42ad-9fca-23287613486a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.169 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6c82c3d8-4f81-46b9-ad93-4ed97cee15c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.170 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-7b412a37-c227-42ad-9fca-23287613486a
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/7b412a37-c227-42ad-9fca-23287613486a.pid.haproxy
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 7b412a37-c227-42ad-9fca-23287613486a
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:46:16 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:16.172 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'env', 'PROCESS_TAG=haproxy-7b412a37-c227-42ad-9fca-23287613486a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b412a37-c227-42ad-9fca-23287613486a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.176 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.225 187156 DEBUG nova.compute.manager [req-64444c13-5d2f-499b-ac28-453c2f896497 req-f1cfd6da-941a-4f24-97ac-0f7be3af434b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received event network-vif-plugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.225 187156 DEBUG oslo_concurrency.lockutils [req-64444c13-5d2f-499b-ac28-453c2f896497 req-f1cfd6da-941a-4f24-97ac-0f7be3af434b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1252d970-730d-4caf-8395-8124fb080e82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.226 187156 DEBUG oslo_concurrency.lockutils [req-64444c13-5d2f-499b-ac28-453c2f896497 req-f1cfd6da-941a-4f24-97ac-0f7be3af434b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.226 187156 DEBUG oslo_concurrency.lockutils [req-64444c13-5d2f-499b-ac28-453c2f896497 req-f1cfd6da-941a-4f24-97ac-0f7be3af434b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.226 187156 DEBUG nova.compute.manager [req-64444c13-5d2f-499b-ac28-453c2f896497 req-f1cfd6da-941a-4f24-97ac-0f7be3af434b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Processing event network-vif-plugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.431 187156 DEBUG nova.network.neutron [req-abc4cf06-b57b-42ed-a62b-f331e969e3d0 req-431de9fc-dddc-4448-8e3a-b21efa5636b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updated VIF entry in instance network info cache for port 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.431 187156 DEBUG nova.network.neutron [req-abc4cf06-b57b-42ed-a62b-f331e969e3d0 req-431de9fc-dddc-4448-8e3a-b21efa5636b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updating instance_info_cache with network_info: [{"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.456 187156 DEBUG oslo_concurrency.lockutils [req-abc4cf06-b57b-42ed-a62b-f331e969e3d0 req-431de9fc-dddc-4448-8e3a-b21efa5636b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:46:16 np0005539504 podman[247715]: 2025-11-29 07:46:16.524282553 +0000 UTC m=+0.055074518 container create cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:46:16 np0005539504 systemd[1]: Started libpod-conmon-cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc.scope.
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.556 187156 DEBUG nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.557 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402376.5551, 1252d970-730d-4caf-8395-8124fb080e82 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.557 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] VM Started (Lifecycle Event)#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.568 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.573 187156 INFO nova.virt.libvirt.driver [-] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Instance spawned successfully.#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.573 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:46:16 np0005539504 podman[247715]: 2025-11-29 07:46:16.492618464 +0000 UTC m=+0.023410449 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:46:16 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.593 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:46:16 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07cc466736b304272ea8cc51551eb7a5a281696c9d199f31533e68705f027b71/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.600 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:46:16 np0005539504 podman[247715]: 2025-11-29 07:46:16.610008693 +0000 UTC m=+0.140800768 container init cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.611 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.612 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.612 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.613 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.613 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.614 187156 DEBUG nova.virt.libvirt.driver [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:46:16 np0005539504 podman[247715]: 2025-11-29 07:46:16.616321853 +0000 UTC m=+0.147113818 container start cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 29 02:46:16 np0005539504 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[247735]: [NOTICE]   (247739) : New worker (247741) forked
Nov 29 02:46:16 np0005539504 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[247735]: [NOTICE]   (247739) : Loading success.
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.641 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.642 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402376.555511, 1252d970-730d-4caf-8395-8124fb080e82 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.643 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.676 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.679 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402376.5619335, 1252d970-730d-4caf-8395-8124fb080e82 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.680 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.709 187156 INFO nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Took 15.99 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.714 187156 DEBUG nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.716 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.723 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.766 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.821 187156 INFO nova.compute.manager [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Took 36.23 seconds to build instance.#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.866 187156 DEBUG oslo_concurrency.lockutils [None req-faeeb8d8-1b8d-4c78-8ebe-8e8a5f529e00 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 36.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:16 np0005539504 nova_compute[187152]: 2025-11-29 07:46:16.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:17 np0005539504 nova_compute[187152]: 2025-11-29 07:46:17.908 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:18 np0005539504 nova_compute[187152]: 2025-11-29 07:46:18.364 187156 DEBUG nova.compute.manager [req-eda7513f-6db3-48a0-9bd9-c34edad9032b req-0c86387e-287a-4000-8f51-f886376c7010 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received event network-vif-plugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:46:18 np0005539504 nova_compute[187152]: 2025-11-29 07:46:18.365 187156 DEBUG oslo_concurrency.lockutils [req-eda7513f-6db3-48a0-9bd9-c34edad9032b req-0c86387e-287a-4000-8f51-f886376c7010 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1252d970-730d-4caf-8395-8124fb080e82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:18 np0005539504 nova_compute[187152]: 2025-11-29 07:46:18.366 187156 DEBUG oslo_concurrency.lockutils [req-eda7513f-6db3-48a0-9bd9-c34edad9032b req-0c86387e-287a-4000-8f51-f886376c7010 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:18 np0005539504 nova_compute[187152]: 2025-11-29 07:46:18.366 187156 DEBUG oslo_concurrency.lockutils [req-eda7513f-6db3-48a0-9bd9-c34edad9032b req-0c86387e-287a-4000-8f51-f886376c7010 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:18 np0005539504 nova_compute[187152]: 2025-11-29 07:46:18.366 187156 DEBUG nova.compute.manager [req-eda7513f-6db3-48a0-9bd9-c34edad9032b req-0c86387e-287a-4000-8f51-f886376c7010 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] No waiting events found dispatching network-vif-plugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:46:18 np0005539504 nova_compute[187152]: 2025-11-29 07:46:18.366 187156 WARNING nova.compute.manager [req-eda7513f-6db3-48a0-9bd9-c34edad9032b req-0c86387e-287a-4000-8f51-f886376c7010 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received unexpected event network-vif-plugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce for instance with vm_state active and task_state None.#033[00m
Nov 29 02:46:19 np0005539504 nova_compute[187152]: 2025-11-29 07:46:19.335 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:21 np0005539504 nova_compute[187152]: 2025-11-29 07:46:21.016 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:21 np0005539504 NetworkManager[55210]: <info>  [1764402381.0179] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Nov 29 02:46:21 np0005539504 NetworkManager[55210]: <info>  [1764402381.0189] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Nov 29 02:46:21 np0005539504 nova_compute[187152]: 2025-11-29 07:46:21.142 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:46:21Z|00676|binding|INFO|Releasing lport 27074c74-d81e-4dc1-9e05-b59b6b9a0624 from this chassis (sb_readonly=0)
Nov 29 02:46:21 np0005539504 nova_compute[187152]: 2025-11-29 07:46:21.158 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:22 np0005539504 nova_compute[187152]: 2025-11-29 07:46:22.106 187156 DEBUG nova.compute.manager [req-3523dba0-82ef-492f-9dcb-e05e69650f8e req-e5c20d52-c24c-495c-9ff9-bcbd746e7a80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received event network-changed-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:46:22 np0005539504 nova_compute[187152]: 2025-11-29 07:46:22.108 187156 DEBUG nova.compute.manager [req-3523dba0-82ef-492f-9dcb-e05e69650f8e req-e5c20d52-c24c-495c-9ff9-bcbd746e7a80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Refreshing instance network info cache due to event network-changed-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:46:22 np0005539504 nova_compute[187152]: 2025-11-29 07:46:22.108 187156 DEBUG oslo_concurrency.lockutils [req-3523dba0-82ef-492f-9dcb-e05e69650f8e req-e5c20d52-c24c-495c-9ff9-bcbd746e7a80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:46:22 np0005539504 nova_compute[187152]: 2025-11-29 07:46:22.109 187156 DEBUG oslo_concurrency.lockutils [req-3523dba0-82ef-492f-9dcb-e05e69650f8e req-e5c20d52-c24c-495c-9ff9-bcbd746e7a80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:46:22 np0005539504 nova_compute[187152]: 2025-11-29 07:46:22.109 187156 DEBUG nova.network.neutron [req-3523dba0-82ef-492f-9dcb-e05e69650f8e req-e5c20d52-c24c-495c-9ff9-bcbd746e7a80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Refreshing network info cache for port 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:46:22 np0005539504 ovn_controller[95182]: 2025-11-29T07:46:22Z|00677|binding|INFO|Releasing lport 27074c74-d81e-4dc1-9e05-b59b6b9a0624 from this chassis (sb_readonly=0)
Nov 29 02:46:22 np0005539504 nova_compute[187152]: 2025-11-29 07:46:22.852 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:22 np0005539504 nova_compute[187152]: 2025-11-29 07:46:22.911 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:23.484 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:23.485 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:46:23.485 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:24 np0005539504 nova_compute[187152]: 2025-11-29 07:46:24.339 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:24 np0005539504 nova_compute[187152]: 2025-11-29 07:46:24.385 187156 DEBUG nova.network.neutron [req-3523dba0-82ef-492f-9dcb-e05e69650f8e req-e5c20d52-c24c-495c-9ff9-bcbd746e7a80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updated VIF entry in instance network info cache for port 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:46:24 np0005539504 nova_compute[187152]: 2025-11-29 07:46:24.386 187156 DEBUG nova.network.neutron [req-3523dba0-82ef-492f-9dcb-e05e69650f8e req-e5c20d52-c24c-495c-9ff9-bcbd746e7a80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updating instance_info_cache with network_info: [{"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:46:24 np0005539504 nova_compute[187152]: 2025-11-29 07:46:24.431 187156 DEBUG oslo_concurrency.lockutils [req-3523dba0-82ef-492f-9dcb-e05e69650f8e req-e5c20d52-c24c-495c-9ff9-bcbd746e7a80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:46:24 np0005539504 podman[247751]: 2025-11-29 07:46:24.745429137 +0000 UTC m=+0.067951984 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:46:24 np0005539504 podman[247752]: 2025-11-29 07:46:24.747248966 +0000 UTC m=+0.069776783 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:46:24 np0005539504 podman[247753]: 2025-11-29 07:46:24.773597743 +0000 UTC m=+0.095906265 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:46:25 np0005539504 nova_compute[187152]: 2025-11-29 07:46:25.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:27 np0005539504 nova_compute[187152]: 2025-11-29 07:46:27.914 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:27 np0005539504 nova_compute[187152]: 2025-11-29 07:46:27.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:27 np0005539504 nova_compute[187152]: 2025-11-29 07:46:27.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:46:28 np0005539504 podman[247825]: 2025-11-29 07:46:28.75969026 +0000 UTC m=+0.070911674 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:46:28 np0005539504 nova_compute[187152]: 2025-11-29 07:46:28.892 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:29 np0005539504 nova_compute[187152]: 2025-11-29 07:46:29.343 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:29 np0005539504 ovn_controller[95182]: 2025-11-29T07:46:29Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9a:e7:88 10.100.0.12
Nov 29 02:46:29 np0005539504 ovn_controller[95182]: 2025-11-29T07:46:29Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:e7:88 10.100.0.12
Nov 29 02:46:29 np0005539504 podman[247851]: 2025-11-29 07:46:29.852616863 +0000 UTC m=+0.186494345 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:46:31 np0005539504 nova_compute[187152]: 2025-11-29 07:46:31.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:32 np0005539504 nova_compute[187152]: 2025-11-29 07:46:32.593 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:32 np0005539504 nova_compute[187152]: 2025-11-29 07:46:32.593 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:32 np0005539504 nova_compute[187152]: 2025-11-29 07:46:32.593 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:32 np0005539504 nova_compute[187152]: 2025-11-29 07:46:32.594 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:46:32 np0005539504 nova_compute[187152]: 2025-11-29 07:46:32.707 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:32 np0005539504 nova_compute[187152]: 2025-11-29 07:46:32.776 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:32 np0005539504 nova_compute[187152]: 2025-11-29 07:46:32.778 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:32 np0005539504 nova_compute[187152]: 2025-11-29 07:46:32.850 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:32 np0005539504 nova_compute[187152]: 2025-11-29 07:46:32.915 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.033 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.034 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5523MB free_disk=72.97777557373047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.034 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.035 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.114 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 1252d970-730d-4caf-8395-8124fb080e82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.114 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.116 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.133 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.149 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.150 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.166 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.207 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.310 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.327 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.400 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:46:33 np0005539504 nova_compute[187152]: 2025-11-29 07:46:33.400 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:34 np0005539504 nova_compute[187152]: 2025-11-29 07:46:34.383 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:36 np0005539504 nova_compute[187152]: 2025-11-29 07:46:36.402 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:36 np0005539504 nova_compute[187152]: 2025-11-29 07:46:36.403 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:46:36 np0005539504 nova_compute[187152]: 2025-11-29 07:46:36.403 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:46:36 np0005539504 nova_compute[187152]: 2025-11-29 07:46:36.721 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:46:36 np0005539504 nova_compute[187152]: 2025-11-29 07:46:36.722 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:46:36 np0005539504 nova_compute[187152]: 2025-11-29 07:46:36.722 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:46:36 np0005539504 nova_compute[187152]: 2025-11-29 07:46:36.722 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1252d970-730d-4caf-8395-8124fb080e82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:46:37 np0005539504 podman[247885]: 2025-11-29 07:46:37.71088307 +0000 UTC m=+0.057449742 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 02:46:37 np0005539504 nova_compute[187152]: 2025-11-29 07:46:37.917 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:38 np0005539504 nova_compute[187152]: 2025-11-29 07:46:38.195 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:39 np0005539504 nova_compute[187152]: 2025-11-29 07:46:39.387 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:42 np0005539504 nova_compute[187152]: 2025-11-29 07:46:42.201 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updating instance_info_cache with network_info: [{"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:46:42 np0005539504 nova_compute[187152]: 2025-11-29 07:46:42.336 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:46:42 np0005539504 nova_compute[187152]: 2025-11-29 07:46:42.336 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:46:42 np0005539504 nova_compute[187152]: 2025-11-29 07:46:42.337 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:42 np0005539504 nova_compute[187152]: 2025-11-29 07:46:42.337 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:42 np0005539504 podman[247905]: 2025-11-29 07:46:42.722972625 +0000 UTC m=+0.064792700 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:46:42 np0005539504 nova_compute[187152]: 2025-11-29 07:46:42.918 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:44 np0005539504 nova_compute[187152]: 2025-11-29 07:46:44.389 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:44 np0005539504 nova_compute[187152]: 2025-11-29 07:46:44.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:47 np0005539504 nova_compute[187152]: 2025-11-29 07:46:47.921 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:47.991 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1252d970-730d-4caf-8395-8124fb080e82', 'name': 'tempest-TestGettingAddress-server-1708707331', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a6', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0111c22b4b954ea586ca20d91ed3970f', 'user_id': '31ac7b05b012433b89143dc9f259644a', 'hostId': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:47.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:47.996 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1252d970-730d-4caf-8395-8124fb080e82 / tap6ee81ad6-6e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:46:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:47.996 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.incoming.bytes volume: 4379 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44ebd62d-3796-4387-99f0-ed7c0af80218', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4379, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:47.992732', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f33322c-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': '561d78bf66cee381b644edd994040fa2029df8ba51cceba41194cdc5b7bb8187'}]}, 'timestamp': '2025-11-29 07:46:47.997584', '_unique_id': '81d8b33586a9443991b0e8eca74d52cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.000 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.002 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.017 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/cpu volume: 13640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a351b2c-61f9-4ec9-a0f8-9d0c52ac9dab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13640000000, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82', 'timestamp': '2025-11-29T07:46:48.003031', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8f3660c8-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.952210348, 'message_signature': '47fef3fc4139a8dcf79e8c0e161a359d350e922679a0e33312d9c6870c94c117'}]}, 'timestamp': '2025-11-29 07:46:48.018140', '_unique_id': '3c76a3e0566d4d798bdeb860220a3889'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.019 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6078bf49-a11a-4361-8d53-15e069816be4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:48.019964', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f36b532-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': '87a54354e3ad765233cdff00f70d8251bd1c5e7645837aa50fc904e61b3f2f93'}]}, 'timestamp': '2025-11-29 07:46:48.020227', '_unique_id': 'cf2badac5af4470582c177bc1f5d8fad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.020 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.031 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.031 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba3b2cae-5452-4bf4-b03f-d5b2dd45ca99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-vda', 'timestamp': '2025-11-29T07:46:48.021692', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f38791c-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.95637291, 'message_signature': '3973a1ea2c5d8ee776b109ed34b1dfdcef2a7ff0ba9c607f329e729eab0789de'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'1252d970-730d-4caf-8395-8124fb080e82-sda', 'timestamp': '2025-11-29T07:46:48.021692', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f388a2e-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.95637291, 'message_signature': '97a19dbce93f6728b34b35d4ab671c511b49e68ee47ef071fe63ebe84500a644'}]}, 'timestamp': '2025-11-29 07:46:48.032309', '_unique_id': 'ab8cca2760734d2dafc562c3a040adbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.033 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.034 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.035 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f510cacd-ea30-484d-a65e-fce0dcdf9716', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:48.035086', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f3907b0-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': '435f8c3b6da2affc7cb525bb72b9810cc1211d119a76ea5226ac4bb4e2eaaba3'}]}, 'timestamp': '2025-11-29 07:46:48.035575', '_unique_id': '19c20ec285824a1eafefdc43e435ce05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.036 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.037 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.063 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.write.latency volume: 4337904798 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.063 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9772b515-5519-4052-90f0-932a6370ae97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4337904798, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-vda', 'timestamp': '2025-11-29T07:46:48.037837', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f3d5e0a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': 'c1a72601329a33857293b095d636532d6b20225a1606a029e235254589f676c7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-sda', 'timestamp': '2025-11-29T07:46:48.037837', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f3d6b7a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': '028774b5db9c58678cab32378aed0c2a68bba9d55a6fd629b12329a564b225c1'}]}, 'timestamp': '2025-11-29 07:46:48.064208', '_unique_id': '9e5f1f2b5a3e4d7a8cc30739827e2f6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.066 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.066 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.066 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4aa7a2e-fe03-4f75-8466-4da43f5ecc4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-vda', 'timestamp': '2025-11-29T07:46:48.066523', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f3dd272-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': '2efca3a4e3e7438d791dc85a84091315321c418c5d0d8d04f493b18ab3a76bd6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-sda', 'timestamp': '2025-11-29T07:46:48.066523', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f3ddc4a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': 'adaf8f6c7e540bf168ad61e4aaa642e43b4c6ee4bf36670688eb263100213fdb'}]}, 'timestamp': '2025-11-29 07:46:48.067078', '_unique_id': 'c350237296ce4205b412cefa8627936a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.067 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.068 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.068 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.outgoing.bytes volume: 3704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1641778c-4356-479f-a44c-45db92fed2e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3704, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:48.068698', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f3e272c-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': '05d76e2013285d60533005e4d54d03c42195bace29539bcaa4f48e7c56bac61e'}]}, 'timestamp': '2025-11-29 07:46:48.069006', '_unique_id': '87ca72cf605444c6a45d328819669d91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.069 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.070 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.070 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.070 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1708707331>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1708707331>]
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.071 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.071 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.071 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1708707331>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1708707331>]
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.071 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.071 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50f5bc82-7d58-4df7-ab5b-052f6576c4cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-vda', 'timestamp': '2025-11-29T07:46:48.071754', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f3e9d6a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': 'a6f15495f7acca5f863de4ee277e969d93ea1816e5daaf3989f40a32d1fdb994'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-sda', 'timestamp': '2025-11-29T07:46:48.071754', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f3ea706-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': 'da79fc07de13a0e1ea8edf8a795a3077da7b028da6e52e3644b237cea94aebe9'}]}, 'timestamp': '2025-11-29 07:46:48.072266', '_unique_id': '4c51375699e84c11b08a32af1d8d72a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.073 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.outgoing.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f99f5f7e-06cb-4462-9cc5-0ac9d0f01365', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:48.073810', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f3eee1e-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': '4051e09f080f9d34ecf71f34d05712cf419bfd36db37d865f4153ea06033a0cf'}]}, 'timestamp': '2025-11-29 07:46:48.074099', '_unique_id': 'a3ece8d9a90b4091a4178a9de25041f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.075 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe429902-b1d3-4465-9527-2fda1597e7f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:48.075551', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f3f318a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': 'ea314cc5c1817a73c68f916225c3e5c9eadc3e8aceca0463cdb2bcb3efc52487'}]}, 'timestamp': '2025-11-29 07:46:48.075823', '_unique_id': '8f7ea19e72b5493b96acbe2887e7694a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.077 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.077 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.077 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acfc05d9-28b5-4148-81ca-3d1702bb8a47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-vda', 'timestamp': '2025-11-29T07:46:48.077270', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f3f7596-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.95637291, 'message_signature': '9196120b7c3e06d1ca5eeb1ec705365f0a2ac761ad2c0f6f6226a8ce0aba5d34'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-sda', 'timestamp': '2025-11-29T07:46:48.077270', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f3f7f5a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.95637291, 'message_signature': '0afabeb215d9645315dbaf1bbc37d83f15e590cdec45724629fa190b0cd19501'}]}, 'timestamp': '2025-11-29 07:46:48.077802', '_unique_id': '321806eb9bd94ae58ed7ae2c78e6dd7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.079 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36b46b61-73c2-4136-af60-e3ce70a5a482', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:48.079266', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f3fc370-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': '8408248a035af9a3a26761331d01475e36e7a313bc169ff56e08ea7b5602314b'}]}, 'timestamp': '2025-11-29 07:46:48.079558', '_unique_id': '9873f9c22c574be4ae2e3cba8eba9465'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.080 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.incoming.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db6b6a07-0fe2-43e8-86ca-7fb92d038f35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:48.081006', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f4006b4-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': '34d56c77fdc25311dba835020fd04fa14abb0514aa5ae6fab223864c8c337b9e'}]}, 'timestamp': '2025-11-29 07:46:48.081281', '_unique_id': '5ff9dce9bd0c409482bff1ec3f4761ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.082 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.082 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1708707331>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1708707331>]
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '270649cf-0699-4d64-a52a-11d9f8d3f35a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:48.083154', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f405a56-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': '2da1b421e2ae8b41ea79ebad3d2c25e99eed7ff4325f3d0ef4f17fdcc264cf64'}]}, 'timestamp': '2025-11-29 07:46:48.083454', '_unique_id': '187459020e4441f799bf6b9830e29c49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.084 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.read.requests volume: 1097 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1f85af0-e78f-4cf1-bc86-fd4ff4a51705', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1097, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-vda', 'timestamp': '2025-11-29T07:46:48.084935', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f409fe8-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': 'bf9eb57648a239407095ed42ccf5d6b1021ec0b58d0d3bc3b4f97a15d1d39054'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-sda', 'timestamp': '2025-11-29T07:46:48.084935', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f40a966-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': '836cc8530f75d6c1fe974a86de8f811c37baa2a54ca015f771564f1a8ec70d24'}]}, 'timestamp': '2025-11-29 07:46:48.085457', '_unique_id': 'bd9e0017bb24442b933ed17c91490085'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.086 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/memory.usage volume: 42.7265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2b26bee-df2e-4419-96e1-648ad46d0049', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7265625, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82', 'timestamp': '2025-11-29T07:46:48.086965', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '8f40f006-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.952210348, 'message_signature': 'c94dd02cb8f5fdd755499f2722128ea1cb9f557400a5b32e9134918e92f90380'}]}, 'timestamp': '2025-11-29 07:46:48.087246', '_unique_id': '6c1dcfc6b4f7429ab7956aa532f29f9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.088 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e026d082-e71d-45cc-b9bd-ee77dbcfdb86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-vda', 'timestamp': '2025-11-29T07:46:48.088729', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f413642-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.95637291, 'message_signature': '9b195c419b7b9ce31fc77bdad61290cf125c40fca4115dc4cf65a5b2acba5db9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'1252d970-730d-4caf-8395-8124fb080e82-sda', 'timestamp': '2025-11-29T07:46:48.088729', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f413fa2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.95637291, 'message_signature': 'f573c4adc19053c8c64cff44c7040dd64833e82928de5adbe7990bdbe238b8a7'}]}, 'timestamp': '2025-11-29 07:46:48.089279', '_unique_id': '27b48c83501241e08d2cf5eed26a9e6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.091 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.write.bytes volume: 72929280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.091 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1eba238d-d5be-4fb5-8a8c-47eb709f782e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72929280, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-vda', 'timestamp': '2025-11-29T07:46:48.091046', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f419358-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': '46d7a780c06832ec19db50a323cf07062f6fcdb512116ee258e38d1b4f52386b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-sda', 'timestamp': '2025-11-29T07:46:48.091046', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f419ff6-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': '29a8cd301b28426cb39a4df210d06990da423432fedca1092dbadb3724af8581'}]}, 'timestamp': '2025-11-29 07:46:48.091757', '_unique_id': 'b0db619d993a4e24a76fc1a7a9dcfa80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.093 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.093 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1708707331>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1708707331>]
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.093 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07cedb4f-6ecf-4136-9c5a-d7f2bc0e73ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a6-1252d970-730d-4caf-8395-8124fb080e82-tap6ee81ad6-6e', 'timestamp': '2025-11-29T07:46:48.093940', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'tap6ee81ad6-6e', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:e7:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ee81ad6-6e'}, 'message_id': '8f41fe88-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.927497075, 'message_signature': '0df042d7c1262e81668a1e375feb4f83013c348f6239b0e08f709ac5896b9e06'}]}, 'timestamp': '2025-11-29 07:46:48.094180', '_unique_id': 'ea58911c03ec4f44bc3fbfcef5d8dabf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.095 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.read.latency volume: 243456216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.095 12 DEBUG ceilometer.compute.pollsters [-] 1252d970-730d-4caf-8395-8124fb080e82/disk.device.read.latency volume: 24839258 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79a12012-1da3-4ae2-bf14-1d40ec87d6f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 243456216, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-vda', 'timestamp': '2025-11-29T07:46:48.095273', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f4232e0-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': '2188714825368275e58c563ead25aea832103b7dcbd765bae77c001de2fa390d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24839258, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1252d970-730d-4caf-8395-8124fb080e82-sda', 'timestamp': '2025-11-29T07:46:48.095273', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1708707331', 'name': 'instance-000000a6', 'instance_id': '1252d970-730d-4caf-8395-8124fb080e82', 'instance_type': 'm1.nano', 'host': 'b2e90c6c36f7566ef9b632632d266ff37062be3bb58e2bafcd772a9b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f423b00-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7870.972557475, 'message_signature': '6a789dbd9974377a9b7ae9bb9607254fc2a6a4f95f8de6fe401b36fba819d7cb'}]}, 'timestamp': '2025-11-29 07:46:48.095713', '_unique_id': '94a866e333a145aba541a21223ac0209'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:46:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:46:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:46:49 np0005539504 nova_compute[187152]: 2025-11-29 07:46:49.391 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:49 np0005539504 nova_compute[187152]: 2025-11-29 07:46:49.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:52 np0005539504 nova_compute[187152]: 2025-11-29 07:46:52.922 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:54 np0005539504 nova_compute[187152]: 2025-11-29 07:46:54.394 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:55 np0005539504 podman[247933]: 2025-11-29 07:46:55.733325201 +0000 UTC m=+0.070722279 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:46:55 np0005539504 podman[247934]: 2025-11-29 07:46:55.746896354 +0000 UTC m=+0.073170763 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:46:55 np0005539504 podman[247935]: 2025-11-29 07:46:55.75305406 +0000 UTC m=+0.079027381 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:46:57 np0005539504 nova_compute[187152]: 2025-11-29 07:46:57.924 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:59 np0005539504 nova_compute[187152]: 2025-11-29 07:46:59.446 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:59 np0005539504 podman[247995]: 2025-11-29 07:46:59.711891698 +0000 UTC m=+0.053175728 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:47:00 np0005539504 podman[248019]: 2025-11-29 07:47:00.788580735 +0000 UTC m=+0.122946789 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:47:02 np0005539504 nova_compute[187152]: 2025-11-29 07:47:02.926 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:04 np0005539504 nova_compute[187152]: 2025-11-29 07:47:04.448 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:07 np0005539504 nova_compute[187152]: 2025-11-29 07:47:07.927 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:08 np0005539504 podman[248049]: 2025-11-29 07:47:08.764250384 +0000 UTC m=+0.100332703 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:47:09 np0005539504 nova_compute[187152]: 2025-11-29 07:47:09.452 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:12 np0005539504 nova_compute[187152]: 2025-11-29 07:47:12.930 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:13 np0005539504 podman[248070]: 2025-11-29 07:47:13.746664961 +0000 UTC m=+0.079482523 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:47:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:14.005 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:47:14 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:14.006 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:47:14 np0005539504 nova_compute[187152]: 2025-11-29 07:47:14.010 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:14 np0005539504 nova_compute[187152]: 2025-11-29 07:47:14.455 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:15 np0005539504 nova_compute[187152]: 2025-11-29 07:47:15.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:17 np0005539504 nova_compute[187152]: 2025-11-29 07:47:17.932 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:18 np0005539504 nova_compute[187152]: 2025-11-29 07:47:18.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:19 np0005539504 nova_compute[187152]: 2025-11-29 07:47:19.458 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:22 np0005539504 nova_compute[187152]: 2025-11-29 07:47:22.934 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:23.484 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:23.485 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:23.486 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:24.008 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:24 np0005539504 nova_compute[187152]: 2025-11-29 07:47:24.462 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:25 np0005539504 nova_compute[187152]: 2025-11-29 07:47:25.942 187156 DEBUG nova.compute.manager [req-27f17110-086b-4748-bdba-6d18febb0ec4 req-09711a28-1418-4923-a84b-bc9a8830b948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received event network-changed-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:25 np0005539504 nova_compute[187152]: 2025-11-29 07:47:25.942 187156 DEBUG nova.compute.manager [req-27f17110-086b-4748-bdba-6d18febb0ec4 req-09711a28-1418-4923-a84b-bc9a8830b948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Refreshing instance network info cache due to event network-changed-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:47:25 np0005539504 nova_compute[187152]: 2025-11-29 07:47:25.943 187156 DEBUG oslo_concurrency.lockutils [req-27f17110-086b-4748-bdba-6d18febb0ec4 req-09711a28-1418-4923-a84b-bc9a8830b948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:47:25 np0005539504 nova_compute[187152]: 2025-11-29 07:47:25.944 187156 DEBUG oslo_concurrency.lockutils [req-27f17110-086b-4748-bdba-6d18febb0ec4 req-09711a28-1418-4923-a84b-bc9a8830b948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:47:25 np0005539504 nova_compute[187152]: 2025-11-29 07:47:25.944 187156 DEBUG nova.network.neutron [req-27f17110-086b-4748-bdba-6d18febb0ec4 req-09711a28-1418-4923-a84b-bc9a8830b948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Refreshing network info cache for port 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.037 187156 DEBUG oslo_concurrency.lockutils [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "1252d970-730d-4caf-8395-8124fb080e82" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.038 187156 DEBUG oslo_concurrency.lockutils [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.039 187156 DEBUG oslo_concurrency.lockutils [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "1252d970-730d-4caf-8395-8124fb080e82-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.039 187156 DEBUG oslo_concurrency.lockutils [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.039 187156 DEBUG oslo_concurrency.lockutils [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.055 187156 INFO nova.compute.manager [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Terminating instance#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.071 187156 DEBUG nova.compute.manager [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:47:26 np0005539504 kernel: tap6ee81ad6-6e (unregistering): left promiscuous mode
Nov 29 02:47:26 np0005539504 NetworkManager[55210]: <info>  [1764402446.0962] device (tap6ee81ad6-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:47:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:47:26Z|00678|binding|INFO|Releasing lport 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce from this chassis (sb_readonly=0)
Nov 29 02:47:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:47:26Z|00679|binding|INFO|Setting lport 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce down in Southbound
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.104 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:26 np0005539504 ovn_controller[95182]: 2025-11-29T07:47:26Z|00680|binding|INFO|Removing iface tap6ee81ad6-6e ovn-installed in OVS
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.107 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.114 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:e7:88 10.100.0.12 2001:db8::f816:3eff:fe9a:e788'], port_security=['fa:16:3e:9a:e7:88 10.100.0.12 2001:db8::f816:3eff:fe9a:e788'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8::f816:3eff:fe9a:e788/64', 'neutron:device_id': '1252d970-730d-4caf-8395-8124fb080e82', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0e7f9e0d-709d-40ea-bb38-80f3b9bd57d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=045a9acc-370f-460b-b7b5-7c57bd647b8b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.116 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce in datapath 7b412a37-c227-42ad-9fca-23287613486a unbound from our chassis#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.117 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b412a37-c227-42ad-9fca-23287613486a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.119 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[02b4d90d-fcf7-442a-9f25-a0221751ca7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.119 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b412a37-c227-42ad-9fca-23287613486a namespace which is not needed anymore#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.120 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:26 np0005539504 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000a6.scope: Deactivated successfully.
Nov 29 02:47:26 np0005539504 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000a6.scope: Consumed 17.677s CPU time.
Nov 29 02:47:26 np0005539504 systemd-machined[153423]: Machine qemu-86-instance-000000a6 terminated.
Nov 29 02:47:26 np0005539504 podman[248094]: 2025-11-29 07:47:26.186863464 +0000 UTC m=+0.057843232 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:47:26 np0005539504 podman[248096]: 2025-11-29 07:47:26.195306721 +0000 UTC m=+0.062548830 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:47:26 np0005539504 podman[248097]: 2025-11-29 07:47:26.195774623 +0000 UTC m=+0.059407224 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 02:47:26 np0005539504 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[247735]: [NOTICE]   (247739) : haproxy version is 2.8.14-c23fe91
Nov 29 02:47:26 np0005539504 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[247735]: [NOTICE]   (247739) : path to executable is /usr/sbin/haproxy
Nov 29 02:47:26 np0005539504 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[247735]: [WARNING]  (247739) : Exiting Master process...
Nov 29 02:47:26 np0005539504 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[247735]: [WARNING]  (247739) : Exiting Master process...
Nov 29 02:47:26 np0005539504 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[247735]: [ALERT]    (247739) : Current worker (247741) exited with code 143 (Terminated)
Nov 29 02:47:26 np0005539504 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[247735]: [WARNING]  (247739) : All workers exited. Exiting... (0)
Nov 29 02:47:26 np0005539504 systemd[1]: libpod-cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc.scope: Deactivated successfully.
Nov 29 02:47:26 np0005539504 podman[248174]: 2025-11-29 07:47:26.268755092 +0000 UTC m=+0.044051494 container died cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:47:26 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc-userdata-shm.mount: Deactivated successfully.
Nov 29 02:47:26 np0005539504 systemd[1]: var-lib-containers-storage-overlay-07cc466736b304272ea8cc51551eb7a5a281696c9d199f31533e68705f027b71-merged.mount: Deactivated successfully.
Nov 29 02:47:26 np0005539504 podman[248174]: 2025-11-29 07:47:26.321831665 +0000 UTC m=+0.097128067 container cleanup cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:47:26 np0005539504 systemd[1]: libpod-conmon-cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc.scope: Deactivated successfully.
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.345 187156 INFO nova.virt.libvirt.driver [-] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Instance destroyed successfully.#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.346 187156 DEBUG nova.objects.instance [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 1252d970-730d-4caf-8395-8124fb080e82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:47:26 np0005539504 podman[248217]: 2025-11-29 07:47:26.387363333 +0000 UTC m=+0.042348826 container remove cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.393 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4d3f3e2b-7d85-40bf-8cd2-1c2bf4a3efca]: (4, ('Sat Nov 29 07:47:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a (cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc)\ncba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc\nSat Nov 29 07:47:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a (cba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc)\ncba0ef8d931ea2294902c5e4fe491cff46a3c7f16935caa4a6a2e76cb097ebdc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.395 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[35559733-f3b9-436b-a1b1-4ef35a5be94d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.396 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b412a37-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.424 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:26 np0005539504 kernel: tap7b412a37-c0: left promiscuous mode
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.440 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.444 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f0f695-f1f9-46e7-a32a-75a3ef72f412]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.466 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d96d6a5a-1573-4da5-b227-1f726969d881]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.468 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[159c52a3-d915-415f-bdc5-f5d328701f1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.482 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[569534cd-d01d-4c41-9f34-93fdfa40e112]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 783879, 'reachable_time': 41262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248236, 'error': None, 'target': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.485 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b412a37-c227-42ad-9fca-23287613486a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:47:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:47:26.485 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[d8da1d22-501b-412e-8ef6-fb1c24b44cae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:26 np0005539504 systemd[1]: run-netns-ovnmeta\x2d7b412a37\x2dc227\x2d42ad\x2d9fca\x2d23287613486a.mount: Deactivated successfully.
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.814 187156 DEBUG nova.virt.libvirt.vif [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:45:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1708707331',display_name='tempest-TestGettingAddress-server-1708707331',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1708707331',id=166,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRqMqB2m2OgDWFFhqrQomXOlqtsH3DkZX/q3f9H1IQ0ObpMW22Tv9hUlgFTK1dOU/3/nBcrYC6MrtFKRuaytFioBJb/QmB1UkysPgqPE38bvZ1GGYFs/tP1vPRobYVgdQ==',key_name='tempest-TestGettingAddress-277961909',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:46:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-t1tym0mr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:46:16Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=1252d970-730d-4caf-8395-8124fb080e82,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.815 187156 DEBUG nova.network.os_vif_util [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.817 187156 DEBUG nova.network.os_vif_util [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:e7:88,bridge_name='br-int',has_traffic_filtering=True,id=6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee81ad6-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.818 187156 DEBUG os_vif [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:e7:88,bridge_name='br-int',has_traffic_filtering=True,id=6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee81ad6-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.821 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.821 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ee81ad6-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.824 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.826 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.831 187156 INFO os_vif [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:e7:88,bridge_name='br-int',has_traffic_filtering=True,id=6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee81ad6-6e')#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.832 187156 INFO nova.virt.libvirt.driver [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Deleting instance files /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82_del#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.834 187156 INFO nova.virt.libvirt.driver [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Deletion of /var/lib/nova/instances/1252d970-730d-4caf-8395-8124fb080e82_del complete#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.947 187156 INFO nova.compute.manager [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Took 0.88 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.949 187156 DEBUG oslo.service.loopingcall [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.949 187156 DEBUG nova.compute.manager [-] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:47:26 np0005539504 nova_compute[187152]: 2025-11-29 07:47:26.950 187156 DEBUG nova.network.neutron [-] [instance: 1252d970-730d-4caf-8395-8124fb080e82] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:47:27 np0005539504 nova_compute[187152]: 2025-11-29 07:47:27.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:27 np0005539504 nova_compute[187152]: 2025-11-29 07:47:27.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:27 np0005539504 nova_compute[187152]: 2025-11-29 07:47:27.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:47:27 np0005539504 nova_compute[187152]: 2025-11-29 07:47:27.937 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:28 np0005539504 nova_compute[187152]: 2025-11-29 07:47:28.856 187156 DEBUG nova.compute.manager [req-c7aa0aa1-cf4d-4318-a909-8a2c450045b7 req-6767fb02-77c3-43be-a8dc-3c2aaeab97af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received event network-vif-unplugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:28 np0005539504 nova_compute[187152]: 2025-11-29 07:47:28.857 187156 DEBUG oslo_concurrency.lockutils [req-c7aa0aa1-cf4d-4318-a909-8a2c450045b7 req-6767fb02-77c3-43be-a8dc-3c2aaeab97af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1252d970-730d-4caf-8395-8124fb080e82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:28 np0005539504 nova_compute[187152]: 2025-11-29 07:47:28.857 187156 DEBUG oslo_concurrency.lockutils [req-c7aa0aa1-cf4d-4318-a909-8a2c450045b7 req-6767fb02-77c3-43be-a8dc-3c2aaeab97af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:28 np0005539504 nova_compute[187152]: 2025-11-29 07:47:28.857 187156 DEBUG oslo_concurrency.lockutils [req-c7aa0aa1-cf4d-4318-a909-8a2c450045b7 req-6767fb02-77c3-43be-a8dc-3c2aaeab97af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:28 np0005539504 nova_compute[187152]: 2025-11-29 07:47:28.858 187156 DEBUG nova.compute.manager [req-c7aa0aa1-cf4d-4318-a909-8a2c450045b7 req-6767fb02-77c3-43be-a8dc-3c2aaeab97af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] No waiting events found dispatching network-vif-unplugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:47:28 np0005539504 nova_compute[187152]: 2025-11-29 07:47:28.858 187156 DEBUG nova.compute.manager [req-c7aa0aa1-cf4d-4318-a909-8a2c450045b7 req-6767fb02-77c3-43be-a8dc-3c2aaeab97af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received event network-vif-unplugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.047 187156 DEBUG nova.network.neutron [-] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.066 187156 INFO nova.compute.manager [-] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Took 3.12 seconds to deallocate network for instance.#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.162 187156 DEBUG oslo_concurrency.lockutils [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.162 187156 DEBUG oslo_concurrency.lockutils [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.244 187156 DEBUG nova.compute.provider_tree [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.260 187156 DEBUG nova.scheduler.client.report [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.280 187156 DEBUG oslo_concurrency.lockutils [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.303 187156 INFO nova.scheduler.client.report [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 1252d970-730d-4caf-8395-8124fb080e82#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.383 187156 DEBUG oslo_concurrency.lockutils [None req-ab2b1361-493f-4c28-922e-8f504f1b6e3f 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.459 187156 DEBUG nova.compute.manager [req-ef51516f-29c6-46f5-83db-1726e66f8145 req-bada918c-6d03-4862-a977-9940ba418300 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received event network-vif-deleted-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.618 187156 DEBUG nova.network.neutron [req-27f17110-086b-4748-bdba-6d18febb0ec4 req-09711a28-1418-4923-a84b-bc9a8830b948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updated VIF entry in instance network info cache for port 6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.619 187156 DEBUG nova.network.neutron [req-27f17110-086b-4748-bdba-6d18febb0ec4 req-09711a28-1418-4923-a84b-bc9a8830b948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Updating instance_info_cache with network_info: [{"id": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "address": "fa:16:3e:9a:e7:88", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9a:e788", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee81ad6-6e", "ovs_interfaceid": "6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.640 187156 DEBUG oslo_concurrency.lockutils [req-27f17110-086b-4748-bdba-6d18febb0ec4 req-09711a28-1418-4923-a84b-bc9a8830b948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1252d970-730d-4caf-8395-8124fb080e82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:47:30 np0005539504 podman[248237]: 2025-11-29 07:47:30.747742673 +0000 UTC m=+0.086906043 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.979 187156 DEBUG nova.compute.manager [req-b2e43845-18de-41aa-aabb-6cc4b67b3281 req-0fd43464-4b1d-4bd0-8397-7cbf287f5a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received event network-vif-plugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.980 187156 DEBUG oslo_concurrency.lockutils [req-b2e43845-18de-41aa-aabb-6cc4b67b3281 req-0fd43464-4b1d-4bd0-8397-7cbf287f5a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1252d970-730d-4caf-8395-8124fb080e82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.980 187156 DEBUG oslo_concurrency.lockutils [req-b2e43845-18de-41aa-aabb-6cc4b67b3281 req-0fd43464-4b1d-4bd0-8397-7cbf287f5a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.981 187156 DEBUG oslo_concurrency.lockutils [req-b2e43845-18de-41aa-aabb-6cc4b67b3281 req-0fd43464-4b1d-4bd0-8397-7cbf287f5a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1252d970-730d-4caf-8395-8124fb080e82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.981 187156 DEBUG nova.compute.manager [req-b2e43845-18de-41aa-aabb-6cc4b67b3281 req-0fd43464-4b1d-4bd0-8397-7cbf287f5a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] No waiting events found dispatching network-vif-plugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:47:30 np0005539504 nova_compute[187152]: 2025-11-29 07:47:30.981 187156 WARNING nova.compute.manager [req-b2e43845-18de-41aa-aabb-6cc4b67b3281 req-0fd43464-4b1d-4bd0-8397-7cbf287f5a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Received unexpected event network-vif-plugged-6ee81ad6-6e4d-4e9b-98a4-1eedbe900fce for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:47:31 np0005539504 podman[248262]: 2025-11-29 07:47:31.793342406 +0000 UTC m=+0.123751941 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:47:31 np0005539504 nova_compute[187152]: 2025-11-29 07:47:31.824 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:31 np0005539504 nova_compute[187152]: 2025-11-29 07:47:31.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:31 np0005539504 nova_compute[187152]: 2025-11-29 07:47:31.958 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:31 np0005539504 nova_compute[187152]: 2025-11-29 07:47:31.958 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:31 np0005539504 nova_compute[187152]: 2025-11-29 07:47:31.959 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:31 np0005539504 nova_compute[187152]: 2025-11-29 07:47:31.959 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.093 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.094 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5681MB free_disk=73.00645065307617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.095 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.095 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.163 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.164 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.182 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.195 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.220 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.220 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:32 np0005539504 nova_compute[187152]: 2025-11-29 07:47:32.943 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:35 np0005539504 nova_compute[187152]: 2025-11-29 07:47:35.887 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:36 np0005539504 nova_compute[187152]: 2025-11-29 07:47:36.111 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:36 np0005539504 nova_compute[187152]: 2025-11-29 07:47:36.221 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:36 np0005539504 nova_compute[187152]: 2025-11-29 07:47:36.222 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:47:36 np0005539504 nova_compute[187152]: 2025-11-29 07:47:36.222 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:47:36 np0005539504 nova_compute[187152]: 2025-11-29 07:47:36.250 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:47:36 np0005539504 nova_compute[187152]: 2025-11-29 07:47:36.827 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:36 np0005539504 nova_compute[187152]: 2025-11-29 07:47:36.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:37 np0005539504 nova_compute[187152]: 2025-11-29 07:47:37.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:37 np0005539504 nova_compute[187152]: 2025-11-29 07:47:37.952 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:39 np0005539504 podman[248290]: 2025-11-29 07:47:39.729271017 +0000 UTC m=+0.072810724 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:47:40 np0005539504 nova_compute[187152]: 2025-11-29 07:47:40.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:41 np0005539504 nova_compute[187152]: 2025-11-29 07:47:41.344 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402446.3427823, 1252d970-730d-4caf-8395-8124fb080e82 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:47:41 np0005539504 nova_compute[187152]: 2025-11-29 07:47:41.345 187156 INFO nova.compute.manager [-] [instance: 1252d970-730d-4caf-8395-8124fb080e82] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:47:41 np0005539504 nova_compute[187152]: 2025-11-29 07:47:41.384 187156 DEBUG nova.compute.manager [None req-114f175a-1e01-467b-83ca-3b5f05aacc6f - - - - - -] [instance: 1252d970-730d-4caf-8395-8124fb080e82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:47:41 np0005539504 nova_compute[187152]: 2025-11-29 07:47:41.828 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:42 np0005539504 nova_compute[187152]: 2025-11-29 07:47:42.954 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:44 np0005539504 podman[248311]: 2025-11-29 07:47:44.711495281 +0000 UTC m=+0.056380294 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:47:46 np0005539504 nova_compute[187152]: 2025-11-29 07:47:46.055 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:46 np0005539504 nova_compute[187152]: 2025-11-29 07:47:46.830 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:47 np0005539504 nova_compute[187152]: 2025-11-29 07:47:47.956 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:51 np0005539504 nova_compute[187152]: 2025-11-29 07:47:51.832 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:52 np0005539504 nova_compute[187152]: 2025-11-29 07:47:52.958 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:56 np0005539504 podman[248332]: 2025-11-29 07:47:56.716614119 +0000 UTC m=+0.057989087 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:47:56 np0005539504 podman[248333]: 2025-11-29 07:47:56.723819822 +0000 UTC m=+0.060600686 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:47:56 np0005539504 podman[248334]: 2025-11-29 07:47:56.748112334 +0000 UTC m=+0.080391598 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:47:56 np0005539504 nova_compute[187152]: 2025-11-29 07:47:56.834 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:57 np0005539504 nova_compute[187152]: 2025-11-29 07:47:57.960 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:00 np0005539504 nova_compute[187152]: 2025-11-29 07:48:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:00 np0005539504 nova_compute[187152]: 2025-11-29 07:48:00.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:48:01 np0005539504 podman[248394]: 2025-11-29 07:48:01.69917616 +0000 UTC m=+0.048771548 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:48:01 np0005539504 nova_compute[187152]: 2025-11-29 07:48:01.836 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:02 np0005539504 podman[248418]: 2025-11-29 07:48:02.732487544 +0000 UTC m=+0.075368002 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:48:02 np0005539504 nova_compute[187152]: 2025-11-29 07:48:02.962 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:06 np0005539504 nova_compute[187152]: 2025-11-29 07:48:06.838 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:07 np0005539504 nova_compute[187152]: 2025-11-29 07:48:07.997 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:09 np0005539504 podman[248449]: 2025-11-29 07:48:09.954636625 +0000 UTC m=+0.080361657 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:48:10 np0005539504 nova_compute[187152]: 2025-11-29 07:48:10.954 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:10 np0005539504 nova_compute[187152]: 2025-11-29 07:48:10.955 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:48:10 np0005539504 nova_compute[187152]: 2025-11-29 07:48:10.975 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:48:11 np0005539504 nova_compute[187152]: 2025-11-29 07:48:11.840 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.000 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.466 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.466 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.490 187156 DEBUG nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.607 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.607 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.616 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.617 187156 INFO nova.compute.claims [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.775 187156 DEBUG nova.compute.provider_tree [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.795 187156 DEBUG nova.scheduler.client.report [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.824 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.825 187156 DEBUG nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.885 187156 DEBUG nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.886 187156 DEBUG nova.network.neutron [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.905 187156 INFO nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:48:13 np0005539504 nova_compute[187152]: 2025-11-29 07:48:13.925 187156 DEBUG nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.138 187156 DEBUG nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.140 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.141 187156 INFO nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Creating image(s)#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.142 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.142 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.143 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.163 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.251 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.252 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.253 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.264 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.287 187156 DEBUG nova.policy [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.324 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.325 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.660 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk 1073741824" returned: 0 in 0.335s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.662 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.662 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.720 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.721 187156 DEBUG nova.virt.disk.api [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.721 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.775 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.776 187156 DEBUG nova.virt.disk.api [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.776 187156 DEBUG nova.objects.instance [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid f0ee4a50-ced9-4062-8d91-0296cc909668 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.791 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.791 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Ensure instance console log exists: /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.792 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.792 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:14 np0005539504 nova_compute[187152]: 2025-11-29 07:48:14.792 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:15 np0005539504 nova_compute[187152]: 2025-11-29 07:48:15.406 187156 DEBUG nova.network.neutron [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Successfully created port: 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:48:15 np0005539504 podman[248484]: 2025-11-29 07:48:15.708201663 +0000 UTC m=+0.056201758 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:48:16 np0005539504 nova_compute[187152]: 2025-11-29 07:48:16.467 187156 DEBUG nova.network.neutron [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Successfully updated port: 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:48:16 np0005539504 nova_compute[187152]: 2025-11-29 07:48:16.505 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:48:16 np0005539504 nova_compute[187152]: 2025-11-29 07:48:16.505 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:48:16 np0005539504 nova_compute[187152]: 2025-11-29 07:48:16.506 187156 DEBUG nova.network.neutron [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:48:16 np0005539504 nova_compute[187152]: 2025-11-29 07:48:16.613 187156 DEBUG nova.compute.manager [req-6726e7d3-b81d-4fbd-adaf-98abe841b107 req-17346940-2725-4292-8c83-4d685eff8df8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:16 np0005539504 nova_compute[187152]: 2025-11-29 07:48:16.614 187156 DEBUG nova.compute.manager [req-6726e7d3-b81d-4fbd-adaf-98abe841b107 req-17346940-2725-4292-8c83-4d685eff8df8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing instance network info cache due to event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:48:16 np0005539504 nova_compute[187152]: 2025-11-29 07:48:16.614 187156 DEBUG oslo_concurrency.lockutils [req-6726e7d3-b81d-4fbd-adaf-98abe841b107 req-17346940-2725-4292-8c83-4d685eff8df8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:48:16 np0005539504 nova_compute[187152]: 2025-11-29 07:48:16.783 187156 DEBUG nova.network.neutron [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:48:16 np0005539504 nova_compute[187152]: 2025-11-29 07:48:16.842 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:17 np0005539504 nova_compute[187152]: 2025-11-29 07:48:17.818 187156 DEBUG nova.network.neutron [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updating instance_info_cache with network_info: [{"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
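The `network_info` payload logged above is plain JSON. A short sketch of pulling the MAC and fixed IP out of such an entry, with the structure trimmed to just the fields read here (values copied from the log line):

```python
import json

# network_info entry as logged by update_instance_cache_with_nw_info,
# trimmed to the fields this sketch reads.
network_info = json.loads('''[{
  "id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939",
  "address": "fa:16:3e:95:d5:42",
  "network": {"subnets": [{"cidr": "10.100.0.0/28",
      "ips": [{"address": "10.100.0.7", "type": "fixed"}]}]},
  "devname": "tap9f78664f-38"}]''')

def fixed_ips(vifs):
    """Collect (mac, fixed_ip) pairs across all VIFs in a network_info list."""
    pairs = []
    for vif in vifs:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                if ip["type"] == "fixed":
                    pairs.append((vif["address"], ip["address"]))
    return pairs

print(fixed_ips(network_info))  # [('fa:16:3e:95:d5:42', '10.100.0.7')]
```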
Nov 29 02:48:17 np0005539504 nova_compute[187152]: 2025-11-29 07:48:17.957 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.003 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.733 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.735 187156 DEBUG nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Instance network_info: |[{"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.737 187156 DEBUG oslo_concurrency.lockutils [req-6726e7d3-b81d-4fbd-adaf-98abe841b107 req-17346940-2725-4292-8c83-4d685eff8df8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.738 187156 DEBUG nova.network.neutron [req-6726e7d3-b81d-4fbd-adaf-98abe841b107 req-17346940-2725-4292-8c83-4d685eff8df8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.743 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Start _get_guest_xml network_info=[{"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.751 187156 WARNING nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.758 187156 DEBUG nova.virt.libvirt.host [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.759 187156 DEBUG nova.virt.libvirt.host [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.766 187156 DEBUG nova.virt.libvirt.host [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.766 187156 DEBUG nova.virt.libvirt.host [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
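The two probes above (cgroups v1 missing, v2 found) reflect a unified-hierarchy host: on cgroups v2 the enabled controllers are listed in a single `cgroup.controllers` file. A sketch of the v2 check, with the root path parameterized so it can be exercised against a fake directory (this is an illustration, not nova's `_has_cgroupsv2_cpu_controller` code):

```python
import tempfile
from pathlib import Path

def has_cgroupsv2_cpu_controller(root="/sys/fs/cgroup"):
    """On a cgroups-v2 host, the available controllers appear as a
    whitespace-separated list in <root>/cgroup.controllers."""
    controllers = Path(root, "cgroup.controllers")
    if not controllers.is_file():
        return False
    return "cpu" in controllers.read_text().split()

# Exercise against a fake cgroup root so the sketch is self-contained.
fake = Path(tempfile.mkdtemp())
(fake / "cgroup.controllers").write_text("cpuset cpu io memory pids\n")
print(has_cgroupsv2_cpu_controller(fake))  # True
```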
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.768 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.768 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.768 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.769 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.769 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.769 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.769 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.769 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.770 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.770 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.770 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.770 187156 DEBUG nova.virt.hardware [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
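The topology lines above enumerate sockets/cores/threads triples whose product equals the vCPU count, bounded by the logged limits (65536 per dimension); with 1 vCPU only `1:1:1` is possible. A simplified sketch of that enumeration (nova's `_get_possible_cpu_topologies` also applies preference ordering, which this hypothetical helper omits):

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals
    the vCPU count, within per-dimension limits like those in the log."""
    topos = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        for cores in range(1, min(vcpus // sockets, max_cores) + 1):
            if (vcpus // sockets) % cores:
                continue
            threads = vcpus // (sockets * cores)
            if threads <= max_threads:
                topos.append((sockets, cores, threads))
    return topos

print(possible_topologies(1))  # [(1, 1, 1)] -- one possible topology
print(possible_topologies(4))  # six factorizations of 4 vCPUs
```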
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.775 187156 DEBUG nova.virt.libvirt.vif [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1803258578',display_name='tempest-TestNetworkBasicOps-server-1803258578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1803258578',id=169,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQFgudm9Fpu3rBil/k2VkBHgHTj3fslbCsk41lFkMhvsi/C//pKxMH46aCINXEl/8zMWUm+/D3mt2EQV5xHuzB03FYfe3kSmhBufzsVePZca2PKydsIMvRpwaWeJ1Ka3w==',key_name='tempest-TestNetworkBasicOps-1348264139',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-l0bda32k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:13Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=f0ee4a50-ced9-4062-8d91-0296cc909668,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.775 187156 DEBUG nova.network.os_vif_util [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.776 187156 DEBUG nova.network.os_vif_util [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:d5:42,bridge_name='br-int',has_traffic_filtering=True,id=9f78664f-38e8-4b5a-88ff-6a8cfef3f939,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f78664f-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.777 187156 DEBUG nova.objects.instance [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid f0ee4a50-ced9-4062-8d91-0296cc909668 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.795 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <uuid>f0ee4a50-ced9-4062-8d91-0296cc909668</uuid>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <name>instance-000000a9</name>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkBasicOps-server-1803258578</nova:name>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:48:18</nova:creationTime>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:        <nova:port uuid="9f78664f-38e8-4b5a-88ff-6a8cfef3f939">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <entry name="serial">f0ee4a50-ced9-4062-8d91-0296cc909668</entry>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <entry name="uuid">f0ee4a50-ced9-4062-8d91-0296cc909668</entry>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk.config"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:95:d5:42"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <target dev="tap9f78664f-38"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/console.log" append="off"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:48:18 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:48:18 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:48:18 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:48:18 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.797 187156 DEBUG nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Preparing to wait for external event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.797 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.797 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.798 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.798 187156 DEBUG nova.virt.libvirt.vif [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1803258578',display_name='tempest-TestNetworkBasicOps-server-1803258578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1803258578',id=169,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQFgudm9Fpu3rBil/k2VkBHgHTj3fslbCsk41lFkMhvsi/C//pKxMH46aCINXEl/8zMWUm+/D3mt2EQV5xHuzB03FYfe3kSmhBufzsVePZca2PKydsIMvRpwaWeJ1Ka3w==',key_name='tempest-TestNetworkBasicOps-1348264139',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-l0bda32k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:13Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=f0ee4a50-ced9-4062-8d91-0296cc909668,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.798 187156 DEBUG nova.network.os_vif_util [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.799 187156 DEBUG nova.network.os_vif_util [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:d5:42,bridge_name='br-int',has_traffic_filtering=True,id=9f78664f-38e8-4b5a-88ff-6a8cfef3f939,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f78664f-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.799 187156 DEBUG os_vif [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:d5:42,bridge_name='br-int',has_traffic_filtering=True,id=9f78664f-38e8-4b5a-88ff-6a8cfef3f939,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f78664f-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.800 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.800 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.801 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.805 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.805 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f78664f-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.806 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9f78664f-38, col_values=(('external_ids', {'iface-id': '9f78664f-38e8-4b5a-88ff-6a8cfef3f939', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:d5:42', 'vm-uuid': 'f0ee4a50-ced9-4062-8d91-0296cc909668'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.807 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:18 np0005539504 NetworkManager[55210]: <info>  [1764402498.8088] manager: (tap9f78664f-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.810 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.815 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.816 187156 INFO os_vif [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:d5:42,bridge_name='br-int',has_traffic_filtering=True,id=9f78664f-38e8-4b5a-88ff-6a8cfef3f939,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f78664f-38')
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.955 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.955 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.955 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:95:d5:42, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 02:48:18 np0005539504 nova_compute[187152]: 2025-11-29 07:48:18.956 187156 INFO nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Using config drive
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.136 187156 INFO nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Creating config drive at /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk.config
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.141 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnbihqiof execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.270 187156 DEBUG oslo_concurrency.processutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnbihqiof" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:48:20 np0005539504 kernel: tap9f78664f-38: entered promiscuous mode
Nov 29 02:48:20 np0005539504 NetworkManager[55210]: <info>  [1764402500.3356] manager: (tap9f78664f-38): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Nov 29 02:48:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:20Z|00681|binding|INFO|Claiming lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for this chassis.
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.337 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:20Z|00682|binding|INFO|9f78664f-38e8-4b5a-88ff-6a8cfef3f939: Claiming fa:16:3e:95:d5:42 10.100.0.7
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.343 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.350 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:d5:42 10.100.0.7'], port_security=['fa:16:3e:95:d5:42 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '55ec2953-d749-48e9-8cb3-cdbf368690fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8074c60c-bc9e-40bb-8493-fc40fe113e9f, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=9f78664f-38e8-4b5a-88ff-6a8cfef3f939) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.351 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 in datapath cfd1ce3c-e516-46ef-8712-573fe4de8313 bound to our chassis
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.352 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cfd1ce3c-e516-46ef-8712-573fe4de8313
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.364 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1d023d67-87fb-42bb-83cf-48096b203888]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.365 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcfd1ce3c-e1 in ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 02:48:20 np0005539504 systemd-udevd[248528]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.367 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcfd1ce3c-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.367 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ec39f0bf-add7-4e60-ab05-1dd2821493c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.368 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ebe327-74d9-4c48-ab1c-b0ce3d2e6e73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 NetworkManager[55210]: <info>  [1764402500.3790] device (tap9f78664f-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:48:20 np0005539504 NetworkManager[55210]: <info>  [1764402500.3798] device (tap9f78664f-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.381 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[413dc484-1958-4479-9131-60ab0c81c03e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 systemd-machined[153423]: New machine qemu-87-instance-000000a9.
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.394 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:20 np0005539504 systemd[1]: Started Virtual Machine qemu-87-instance-000000a9.
Nov 29 02:48:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:20Z|00683|binding|INFO|Setting lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 ovn-installed in OVS
Nov 29 02:48:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:20Z|00684|binding|INFO|Setting lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 up in Southbound
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.401 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.405 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[38347e54-7d9c-495c-a9af-e295ee6fabc5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.435 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5e8188-a522-42eb-b5a4-bcbb6739093c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.444 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[6b9573d3-66e8-4315-98ca-51671dff8637]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 NetworkManager[55210]: <info>  [1764402500.4454] manager: (tapcfd1ce3c-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.477 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[dd01da37-5010-4f6c-bd7e-f95f04ba2ffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.481 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[45760ef1-a1da-48f0-9bd9-fd961ecc7d56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 NetworkManager[55210]: <info>  [1764402500.5033] device (tapcfd1ce3c-e0): carrier: link connected
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.508 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[ee378aa4-4cd8-4058-8c45-b8ef4c4d8b3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.525 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[13c3d764-b308-4839-848a-b31bfc4f5744]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfd1ce3c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:05:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 796338, 'reachable_time': 20450, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248562, 'error': None, 'target': 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.541 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c0fc6022-e1cb-44f0-9913-a82a04dbe8c0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:506'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 796338, 'tstamp': 796338}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248563, 'error': None, 'target': 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.558 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[956bf9f9-bb0a-44ef-bfbb-1d8b4cf301f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfd1ce3c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:05:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 206], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 796338, 'reachable_time': 20450, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248564, 'error': None, 'target': 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.585 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[980e8c2e-b973-4c1a-bad0-932318f58567]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.640 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[568a589c-135f-4375-8f20-859bee21e568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.644 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfd1ce3c-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.644 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.645 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfd1ce3c-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.647 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:20 np0005539504 NetworkManager[55210]: <info>  [1764402500.6478] manager: (tapcfd1ce3c-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Nov 29 02:48:20 np0005539504 kernel: tapcfd1ce3c-e0: entered promiscuous mode
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.650 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcfd1ce3c-e0, col_values=(('external_ids', {'iface-id': '7f4b3b3b-6ee7-4970-9e8f-3e592045a366'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.650 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:20Z|00685|binding|INFO|Releasing lport 7f4b3b3b-6ee7-4970-9e8f-3e592045a366 from this chassis (sb_readonly=0)
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.651 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.652 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cfd1ce3c-e516-46ef-8712-573fe4de8313.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cfd1ce3c-e516-46ef-8712-573fe4de8313.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.653 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4dcfaf35-17d0-4cb9-b04b-1a1ac8905cdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.654 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-cfd1ce3c-e516-46ef-8712-573fe4de8313
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/cfd1ce3c-e516-46ef-8712-573fe4de8313.pid.haproxy
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID cfd1ce3c-e516-46ef-8712-573fe4de8313
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:48:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:20.655 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'env', 'PROCESS_TAG=haproxy-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cfd1ce3c-e516-46ef-8712-573fe4de8313.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.663 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.665 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402500.6638503, f0ee4a50-ced9-4062-8d91-0296cc909668 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.665 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] VM Started (Lifecycle Event)#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.687 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.691 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402500.6642332, f0ee4a50-ced9-4062-8d91-0296cc909668 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.692 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.695 187156 DEBUG nova.compute.manager [req-a6216604-4457-4491-b102-7c68553c9c43 req-6ba65d3f-13af-4623-a5f6-fa2897953017 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.696 187156 DEBUG oslo_concurrency.lockutils [req-a6216604-4457-4491-b102-7c68553c9c43 req-6ba65d3f-13af-4623-a5f6-fa2897953017 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.696 187156 DEBUG oslo_concurrency.lockutils [req-a6216604-4457-4491-b102-7c68553c9c43 req-6ba65d3f-13af-4623-a5f6-fa2897953017 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.697 187156 DEBUG oslo_concurrency.lockutils [req-a6216604-4457-4491-b102-7c68553c9c43 req-6ba65d3f-13af-4623-a5f6-fa2897953017 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.697 187156 DEBUG nova.compute.manager [req-a6216604-4457-4491-b102-7c68553c9c43 req-6ba65d3f-13af-4623-a5f6-fa2897953017 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Processing event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.698 187156 DEBUG nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.701 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.705 187156 INFO nova.virt.libvirt.driver [-] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Instance spawned successfully.#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.706 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.716 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.719 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402500.7006888, f0ee4a50-ced9-4062-8d91-0296cc909668 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.719 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.730 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.731 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.731 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.732 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.733 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.733 187156 DEBUG nova.virt.libvirt.driver [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.770 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.774 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.808 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.833 187156 INFO nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Took 6.69 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.834 187156 DEBUG nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:20 np0005539504 nova_compute[187152]: 2025-11-29 07:48:20.965 187156 INFO nova.compute.manager [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Took 7.41 seconds to build instance.#033[00m
Nov 29 02:48:21 np0005539504 nova_compute[187152]: 2025-11-29 07:48:21.006 187156 DEBUG oslo_concurrency.lockutils [None req-6ce6ef29-fc13-40b5-95c0-9635211eea60 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:21 np0005539504 podman[248602]: 2025-11-29 07:48:21.038951137 +0000 UTC m=+0.062523049 container create a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:48:21 np0005539504 systemd[1]: Started libpod-conmon-a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19.scope.
Nov 29 02:48:21 np0005539504 podman[248602]: 2025-11-29 07:48:21.005712205 +0000 UTC m=+0.029284137 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:48:21 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:48:21 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea0edd2b94d1f1ac6c23be3444fd023ea59d62b90aff33329b42952da3aa5089/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:48:21 np0005539504 podman[248602]: 2025-11-29 07:48:21.139042333 +0000 UTC m=+0.162614265 container init a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:48:21 np0005539504 podman[248602]: 2025-11-29 07:48:21.144291763 +0000 UTC m=+0.167863685 container start a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 02:48:21 np0005539504 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[248618]: [NOTICE]   (248622) : New worker (248624) forked
Nov 29 02:48:21 np0005539504 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[248618]: [NOTICE]   (248622) : Loading success.
Nov 29 02:48:21 np0005539504 nova_compute[187152]: 2025-11-29 07:48:21.930 187156 DEBUG nova.network.neutron [req-6726e7d3-b81d-4fbd-adaf-98abe841b107 req-17346940-2725-4292-8c83-4d685eff8df8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updated VIF entry in instance network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:48:21 np0005539504 nova_compute[187152]: 2025-11-29 07:48:21.932 187156 DEBUG nova.network.neutron [req-6726e7d3-b81d-4fbd-adaf-98abe841b107 req-17346940-2725-4292-8c83-4d685eff8df8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updating instance_info_cache with network_info: [{"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:21 np0005539504 nova_compute[187152]: 2025-11-29 07:48:21.963 187156 DEBUG oslo_concurrency.lockutils [req-6726e7d3-b81d-4fbd-adaf-98abe841b107 req-17346940-2725-4292-8c83-4d685eff8df8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:48:22 np0005539504 nova_compute[187152]: 2025-11-29 07:48:22.801 187156 DEBUG nova.compute.manager [req-af4da123-20d7-4b78-9f01-70274186f1ca req-aefbbe4a-06e9-4510-af4e-3ba84168ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:22 np0005539504 nova_compute[187152]: 2025-11-29 07:48:22.801 187156 DEBUG oslo_concurrency.lockutils [req-af4da123-20d7-4b78-9f01-70274186f1ca req-aefbbe4a-06e9-4510-af4e-3ba84168ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:22 np0005539504 nova_compute[187152]: 2025-11-29 07:48:22.801 187156 DEBUG oslo_concurrency.lockutils [req-af4da123-20d7-4b78-9f01-70274186f1ca req-aefbbe4a-06e9-4510-af4e-3ba84168ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:22 np0005539504 nova_compute[187152]: 2025-11-29 07:48:22.801 187156 DEBUG oslo_concurrency.lockutils [req-af4da123-20d7-4b78-9f01-70274186f1ca req-aefbbe4a-06e9-4510-af4e-3ba84168ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:22 np0005539504 nova_compute[187152]: 2025-11-29 07:48:22.801 187156 DEBUG nova.compute.manager [req-af4da123-20d7-4b78-9f01-70274186f1ca req-aefbbe4a-06e9-4510-af4e-3ba84168ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] No waiting events found dispatching network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:48:22 np0005539504 nova_compute[187152]: 2025-11-29 07:48:22.802 187156 WARNING nova.compute.manager [req-af4da123-20d7-4b78-9f01-70274186f1ca req-aefbbe4a-06e9-4510-af4e-3ba84168ac86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received unexpected event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:48:23 np0005539504 nova_compute[187152]: 2025-11-29 07:48:23.006 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:23.485 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:23.486 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:23.487 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:23 np0005539504 nova_compute[187152]: 2025-11-29 07:48:23.807 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.299 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "421821a4-de27-4068-a398-1fa04c2f928b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.300 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.322 187156 DEBUG nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.498 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.499 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.507 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.508 187156 INFO nova.compute.claims [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.512 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.746 187156 DEBUG nova.compute.provider_tree [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.771 187156 DEBUG nova.scheduler.client.report [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.806 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.807 187156 DEBUG nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.869 187156 DEBUG nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.869 187156 DEBUG nova.network.neutron [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.885 187156 INFO nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:48:24 np0005539504 nova_compute[187152]: 2025-11-29 07:48:24.902 187156 DEBUG nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:48:25 np0005539504 NetworkManager[55210]: <info>  [1764402505.0415] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Nov 29 02:48:25 np0005539504 NetworkManager[55210]: <info>  [1764402505.0427] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.050 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.054 187156 DEBUG nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.056 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.057 187156 INFO nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Creating image(s)#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.057 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "/var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.058 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "/var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.059 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "/var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.074 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.142 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.144 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.144 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.156 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:25.169 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:25.170 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:48:25 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:25.172 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.213 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:25 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:25Z|00686|binding|INFO|Releasing lport 7f4b3b3b-6ee7-4970-9e8f-3e592045a366 from this chassis (sb_readonly=0)
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.223 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.224 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.243 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:25 np0005539504 nova_compute[187152]: 2025-11-29 07:48:25.998 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk 1073741824" returned: 0 in 0.774s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.001 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.002 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.068 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.070 187156 DEBUG nova.virt.disk.api [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Checking if we can resize image /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.071 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.136 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.137 187156 DEBUG nova.virt.disk.api [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Cannot resize image /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.138 187156 DEBUG nova.objects.instance [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lazy-loading 'migration_context' on Instance uuid 421821a4-de27-4068-a398-1fa04c2f928b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.196 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.197 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Ensure instance console log exists: /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.197 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.198 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:26 np0005539504 nova_compute[187152]: 2025-11-29 07:48:26.198 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:27 np0005539504 podman[248651]: 2025-11-29 07:48:27.724275124 +0000 UTC m=+0.057401241 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:48:27 np0005539504 podman[248650]: 2025-11-29 07:48:27.742922244 +0000 UTC m=+0.063340970 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:48:27 np0005539504 podman[248649]: 2025-11-29 07:48:27.761298227 +0000 UTC m=+0.088430093 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:48:28 np0005539504 nova_compute[187152]: 2025-11-29 07:48:28.008 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:28 np0005539504 nova_compute[187152]: 2025-11-29 07:48:28.041 187156 DEBUG nova.compute.manager [req-7b887435-98ab-4592-9ac7-0555afa0e733 req-ce12883b-26e6-4294-bb3d-d63fccead11b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:28 np0005539504 nova_compute[187152]: 2025-11-29 07:48:28.042 187156 DEBUG nova.compute.manager [req-7b887435-98ab-4592-9ac7-0555afa0e733 req-ce12883b-26e6-4294-bb3d-d63fccead11b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing instance network info cache due to event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:48:28 np0005539504 nova_compute[187152]: 2025-11-29 07:48:28.042 187156 DEBUG oslo_concurrency.lockutils [req-7b887435-98ab-4592-9ac7-0555afa0e733 req-ce12883b-26e6-4294-bb3d-d63fccead11b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:48:28 np0005539504 nova_compute[187152]: 2025-11-29 07:48:28.042 187156 DEBUG oslo_concurrency.lockutils [req-7b887435-98ab-4592-9ac7-0555afa0e733 req-ce12883b-26e6-4294-bb3d-d63fccead11b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:48:28 np0005539504 nova_compute[187152]: 2025-11-29 07:48:28.042 187156 DEBUG nova.network.neutron [req-7b887435-98ab-4592-9ac7-0555afa0e733 req-ce12883b-26e6-4294-bb3d-d63fccead11b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:48:28 np0005539504 nova_compute[187152]: 2025-11-29 07:48:28.810 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:29 np0005539504 nova_compute[187152]: 2025-11-29 07:48:29.874 187156 DEBUG nova.network.neutron [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Successfully created port: 301ab096-9746-4794-9329-53d21fd7d8da _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:48:29 np0005539504 nova_compute[187152]: 2025-11-29 07:48:29.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:29 np0005539504 nova_compute[187152]: 2025-11-29 07:48:29.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:29 np0005539504 nova_compute[187152]: 2025-11-29 07:48:29.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:48:30 np0005539504 nova_compute[187152]: 2025-11-29 07:48:30.859 187156 DEBUG nova.network.neutron [req-7b887435-98ab-4592-9ac7-0555afa0e733 req-ce12883b-26e6-4294-bb3d-d63fccead11b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updated VIF entry in instance network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:48:30 np0005539504 nova_compute[187152]: 2025-11-29 07:48:30.861 187156 DEBUG nova.network.neutron [req-7b887435-98ab-4592-9ac7-0555afa0e733 req-ce12883b-26e6-4294-bb3d-d63fccead11b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updating instance_info_cache with network_info: [{"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:31 np0005539504 nova_compute[187152]: 2025-11-29 07:48:31.160 187156 DEBUG oslo_concurrency.lockutils [req-7b887435-98ab-4592-9ac7-0555afa0e733 req-ce12883b-26e6-4294-bb3d-d63fccead11b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.588 187156 DEBUG nova.network.neutron [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Successfully updated port: 301ab096-9746-4794-9329-53d21fd7d8da _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.688 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "refresh_cache-421821a4-de27-4068-a398-1fa04c2f928b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.689 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquired lock "refresh_cache-421821a4-de27-4068-a398-1fa04c2f928b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.689 187156 DEBUG nova.network.neutron [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:48:32 np0005539504 podman[248727]: 2025-11-29 07:48:32.722502147 +0000 UTC m=+0.059463027 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:48:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:32Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:d5:42 10.100.0.7
Nov 29 02:48:32 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:32Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:d5:42 10.100.0.7
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.922 187156 DEBUG nova.network.neutron [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:32 np0005539504 nova_compute[187152]: 2025-11-29 07:48:32.962 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.013 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.047 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:33 np0005539504 podman[248751]: 2025-11-29 07:48:33.113259511 +0000 UTC m=+0.091768673 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.117 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.118 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.174 187156 DEBUG oslo_concurrency.processutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.318 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.320 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5500MB free_disk=72.97832107543945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.320 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.320 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.407 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance f0ee4a50-ced9-4062-8d91-0296cc909668 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.408 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Instance 421821a4-de27-4068-a398-1fa04c2f928b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.408 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.408 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.482 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.497 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.519 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.519 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:33 np0005539504 nova_compute[187152]: 2025-11-29 07:48:33.813 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.438 187156 DEBUG nova.compute.manager [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Received event network-changed-301ab096-9746-4794-9329-53d21fd7d8da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.439 187156 DEBUG nova.compute.manager [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Refreshing instance network info cache due to event network-changed-301ab096-9746-4794-9329-53d21fd7d8da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.439 187156 DEBUG oslo_concurrency.lockutils [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-421821a4-de27-4068-a398-1fa04c2f928b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.935 187156 DEBUG nova.network.neutron [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Updating instance_info_cache with network_info: [{"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.968 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Releasing lock "refresh_cache-421821a4-de27-4068-a398-1fa04c2f928b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.969 187156 DEBUG nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Instance network_info: |[{"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.969 187156 DEBUG oslo_concurrency.lockutils [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-421821a4-de27-4068-a398-1fa04c2f928b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.969 187156 DEBUG nova.network.neutron [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Refreshing network info cache for port 301ab096-9746-4794-9329-53d21fd7d8da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.972 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Start _get_guest_xml network_info=[{"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.976 187156 WARNING nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.980 187156 DEBUG nova.virt.libvirt.host [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.981 187156 DEBUG nova.virt.libvirt.host [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.986 187156 DEBUG nova.virt.libvirt.host [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.988 187156 DEBUG nova.virt.libvirt.host [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.989 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.989 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.990 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.990 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.990 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.990 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.991 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.991 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.991 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.991 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.992 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.992 187156 DEBUG nova.virt.hardware [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.996 187156 DEBUG nova.virt.libvirt.vif [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-831703834',display_name='tempest-TestServerMultinode-server-831703834',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-831703834',id=171,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='220340bd80db4bf5af391eb2e4247a6c',ramdisk_id='',reservation_id='r-n9c110qx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-521650901',owner_user_name='tempest-TestServerMultinode-521650901-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:24Z,user_data=None,user_id='b79809b822b248ae8be15d0233f5896e',uuid=421821a4-de27-4068-a398-1fa04c2f928b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.996 187156 DEBUG nova.network.os_vif_util [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converting VIF {"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.997 187156 DEBUG nova.network.os_vif_util [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:1c:74,bridge_name='br-int',has_traffic_filtering=True,id=301ab096-9746-4794-9329-53d21fd7d8da,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap301ab096-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:48:34 np0005539504 nova_compute[187152]: 2025-11-29 07:48:34.998 187156 DEBUG nova.objects.instance [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 421821a4-de27-4068-a398-1fa04c2f928b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.017 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <uuid>421821a4-de27-4068-a398-1fa04c2f928b</uuid>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <name>instance-000000ab</name>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestServerMultinode-server-831703834</nova:name>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:48:34</nova:creationTime>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:        <nova:user uuid="b79809b822b248ae8be15d0233f5896e">tempest-TestServerMultinode-521650901-project-admin</nova:user>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:        <nova:project uuid="220340bd80db4bf5af391eb2e4247a6c">tempest-TestServerMultinode-521650901</nova:project>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:        <nova:port uuid="301ab096-9746-4794-9329-53d21fd7d8da">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <entry name="serial">421821a4-de27-4068-a398-1fa04c2f928b</entry>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <entry name="uuid">421821a4-de27-4068-a398-1fa04c2f928b</entry>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk.config"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:c4:1c:74"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <target dev="tap301ab096-97"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/console.log" append="off"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:48:35 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:48:35 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:48:35 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:48:35 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.019 187156 DEBUG nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Preparing to wait for external event network-vif-plugged-301ab096-9746-4794-9329-53d21fd7d8da prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.019 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "421821a4-de27-4068-a398-1fa04c2f928b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.019 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.020 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.020 187156 DEBUG nova.virt.libvirt.vif [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-831703834',display_name='tempest-TestServerMultinode-server-831703834',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-831703834',id=171,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='220340bd80db4bf5af391eb2e4247a6c',ramdisk_id='',reservation_id='r-n9c110qx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-521650901',owner_user_name='tempest-TestServerMultinode-521650901-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:24Z,user_data=None,user_id='b79809b822b248ae8be15d0233f5896e',uuid=421821a4-de27-4068-a398-1fa04c2f928b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.021 187156 DEBUG nova.network.os_vif_util [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converting VIF {"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.021 187156 DEBUG nova.network.os_vif_util [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:1c:74,bridge_name='br-int',has_traffic_filtering=True,id=301ab096-9746-4794-9329-53d21fd7d8da,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap301ab096-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.022 187156 DEBUG os_vif [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:1c:74,bridge_name='br-int',has_traffic_filtering=True,id=301ab096-9746-4794-9329-53d21fd7d8da,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap301ab096-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.022 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.023 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.023 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.029 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.030 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap301ab096-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.031 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap301ab096-97, col_values=(('external_ids', {'iface-id': '301ab096-9746-4794-9329-53d21fd7d8da', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:1c:74', 'vm-uuid': '421821a4-de27-4068-a398-1fa04c2f928b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.033 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:35 np0005539504 NetworkManager[55210]: <info>  [1764402515.0342] manager: (tap301ab096-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.034 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.040 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.042 187156 INFO os_vif [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:1c:74,bridge_name='br-int',has_traffic_filtering=True,id=301ab096-9746-4794-9329-53d21fd7d8da,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap301ab096-97')#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.128 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.128 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.129 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] No VIF found with MAC fa:16:3e:c4:1c:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:48:35 np0005539504 nova_compute[187152]: 2025-11-29 07:48:35.129 187156 INFO nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Using config drive#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.117 187156 INFO nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Creating config drive at /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk.config#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.122 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3ad7z92w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.259 187156 DEBUG oslo_concurrency.processutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3ad7z92w" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:36 np0005539504 kernel: tap301ab096-97: entered promiscuous mode
Nov 29 02:48:36 np0005539504 NetworkManager[55210]: <info>  [1764402516.3151] manager: (tap301ab096-97): new Tun device (/org/freedesktop/NetworkManager/Devices/311)
Nov 29 02:48:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:36Z|00687|binding|INFO|Claiming lport 301ab096-9746-4794-9329-53d21fd7d8da for this chassis.
Nov 29 02:48:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:36Z|00688|binding|INFO|301ab096-9746-4794-9329-53d21fd7d8da: Claiming fa:16:3e:c4:1c:74 10.100.0.14
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.316 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:36Z|00689|binding|INFO|Setting lport 301ab096-9746-4794-9329-53d21fd7d8da ovn-installed in OVS
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.328 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:36 np0005539504 systemd-udevd[248802]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:48:36 np0005539504 systemd-machined[153423]: New machine qemu-88-instance-000000ab.
Nov 29 02:48:36 np0005539504 NetworkManager[55210]: <info>  [1764402516.3536] device (tap301ab096-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:48:36 np0005539504 NetworkManager[55210]: <info>  [1764402516.3547] device (tap301ab096-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:48:36 np0005539504 systemd[1]: Started Virtual Machine qemu-88-instance-000000ab.
Nov 29 02:48:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:36Z|00690|binding|INFO|Setting lport 301ab096-9746-4794-9329-53d21fd7d8da up in Southbound
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.464 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:1c:74 10.100.0.14'], port_security=['fa:16:3e:c4:1c:74 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '421821a4-de27-4068-a398-1fa04c2f928b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '220340bd80db4bf5af391eb2e4247a6c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7d07af2a-16f6-4fe3-b2a4-ed6b96a38a93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb62e23-e8c7-432f-b445-db50c529fe8e, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=301ab096-9746-4794-9329-53d21fd7d8da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.466 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 301ab096-9746-4794-9329-53d21fd7d8da in datapath 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d bound to our chassis#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.468 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.479 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[16b7bc7e-66c7-4690-8b0f-fb4d3c4f7f0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.480 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7fbe5e7f-51 in ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.483 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7fbe5e7f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.483 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[9dad242e-38bd-4d25-8e8d-f9c6f6c884ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.484 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b41b995a-c412-43aa-bd84-57718390c590]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.502 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[a97f2d98-d8f0-4635-bc9c-cc9cfdfab1a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.519 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.520 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.521 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.528 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cc71c928-1abe-4f8b-84f0-c3ad4c6589a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.566 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f3777f07-31fe-4eef-8c96-79ba97b2be59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.573 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[76167edb-e3f7-45c6-babe-9ed39a053abd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 systemd-udevd[248805]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:48:36 np0005539504 NetworkManager[55210]: <info>  [1764402516.5753] manager: (tap7fbe5e7f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/312)
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.615 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[21ffe5ac-5208-4ee7-bc95-dd47e8d4f2d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.619 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[511f8ab9-cc6b-4ff8-8e99-7a49a5979df1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 NetworkManager[55210]: <info>  [1764402516.6393] device (tap7fbe5e7f-50): carrier: link connected
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.647 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[65bfc7d6-ad69-44f3-aee8-15305e068dfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.670 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[464370e3-e4f2-47d1-871f-4e7252be1072]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7fbe5e7f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:bf:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 797951, 'reachable_time': 20780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248836, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.687 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7e668dda-3334-46d6-8c5c-5c2c336f52ce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe76:bfc0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 797951, 'tstamp': 797951}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248837, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.710 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ec196071-aca6-422e-ab91-700065a1fc1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7fbe5e7f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:bf:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 797951, 'reachable_time': 20780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248838, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.748 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e41e5ce8-08ac-4f9d-b5a7-7ee2ccb35f54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.811 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b33a01-0b2a-4996-935c-96edd1de70d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.812 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fbe5e7f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.812 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.813 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7fbe5e7f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.814 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:36 np0005539504 kernel: tap7fbe5e7f-50: entered promiscuous mode
Nov 29 02:48:36 np0005539504 NetworkManager[55210]: <info>  [1764402516.8154] manager: (tap7fbe5e7f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.817 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7fbe5e7f-50, col_values=(('external_ids', {'iface-id': 'e08502a1-bdde-4e8d-89e4-c05bd265f847'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.817 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:36 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:36Z|00691|binding|INFO|Releasing lport e08502a1-bdde-4e8d-89e4-c05bd265f847 from this chassis (sb_readonly=1)
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.829 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.831 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.832 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[14e9a27d-34ff-4d14-a503-586d89ac6c8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.832 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.pid.haproxy
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:48:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:36.833 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'env', 'PROCESS_TAG=haproxy-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.926 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402516.9254332, 421821a4-de27-4068-a398-1fa04c2f928b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:48:36 np0005539504 nova_compute[187152]: 2025-11-29 07:48:36.927 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] VM Started (Lifecycle Event)#033[00m
Nov 29 02:48:37 np0005539504 nova_compute[187152]: 2025-11-29 07:48:37.149 187156 DEBUG nova.network.neutron [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Updated VIF entry in instance network info cache for port 301ab096-9746-4794-9329-53d21fd7d8da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:48:37 np0005539504 nova_compute[187152]: 2025-11-29 07:48:37.150 187156 DEBUG nova.network.neutron [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Updating instance_info_cache with network_info: [{"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:37 np0005539504 podman[248877]: 2025-11-29 07:48:37.20527216 +0000 UTC m=+0.023395538 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.016 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.084 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.100 187156 DEBUG oslo_concurrency.lockutils [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-421821a4-de27-4068-a398-1fa04c2f928b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.101 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.106 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402516.925743, 421821a4-de27-4068-a398-1fa04c2f928b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.106 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:48:38 np0005539504 podman[248877]: 2025-11-29 07:48:38.147449878 +0000 UTC m=+0.965573236 container create 42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.174 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.181 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.238 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:48:38 np0005539504 systemd[1]: Started libpod-conmon-42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5.scope.
Nov 29 02:48:38 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:48:38 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cd831e358a542aa7c5ad098ea75393a73abc190c74518bb7757c75a7637b33e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:48:38 np0005539504 podman[248877]: 2025-11-29 07:48:38.292617803 +0000 UTC m=+1.110741181 container init 42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:48:38 np0005539504 podman[248877]: 2025-11-29 07:48:38.302007466 +0000 UTC m=+1.120130824 container start 42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:48:38 np0005539504 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[248892]: [NOTICE]   (248897) : New worker (248899) forked
Nov 29 02:48:38 np0005539504 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[248892]: [NOTICE]   (248897) : Loading success.
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.521 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.522 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquired lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.522 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.523 187156 DEBUG nova.objects.instance [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lazy-loading 'info_cache' on Instance uuid f0ee4a50-ced9-4062-8d91-0296cc909668 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.606 187156 DEBUG nova.compute.manager [req-11e3984e-3b90-44b7-8482-f1701c43d1c2 req-9e2ac268-27fa-40e9-b88e-a929f76c14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Received event network-vif-plugged-301ab096-9746-4794-9329-53d21fd7d8da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.606 187156 DEBUG oslo_concurrency.lockutils [req-11e3984e-3b90-44b7-8482-f1701c43d1c2 req-9e2ac268-27fa-40e9-b88e-a929f76c14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "421821a4-de27-4068-a398-1fa04c2f928b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.606 187156 DEBUG oslo_concurrency.lockutils [req-11e3984e-3b90-44b7-8482-f1701c43d1c2 req-9e2ac268-27fa-40e9-b88e-a929f76c14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.606 187156 DEBUG oslo_concurrency.lockutils [req-11e3984e-3b90-44b7-8482-f1701c43d1c2 req-9e2ac268-27fa-40e9-b88e-a929f76c14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.607 187156 DEBUG nova.compute.manager [req-11e3984e-3b90-44b7-8482-f1701c43d1c2 req-9e2ac268-27fa-40e9-b88e-a929f76c14b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Processing event network-vif-plugged-301ab096-9746-4794-9329-53d21fd7d8da _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.608 187156 DEBUG nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.613 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402518.6133952, 421821a4-de27-4068-a398-1fa04c2f928b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.613 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.615 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.618 187156 INFO nova.virt.libvirt.driver [-] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Instance spawned successfully.#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.618 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.639 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.646 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.646 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.647 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.647 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.648 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.648 187156 DEBUG nova.virt.libvirt.driver [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.653 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.682 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.742 187156 INFO nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Took 13.69 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.742 187156 DEBUG nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.838 187156 INFO nova.compute.manager [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Took 14.40 seconds to build instance.#033[00m
Nov 29 02:48:38 np0005539504 nova_compute[187152]: 2025-11-29 07:48:38.857 187156 DEBUG oslo_concurrency.lockutils [None req-6dada36b-59e3-4c23-aede-69a6393d08ee b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.035 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.461 187156 DEBUG nova.network.neutron [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updating instance_info_cache with network_info: [{"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.477 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Releasing lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.477 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.477 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.478 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.702 187156 DEBUG nova.compute.manager [req-a5791e7c-adc0-49b2-a104-ddcc7b155803 req-4d416f2d-43c9-4152-a501-031d3ad2f7ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Received event network-vif-plugged-301ab096-9746-4794-9329-53d21fd7d8da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.703 187156 DEBUG oslo_concurrency.lockutils [req-a5791e7c-adc0-49b2-a104-ddcc7b155803 req-4d416f2d-43c9-4152-a501-031d3ad2f7ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "421821a4-de27-4068-a398-1fa04c2f928b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.703 187156 DEBUG oslo_concurrency.lockutils [req-a5791e7c-adc0-49b2-a104-ddcc7b155803 req-4d416f2d-43c9-4152-a501-031d3ad2f7ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.703 187156 DEBUG oslo_concurrency.lockutils [req-a5791e7c-adc0-49b2-a104-ddcc7b155803 req-4d416f2d-43c9-4152-a501-031d3ad2f7ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.703 187156 DEBUG nova.compute.manager [req-a5791e7c-adc0-49b2-a104-ddcc7b155803 req-4d416f2d-43c9-4152-a501-031d3ad2f7ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] No waiting events found dispatching network-vif-plugged-301ab096-9746-4794-9329-53d21fd7d8da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:48:40 np0005539504 nova_compute[187152]: 2025-11-29 07:48:40.703 187156 WARNING nova.compute.manager [req-a5791e7c-adc0-49b2-a104-ddcc7b155803 req-4d416f2d-43c9-4152-a501-031d3ad2f7ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Received unexpected event network-vif-plugged-301ab096-9746-4794-9329-53d21fd7d8da for instance with vm_state active and task_state None.#033[00m
Nov 29 02:48:40 np0005539504 podman[248908]: 2025-11-29 07:48:40.75616783 +0000 UTC m=+0.075483506 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:48:43 np0005539504 nova_compute[187152]: 2025-11-29 07:48:43.019 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:45 np0005539504 nova_compute[187152]: 2025-11-29 07:48:45.039 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:46 np0005539504 podman[248928]: 2025-11-29 07:48:46.732360372 +0000 UTC m=+0.063062393 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Nov 29 02:48:46 np0005539504 nova_compute[187152]: 2025-11-29 07:48:46.912 187156 DEBUG oslo_concurrency.lockutils [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "421821a4-de27-4068-a398-1fa04c2f928b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:46 np0005539504 nova_compute[187152]: 2025-11-29 07:48:46.912 187156 DEBUG oslo_concurrency.lockutils [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:46 np0005539504 nova_compute[187152]: 2025-11-29 07:48:46.912 187156 DEBUG oslo_concurrency.lockutils [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "421821a4-de27-4068-a398-1fa04c2f928b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:46 np0005539504 nova_compute[187152]: 2025-11-29 07:48:46.913 187156 DEBUG oslo_concurrency.lockutils [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:46 np0005539504 nova_compute[187152]: 2025-11-29 07:48:46.913 187156 DEBUG oslo_concurrency.lockutils [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:46 np0005539504 nova_compute[187152]: 2025-11-29 07:48:46.925 187156 INFO nova.compute.manager [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Terminating instance#033[00m
Nov 29 02:48:46 np0005539504 nova_compute[187152]: 2025-11-29 07:48:46.937 187156 DEBUG nova.compute.manager [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:48:46 np0005539504 kernel: tap301ab096-97 (unregistering): left promiscuous mode
Nov 29 02:48:46 np0005539504 NetworkManager[55210]: <info>  [1764402526.9582] device (tap301ab096-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:48:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:47Z|00692|binding|INFO|Releasing lport 301ab096-9746-4794-9329-53d21fd7d8da from this chassis (sb_readonly=0)
Nov 29 02:48:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:47Z|00693|binding|INFO|Setting lport 301ab096-9746-4794-9329-53d21fd7d8da down in Southbound
Nov 29 02:48:47 np0005539504 ovn_controller[95182]: 2025-11-29T07:48:47Z|00694|binding|INFO|Removing iface tap301ab096-97 ovn-installed in OVS
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.004 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.007 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.017 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:47 np0005539504 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Nov 29 02:48:47 np0005539504 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000ab.scope: Consumed 8.942s CPU time.
Nov 29 02:48:47 np0005539504 systemd-machined[153423]: Machine qemu-88-instance-000000ab terminated.
Nov 29 02:48:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:47.182 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:1c:74 10.100.0.14'], port_security=['fa:16:3e:c4:1c:74 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '421821a4-de27-4068-a398-1fa04c2f928b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '220340bd80db4bf5af391eb2e4247a6c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7d07af2a-16f6-4fe3-b2a4-ed6b96a38a93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb62e23-e8c7-432f-b445-db50c529fe8e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=301ab096-9746-4794-9329-53d21fd7d8da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:48:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:47.186 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 301ab096-9746-4794-9329-53d21fd7d8da in datapath 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d unbound from our chassis#033[00m
Nov 29 02:48:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:47.189 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:48:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:47.192 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0445df7c-4d0e-4dc1-8350-66a1d21d1cde]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:47.193 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d namespace which is not needed anymore#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.226 187156 INFO nova.virt.libvirt.driver [-] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Instance destroyed successfully.#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.227 187156 DEBUG nova.objects.instance [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lazy-loading 'resources' on Instance uuid 421821a4-de27-4068-a398-1fa04c2f928b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.250 187156 DEBUG nova.virt.libvirt.vif [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:48:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-831703834',display_name='tempest-TestServerMultinode-server-831703834',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-831703834',id=171,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:48:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='220340bd80db4bf5af391eb2e4247a6c',ramdisk_id='',reservation_id='r-n9c110qx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-521650901',owner_user_name='tempest-TestServerMultinode-521650901-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:48:38Z,user_data=None,user_id='b79809b822b248ae8be15d0233f5896e',uuid=421821a4-de27-4068-a398-1fa04c2f928b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.251 187156 DEBUG nova.network.os_vif_util [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converting VIF {"id": "301ab096-9746-4794-9329-53d21fd7d8da", "address": "fa:16:3e:c4:1c:74", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap301ab096-97", "ovs_interfaceid": "301ab096-9746-4794-9329-53d21fd7d8da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.251 187156 DEBUG nova.network.os_vif_util [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:1c:74,bridge_name='br-int',has_traffic_filtering=True,id=301ab096-9746-4794-9329-53d21fd7d8da,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap301ab096-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.252 187156 DEBUG os_vif [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:1c:74,bridge_name='br-int',has_traffic_filtering=True,id=301ab096-9746-4794-9329-53d21fd7d8da,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap301ab096-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.255 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.255 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap301ab096-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.257 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.259 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.262 187156 INFO os_vif [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:1c:74,bridge_name='br-int',has_traffic_filtering=True,id=301ab096-9746-4794-9329-53d21fd7d8da,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap301ab096-97')#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.263 187156 INFO nova.virt.libvirt.driver [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Deleting instance files /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b_del#033[00m
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.264 187156 INFO nova.virt.libvirt.driver [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Deletion of /var/lib/nova/instances/421821a4-de27-4068-a398-1fa04c2f928b_del complete#033[00m
Nov 29 02:48:47 np0005539504 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[248892]: [NOTICE]   (248897) : haproxy version is 2.8.14-c23fe91
Nov 29 02:48:47 np0005539504 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[248892]: [NOTICE]   (248897) : path to executable is /usr/sbin/haproxy
Nov 29 02:48:47 np0005539504 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[248892]: [WARNING]  (248897) : Exiting Master process...
Nov 29 02:48:47 np0005539504 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[248892]: [ALERT]    (248897) : Current worker (248899) exited with code 143 (Terminated)
Nov 29 02:48:47 np0005539504 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[248892]: [WARNING]  (248897) : All workers exited. Exiting... (0)
Nov 29 02:48:47 np0005539504 systemd[1]: libpod-42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5.scope: Deactivated successfully.
Nov 29 02:48:47 np0005539504 podman[248989]: 2025-11-29 07:48:47.462197073 +0000 UTC m=+0.144171109 container died 42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:48:47 np0005539504 nova_compute[187152]: 2025-11-29 07:48:47.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:47 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5-userdata-shm.mount: Deactivated successfully.
Nov 29 02:48:47 np0005539504 systemd[1]: var-lib-containers-storage-overlay-9cd831e358a542aa7c5ad098ea75393a73abc190c74518bb7757c75a7637b33e-merged.mount: Deactivated successfully.
Nov 29 02:48:47 np0005539504 podman[248989]: 2025-11-29 07:48:47.972896566 +0000 UTC m=+0.654870582 container cleanup 42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:48:47 np0005539504 systemd[1]: libpod-conmon-42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5.scope: Deactivated successfully.
Nov 29 02:48:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:47.994 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'name': 'tempest-TestNetworkBasicOps-server-1803258578', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a9', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ec8b80be17a14d1caf666636283749d0', 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'hostId': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:48:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:47.999 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '421821a4-de27-4068-a398-1fa04c2f928b', 'name': 'tempest-TestServerMultinode-server-831703834', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000ab', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '220340bd80db4bf5af391eb2e4247a6c', 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'hostId': '740973069c74fb82f9413284e6e4c354aa084367cfc6d40e98bc5dfc', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:47.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:48:48 np0005539504 nova_compute[187152]: 2025-11-29 07:48:48.020 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.033 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.034 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.036 12 DEBUG ceilometer.compute.pollsters [-] Instance 421821a4-de27-4068-a398-1fa04c2f928b was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a8e99dc-950b-4cc2-a78d-f61d72961c89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-vda', 'timestamp': '2025-11-29T07:48:48.000166', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6bf7074-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': 'c6e609cd90b70a833a37cc15f6044554e2286c90b5784f2771a912210c31987e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-sda', 'timestamp': '2025-11-29T07:48:48.000166', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6bf8cbc-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': '6eff0bfe76a541615b95c9ce9fd3e1c90e1b1968c69873310405d4931c015086'}]}, 'timestamp': '2025-11-29 07:48:48.036590', '_unique_id': 'f2e40b7b30864b0d988c2c7460623836'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.038 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.041 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.044 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f0ee4a50-ced9-4062-8d91-0296cc909668 / tap9f78664f-38 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.045 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.046 12 DEBUG ceilometer.compute.pollsters [-] Instance 421821a4-de27-4068-a398-1fa04c2f928b was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24f29d1d-b314-4ba3-a0b0-4892486a19a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.041868', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6c12b08-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': '967c18bce3f2960200554e12e0026fbd6cdbec52d9daaba943d5bc420e9132ea'}]}, 'timestamp': '2025-11-29 07:48:48.046982', '_unique_id': 'ad95d1969b6543c28f8975dc0455fc78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.048 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.050 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.051 12 DEBUG ceilometer.compute.pollsters [-] Instance 421821a4-de27-4068-a398-1fa04c2f928b was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87f2fa6b-b0af-423b-a22e-ee9b36121339', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.050110', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6c1de0e-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': 'e083123f5ed046e6b23e38efc83c87b09ef43493d8f2acc58e1181dccbb2d7e8'}]}, 'timestamp': '2025-11-29 07:48:48.051317', '_unique_id': '1dfd24ada9434eed880e6dd94d8ea22d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.053 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.063 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.064 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.065 12 DEBUG ceilometer.compute.pollsters [-] Instance 421821a4-de27-4068-a398-1fa04c2f928b was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40ed995f-2dfb-4ffa-8e2f-4d82c29c0f42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-vda', 'timestamp': '2025-11-29T07:48:48.053648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6c3f68a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.988336953, 'message_signature': 'a68f75763581e1a489680dbddc52d592914a0d60ed553c4989dcd56939ede230'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-sda', 'timestamp': '2025-11-29T07:48:48.053648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6c40486-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.988336953, 'message_signature': '6bd7211509ede8e683c15594aaa5225f5bdf8f836e513b58832017559ed64c86'}]}, 'timestamp': '2025-11-29 07:48:48.065317', '_unique_id': '5ed87264125848a4acd2270f150364b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.066 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.067 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.read.latency volume: 176839736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.067 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.read.latency volume: 17124215 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.068 12 DEBUG ceilometer.compute.pollsters [-] Instance 421821a4-de27-4068-a398-1fa04c2f928b was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6bfa79f-2a43-424b-8ec9-002f869c3c80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 176839736, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-vda', 'timestamp': '2025-11-29T07:48:48.067499', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6c4835c-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': '8c75cba430ee9a4576fc9383e6e38ab7a8c9a6ffc4b6b22a57cd3653c36e3ef6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17124215, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-sda', 'timestamp': '2025-11-29T07:48:48.067499', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6c48dde-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': 'd4b166a3520638125cd08a7a1fc3563ea01dc699d0d07c0adc1c858fa1d76a00'}]}, 'timestamp': '2025-11-29 07:48:48.068625', '_unique_id': '8fc0a151a8524cff9e9b900352673d77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.069 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.070 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.070 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.070 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.071 12 DEBUG ceilometer.compute.pollsters [-] Instance 421821a4-de27-4068-a398-1fa04c2f928b was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd1cdc05-a45a-491a-bf7c-ebbc6a256687', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-vda', 'timestamp': '2025-11-29T07:48:48.070544', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6c4fbf2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': '6a09772c61d23c5cd28980015368f8ddb0fee741b2bc99d4c05afcc880713be3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-sda', 'timestamp': '2025-11-29T07:48:48.070544', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6c50818-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': 'a5e9371c6bca733995347adf09f951be406be19b3ac6aca00b8a85dcaa6c6d38'}]}, 'timestamp': '2025-11-29 07:48:48.071977', '_unique_id': '00352085ce7e4a00829f65f4304f8a8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.073 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.074 12 DEBUG ceilometer.compute.pollsters [-] Instance 421821a4-de27-4068-a398-1fa04c2f928b was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90dc5318-907d-4f56-9103-b05e2dfe710d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.073771', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6c578e8-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': '39d5f0b49b1d00fbedaa8331278fb05a6041948c99dce419dc74be68e6c3e959'}]}, 'timestamp': '2025-11-29 07:48:48.074878', '_unique_id': 'db36ef741fb047928fbc5fa65d8743be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.076 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:48:48 np0005539504 nova_compute[187152]: 2025-11-29 07:48:48.084 187156 INFO nova.compute.manager [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Took 1.15 seconds to destroy the instance on the hypervisor.
Nov 29 02:48:48 np0005539504 nova_compute[187152]: 2025-11-29 07:48:48.086 187156 DEBUG oslo.service.loopingcall [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 02:48:48 np0005539504 nova_compute[187152]: 2025-11-29 07:48:48.087 187156 DEBUG nova.compute.manager [-] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 02:48:48 np0005539504 nova_compute[187152]: 2025-11-29 07:48:48.088 187156 DEBUG nova.network.neutron [-] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.097 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/cpu volume: 11930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.098 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72a0bc7f-e1e9-4f32-87af-9e15596c28ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11930000000, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'timestamp': '2025-11-29T07:48:48.076515', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd6c91aa2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7991.031809219, 'message_signature': '65eb8b60d4ec26b30b170dc86b23b6dc7527d8ae43af55e853df0b65559e1046'}]}, 'timestamp': '2025-11-29 07:48:48.101733', '_unique_id': '3d666b9c83874efe87e89c4d31c3aacc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.104 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.105 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5eb9e2d-3c46-4ed4-a5ce-3bc0e1709a14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-vda', 'timestamp': '2025-11-29T07:48:48.104864', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6ca3874-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.988336953, 'message_signature': '1bf6a72ba93aec417414087775c09b3f72bb9cf1efc8e965a8555ac46ebeea91'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-sda', 'timestamp': '2025-11-29T07:48:48.104864', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6ca438c-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.988336953, 'message_signature': 'c2a4b29617e32d555e065a221c374e140e34a96ef502b36785ad78f04cea9456'}]}, 'timestamp': '2025-11-29 07:48:48.106322', '_unique_id': '58287279e7c5456190284270014195cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.108 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.108 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50801f6b-c302-483a-9acf-8a39eaddc2eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.108087', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6cab4de-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': 'eb6aeb8bb07862af0a2e24672725e7bf9a04dad67e9f8af4dc0c73da2a8b01f0'}]}, 'timestamp': '2025-11-29 07:48:48.108879', '_unique_id': '6e56f4bce90643728e77f6ac23cde5d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.110 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49b8365e-c4f8-439e-a7df-47227a3440bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.110464', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6cb119a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': '4368c133d1a750fbdf258099e1550225b482b3a7eeb770d96dfc73118019b5eb'}]}, 'timestamp': '2025-11-29 07:48:48.111192', '_unique_id': '9a8da9c63095461b90c2875a774df4cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.112 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a726084e-c941-492b-8f37-0b00edf51c13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.112773', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6cb6b9a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': '086c86714d9c80612bc49ec24ca3edb1cdec9d5311e62be68baed723fc622249'}]}, 'timestamp': '2025-11-29 07:48:48.113514', '_unique_id': '327aea7b35134f19a382f4874417b408'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.114 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.114 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/memory.usage volume: 46.51171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.115 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2642933e-90c6-418f-a216-79e5bca9dfc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.51171875, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'timestamp': '2025-11-29T07:48:48.114954', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd6cbc144-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7991.031809219, 'message_signature': 'ca9f556a110281b2a69016e4976a3d9562a0910bbccedef9c1832cc3d05a10b6'}]}, 'timestamp': '2025-11-29 07:48:48.115738', '_unique_id': 'ff1c4f8289374540a6a22e3b7beb375f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.117 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.read.bytes volume: 30366208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.117 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.118 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42f21665-0dee-454a-95f3-fc3c69628690', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30366208, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-vda', 'timestamp': '2025-11-29T07:48:48.117284', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6cc1e8c-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': '2760de03d88ab89e0713828ea613729881fed2e2d925d05bd4ac67655c6d3029'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 
'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-sda', 'timestamp': '2025-11-29T07:48:48.117284', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6cc2ba2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': '510b7e6ad8505d51ef0cdfc6f8212145c0b7a662f779342d3f18315b07de3e1d'}]}, 'timestamp': '2025-11-29 07:48:48.118702', '_unique_id': '672fd1c41f6141d1b95c42593ee99148'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.120 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.121 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1803258578>, <NovaLikeServer: tempest-TestServerMultinode-server-831703834>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1803258578>, <NovaLikeServer: tempest-TestServerMultinode-server-831703834>]
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.121 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.write.latency volume: 3995613890 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.121 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.122 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '288a0d9e-c65e-4977-b0a7-306724142063', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3995613890, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-vda', 'timestamp': '2025-11-29T07:48:48.121505', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6ccc1fc-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': 'be4ebeb7cc509ae198cf21d43c8f0ce4e9433a22aa5e94400e47246a7ee828e6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 
'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-sda', 'timestamp': '2025-11-29T07:48:48.121505', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6ccceb8-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': '665e3d03751caa2f927cc0a9af7a12972ea9971a75e0b8b4cc7a4624b8f78074'}]}, 'timestamp': '2025-11-29 07:48:48.122725', '_unique_id': '1c3c2a4e09d347a4a07953752b07da35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.124 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.125 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa6e63f0-109b-4ad4-8bfa-5cc782dd21bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.124699', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6cd3f1a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': 'be3ea5bc3f3f1344c729a6ba1eb9ad894eb4ff702a6df3fb210bda7878b06ec8'}]}, 'timestamp': '2025-11-29 07:48:48.125596', '_unique_id': 'bd2392e9c8544028be649f1f4aa795c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.127 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.128 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa2e3099-c1e6-4e5c-93be-0b436b525955', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.127702', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6cdb44a-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': 'bfdc34cb5c5c68efa34ee8f859e0447503f01586070bc3b759f273b446078e9c'}]}, 'timestamp': '2025-11-29 07:48:48.128661', '_unique_id': '88d1145f8aaf48559163c0628ec6cd6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.130 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.130 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1803258578>, <NovaLikeServer: tempest-TestServerMultinode-server-831703834>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1803258578>, <NovaLikeServer: tempest-TestServerMultinode-server-831703834>]
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.131 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.131 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae6368bd-9ec4-4b69-b323-4f1054bcea17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.131102', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6ce3910-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': '2c1a5dd3f7bee0bda289bf900908bd63a133356f277444dbe10b762e5217cbb0'}]}, 'timestamp': '2025-11-29 07:48:48.132008', '_unique_id': '353d3157a4b14bb7af11ecf82744f2d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.133 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.134 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1803258578>, <NovaLikeServer: tempest-TestServerMultinode-server-831703834>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1803258578>, <NovaLikeServer: tempest-TestServerMultinode-server-831703834>]
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.134 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.135 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8c1a0ab-7bb8-4692-94ea-2b0455579859', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000a9-f0ee4a50-ced9-4062-8d91-0296cc909668-tap9f78664f-38', 'timestamp': '2025-11-29T07:48:48.134503', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'tap9f78664f-38', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:d5:42', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f78664f-38'}, 'message_id': 'd6cebdf4-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.976626529, 'message_signature': 'b82f4ceec82339351cac70ee7a1dedcf1e0ae02a45dca82848cb7152686b2f71'}]}, 'timestamp': '2025-11-29 07:48:48.135408', '_unique_id': '7163a9854fd9493b82e7665f62ee7525'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.137 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.read.requests volume: 1088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.137 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.138 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd15a1c9a-b37e-4fa5-a4c1-b0169656c3cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1088, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-vda', 'timestamp': '2025-11-29T07:48:48.137311', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6cf2cb2-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': 'c937ea05319d61ec370294a9c123cde1e7ba039d30b3485d73969477bb201857'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-sda', 'timestamp': '2025-11-29T07:48:48.137311', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6cf398c-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.934882109, 'message_signature': '4a3ca9c124414f0a7026f5ba236b51eddc6f3686720f2db5b7da3123b0568e8d'}]}, 'timestamp': '2025-11-29 07:48:48.138547', '_unique_id': '682df5eaa6a54235b47b8b8841d8917a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.140 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.140 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1803258578>, <NovaLikeServer: tempest-TestServerMultinode-server-831703834>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1803258578>, <NovaLikeServer: tempest-TestServerMultinode-server-831703834>]
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.141 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.141 12 DEBUG ceilometer.compute.pollsters [-] f0ee4a50-ced9-4062-8d91-0296cc909668/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b'
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-000000ab, id=421821a4-de27-4068-a398-1fa04c2f928b>: [Error Code 42] Domain not found: no domain with matching uuid '421821a4-de27-4068-a398-1fa04c2f928b' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e66fcf77-22cc-4ed1-9f07-4cd4e09015fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668-vda', 'timestamp': '2025-11-29T07:48:48.140969', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6cfba10-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.988336953, 'message_signature': '9380c584de077463b086da2292712215aef5f75a69b318a55c62a69f87b972b8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'f0ee4a50-ced9-4062-8d91-0296cc909668-sda', 'timestamp': '2025-11-29T07:48:48.140969', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1803258578', 'name': 'instance-000000a9', 'instance_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'instance_type': 'm1.nano', 'host': 'e585670c9b7b83f21b0a81af31535d872968f92ce9f632984648a9b9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6cfc7a8-ccf7-11f0-8a11-fa163ea726b4', 'monotonic_time': 7990.988336953, 'message_signature': '0a067a73a75efb39060713d110e99fb19432357e14c2fc41af5de426a905b2a8'}]}, 'timestamp': '2025-11-29 07:48:48.142177', '_unique_id': 'edb3399008954a5cb2c8e5ce493913a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:48:48 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:48:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:48:48 np0005539504 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:48:48 np0005539504 podman[249020]: 2025-11-29 07:48:48.246844325 +0000 UTC m=+0.247015478 container remove 42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:48:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:48.253 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2d58a755-f264-4f75-bcfc-1916e5946b83]: (4, ('Sat Nov 29 07:48:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d (42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5)\n42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5\nSat Nov 29 07:48:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d (42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5)\n42660f0a31a4228f0abf7408c5c59585a0e4e7920463a06b14498706f02be8f5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:48.255 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[72f7a4c5-a47e-4db6-9bd0-194df40b1350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:48.256 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fbe5e7f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:48 np0005539504 kernel: tap7fbe5e7f-50: left promiscuous mode
Nov 29 02:48:48 np0005539504 nova_compute[187152]: 2025-11-29 07:48:48.259 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:48 np0005539504 nova_compute[187152]: 2025-11-29 07:48:48.278 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:48.282 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4541a60d-4c5f-434c-aa76-9ca88d0ebb09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:48.304 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b80ac7b4-4e54-49db-b0a4-ebec208a1705]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:48.305 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[dc687e1b-b313-475d-850f-ca034286a695]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:48.322 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba64c90-39de-4c30-aa50-a5c46e9c7ac7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 797943, 'reachable_time': 24353, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249037, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:48.325 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:48:48 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:48:48.325 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[fc3fce79-c20a-451e-b95b-e4baa13a7d0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:48 np0005539504 systemd[1]: run-netns-ovnmeta\x2d7fbe5e7f\x2d5bf0\x2d42e8\x2d9d22\x2dc7ee6968433d.mount: Deactivated successfully.
Nov 29 02:48:50 np0005539504 nova_compute[187152]: 2025-11-29 07:48:50.982 187156 DEBUG nova.network.neutron [-] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.001 187156 INFO nova.compute.manager [-] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Took 2.91 seconds to deallocate network for instance.#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.051 187156 DEBUG nova.compute.manager [req-3896f9e9-3239-4b61-94bc-fc9a12edb0d4 req-7a7736e2-e9da-4d28-989b-3a165edfc7d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Received event network-vif-deleted-301ab096-9746-4794-9329-53d21fd7d8da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.075 187156 DEBUG oslo_concurrency.lockutils [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.076 187156 DEBUG oslo_concurrency.lockutils [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.133 187156 DEBUG nova.compute.provider_tree [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.146 187156 DEBUG nova.scheduler.client.report [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.178 187156 DEBUG oslo_concurrency.lockutils [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.207 187156 INFO nova.scheduler.client.report [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Deleted allocations for instance 421821a4-de27-4068-a398-1fa04c2f928b#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.284 187156 DEBUG oslo_concurrency.lockutils [None req-3745a126-f0b6-413c-a02f-96786dc7580c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.413 187156 DEBUG nova.compute.manager [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Received event network-vif-unplugged-301ab096-9746-4794-9329-53d21fd7d8da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.414 187156 DEBUG oslo_concurrency.lockutils [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "421821a4-de27-4068-a398-1fa04c2f928b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.414 187156 DEBUG oslo_concurrency.lockutils [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.414 187156 DEBUG oslo_concurrency.lockutils [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.415 187156 DEBUG nova.compute.manager [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] No waiting events found dispatching network-vif-unplugged-301ab096-9746-4794-9329-53d21fd7d8da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.415 187156 WARNING nova.compute.manager [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Received unexpected event network-vif-unplugged-301ab096-9746-4794-9329-53d21fd7d8da for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.415 187156 DEBUG nova.compute.manager [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Received event network-vif-plugged-301ab096-9746-4794-9329-53d21fd7d8da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.415 187156 DEBUG oslo_concurrency.lockutils [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "421821a4-de27-4068-a398-1fa04c2f928b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.416 187156 DEBUG oslo_concurrency.lockutils [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.416 187156 DEBUG oslo_concurrency.lockutils [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "421821a4-de27-4068-a398-1fa04c2f928b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.416 187156 DEBUG nova.compute.manager [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] No waiting events found dispatching network-vif-plugged-301ab096-9746-4794-9329-53d21fd7d8da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.417 187156 WARNING nova.compute.manager [req-8ba012f7-7539-4ed8-b0c4-b9c001f5de6f req-b0e5a8b4-9440-4c45-ad31-312a0fc649de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Received unexpected event network-vif-plugged-301ab096-9746-4794-9329-53d21fd7d8da for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:48:51 np0005539504 nova_compute[187152]: 2025-11-29 07:48:51.933 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:52 np0005539504 nova_compute[187152]: 2025-11-29 07:48:52.258 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:53 np0005539504 nova_compute[187152]: 2025-11-29 07:48:53.021 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:57 np0005539504 nova_compute[187152]: 2025-11-29 07:48:57.261 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:58 np0005539504 nova_compute[187152]: 2025-11-29 07:48:58.024 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:58 np0005539504 podman[249038]: 2025-11-29 07:48:58.76903439 +0000 UTC m=+0.095495579 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:48:58 np0005539504 podman[249039]: 2025-11-29 07:48:58.772841643 +0000 UTC m=+0.098543441 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:48:58 np0005539504 podman[249040]: 2025-11-29 07:48:58.783936003 +0000 UTC m=+0.095906060 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:49:02 np0005539504 nova_compute[187152]: 2025-11-29 07:49:02.224 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402527.222994, 421821a4-de27-4068-a398-1fa04c2f928b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:49:02 np0005539504 nova_compute[187152]: 2025-11-29 07:49:02.224 187156 INFO nova.compute.manager [-] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:49:02 np0005539504 nova_compute[187152]: 2025-11-29 07:49:02.289 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:02 np0005539504 nova_compute[187152]: 2025-11-29 07:49:02.555 187156 DEBUG nova.compute.manager [None req-c13ea543-129d-4367-bd5a-af401cda34d6 - - - - - -] [instance: 421821a4-de27-4068-a398-1fa04c2f928b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:03 np0005539504 nova_compute[187152]: 2025-11-29 07:49:03.027 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:03 np0005539504 podman[249099]: 2025-11-29 07:49:03.769518792 +0000 UTC m=+0.097340409 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:49:03 np0005539504 podman[249100]: 2025-11-29 07:49:03.771583467 +0000 UTC m=+0.103554826 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:49:04 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:04Z|00695|binding|INFO|Releasing lport 7f4b3b3b-6ee7-4970-9e8f-3e592045a366 from this chassis (sb_readonly=0)
Nov 29 02:49:04 np0005539504 nova_compute[187152]: 2025-11-29 07:49:04.647 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:07 np0005539504 nova_compute[187152]: 2025-11-29 07:49:07.292 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:08 np0005539504 nova_compute[187152]: 2025-11-29 07:49:08.068 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:10 np0005539504 nova_compute[187152]: 2025-11-29 07:49:10.941 187156 INFO nova.compute.manager [None req-e1a2041e-2f30-48ad-8a56-3fb1a3be2ae0 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Get console output#033[00m
Nov 29 02:49:10 np0005539504 nova_compute[187152]: 2025-11-29 07:49:10.947 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:49:11 np0005539504 podman[249152]: 2025-11-29 07:49:11.742625978 +0000 UTC m=+0.085341015 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.146 187156 DEBUG nova.compute.manager [req-e9565b29-6305-4188-98e8-88c58767adfe req-195f55a8-ffa1-460c-87f5-d46b3e58ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.146 187156 DEBUG nova.compute.manager [req-e9565b29-6305-4188-98e8-88c58767adfe req-195f55a8-ffa1-460c-87f5-d46b3e58ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing instance network info cache due to event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.146 187156 DEBUG oslo_concurrency.lockutils [req-e9565b29-6305-4188-98e8-88c58767adfe req-195f55a8-ffa1-460c-87f5-d46b3e58ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.147 187156 DEBUG oslo_concurrency.lockutils [req-e9565b29-6305-4188-98e8-88c58767adfe req-195f55a8-ffa1-460c-87f5-d46b3e58ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.147 187156 DEBUG nova.network.neutron [req-e9565b29-6305-4188-98e8-88c58767adfe req-195f55a8-ffa1-460c-87f5-d46b3e58ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.296 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.308 187156 DEBUG nova.compute.manager [req-da978a88-2b2a-45ce-8aea-209739f7b65c req-ee9567b3-7a99-4453-ad89-1a48de39db9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-unplugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.308 187156 DEBUG oslo_concurrency.lockutils [req-da978a88-2b2a-45ce-8aea-209739f7b65c req-ee9567b3-7a99-4453-ad89-1a48de39db9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.308 187156 DEBUG oslo_concurrency.lockutils [req-da978a88-2b2a-45ce-8aea-209739f7b65c req-ee9567b3-7a99-4453-ad89-1a48de39db9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.309 187156 DEBUG oslo_concurrency.lockutils [req-da978a88-2b2a-45ce-8aea-209739f7b65c req-ee9567b3-7a99-4453-ad89-1a48de39db9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.309 187156 DEBUG nova.compute.manager [req-da978a88-2b2a-45ce-8aea-209739f7b65c req-ee9567b3-7a99-4453-ad89-1a48de39db9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] No waiting events found dispatching network-vif-unplugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:49:12 np0005539504 nova_compute[187152]: 2025-11-29 07:49:12.309 187156 WARNING nova.compute.manager [req-da978a88-2b2a-45ce-8aea-209739f7b65c req-ee9567b3-7a99-4453-ad89-1a48de39db9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received unexpected event network-vif-unplugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:49:13 np0005539504 nova_compute[187152]: 2025-11-29 07:49:13.071 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:13 np0005539504 nova_compute[187152]: 2025-11-29 07:49:13.256 187156 INFO nova.compute.manager [None req-27d8a686-cea9-4304-b2c7-60ed6a02018c 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Get console output#033[00m
Nov 29 02:49:13 np0005539504 nova_compute[187152]: 2025-11-29 07:49:13.262 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:49:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:14Z|00696|binding|INFO|Releasing lport 7f4b3b3b-6ee7-4970-9e8f-3e592045a366 from this chassis (sb_readonly=0)
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.135 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.422 187156 DEBUG nova.compute.manager [req-49f78011-cafd-4831-83c9-1403de86f671 req-cd7b68e3-ff8e-414d-ba0e-5f033e69b1c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.422 187156 DEBUG oslo_concurrency.lockutils [req-49f78011-cafd-4831-83c9-1403de86f671 req-cd7b68e3-ff8e-414d-ba0e-5f033e69b1c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.422 187156 DEBUG oslo_concurrency.lockutils [req-49f78011-cafd-4831-83c9-1403de86f671 req-cd7b68e3-ff8e-414d-ba0e-5f033e69b1c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.423 187156 DEBUG oslo_concurrency.lockutils [req-49f78011-cafd-4831-83c9-1403de86f671 req-cd7b68e3-ff8e-414d-ba0e-5f033e69b1c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.423 187156 DEBUG nova.compute.manager [req-49f78011-cafd-4831-83c9-1403de86f671 req-cd7b68e3-ff8e-414d-ba0e-5f033e69b1c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] No waiting events found dispatching network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.423 187156 WARNING nova.compute.manager [req-49f78011-cafd-4831-83c9-1403de86f671 req-cd7b68e3-ff8e-414d-ba0e-5f033e69b1c4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received unexpected event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.816 187156 DEBUG nova.network.neutron [req-e9565b29-6305-4188-98e8-88c58767adfe req-195f55a8-ffa1-460c-87f5-d46b3e58ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updated VIF entry in instance network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.816 187156 DEBUG nova.network.neutron [req-e9565b29-6305-4188-98e8-88c58767adfe req-195f55a8-ffa1-460c-87f5-d46b3e58ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updating instance_info_cache with network_info: [{"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:49:14 np0005539504 nova_compute[187152]: 2025-11-29 07:49:14.833 187156 DEBUG oslo_concurrency.lockutils [req-e9565b29-6305-4188-98e8-88c58767adfe req-195f55a8-ffa1-460c-87f5-d46b3e58ce88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:49:15 np0005539504 nova_compute[187152]: 2025-11-29 07:49:15.135 187156 DEBUG nova.compute.manager [req-0c208187-28a2-4fa6-845b-26b0b5cf509c req-2bbe843f-d9a0-4e57-96ab-c4b843339993 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:15 np0005539504 nova_compute[187152]: 2025-11-29 07:49:15.136 187156 DEBUG nova.compute.manager [req-0c208187-28a2-4fa6-845b-26b0b5cf509c req-2bbe843f-d9a0-4e57-96ab-c4b843339993 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing instance network info cache due to event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:49:15 np0005539504 nova_compute[187152]: 2025-11-29 07:49:15.137 187156 DEBUG oslo_concurrency.lockutils [req-0c208187-28a2-4fa6-845b-26b0b5cf509c req-2bbe843f-d9a0-4e57-96ab-c4b843339993 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:49:15 np0005539504 nova_compute[187152]: 2025-11-29 07:49:15.137 187156 DEBUG oslo_concurrency.lockutils [req-0c208187-28a2-4fa6-845b-26b0b5cf509c req-2bbe843f-d9a0-4e57-96ab-c4b843339993 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:49:15 np0005539504 nova_compute[187152]: 2025-11-29 07:49:15.138 187156 DEBUG nova.network.neutron [req-0c208187-28a2-4fa6-845b-26b0b5cf509c req-2bbe843f-d9a0-4e57-96ab-c4b843339993 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:49:15 np0005539504 nova_compute[187152]: 2025-11-29 07:49:15.949 187156 INFO nova.compute.manager [None req-1fd6e8d0-d80b-4f2c-9ccb-1b5c6a27197b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Get console output#033[00m
Nov 29 02:49:15 np0005539504 nova_compute[187152]: 2025-11-29 07:49:15.954 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.663 187156 DEBUG nova.compute.manager [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.663 187156 DEBUG oslo_concurrency.lockutils [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.664 187156 DEBUG oslo_concurrency.lockutils [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.664 187156 DEBUG oslo_concurrency.lockutils [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.665 187156 DEBUG nova.compute.manager [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] No waiting events found dispatching network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.665 187156 WARNING nova.compute.manager [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received unexpected event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.666 187156 DEBUG nova.compute.manager [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.666 187156 DEBUG oslo_concurrency.lockutils [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.667 187156 DEBUG oslo_concurrency.lockutils [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.667 187156 DEBUG oslo_concurrency.lockutils [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.668 187156 DEBUG nova.compute.manager [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] No waiting events found dispatching network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:49:16 np0005539504 nova_compute[187152]: 2025-11-29 07:49:16.668 187156 WARNING nova.compute.manager [req-d027d2a9-f415-4aca-8013-1b11c8c10894 req-15af0bd0-85e8-4313-b80b-0ff9b283a42f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received unexpected event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:49:17 np0005539504 nova_compute[187152]: 2025-11-29 07:49:17.332 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:17 np0005539504 podman[249175]: 2025-11-29 07:49:17.71672394 +0000 UTC m=+0.055957942 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 02:49:17 np0005539504 nova_compute[187152]: 2025-11-29 07:49:17.844 187156 DEBUG nova.network.neutron [req-0c208187-28a2-4fa6-845b-26b0b5cf509c req-2bbe843f-d9a0-4e57-96ab-c4b843339993 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updated VIF entry in instance network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:49:17 np0005539504 nova_compute[187152]: 2025-11-29 07:49:17.845 187156 DEBUG nova.network.neutron [req-0c208187-28a2-4fa6-845b-26b0b5cf509c req-2bbe843f-d9a0-4e57-96ab-c4b843339993 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updating instance_info_cache with network_info: [{"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:49:17 np0005539504 nova_compute[187152]: 2025-11-29 07:49:17.867 187156 DEBUG oslo_concurrency.lockutils [req-0c208187-28a2-4fa6-845b-26b0b5cf509c req-2bbe843f-d9a0-4e57-96ab-c4b843339993 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:49:17 np0005539504 nova_compute[187152]: 2025-11-29 07:49:17.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:18 np0005539504 nova_compute[187152]: 2025-11-29 07:49:18.073 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.651 187156 DEBUG nova.compute.manager [req-eb70a3e1-d32a-42d3-a264-0fc0029a1999 req-98b9e04d-d4ca-4919-8ea7-e5c72c0000de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.651 187156 DEBUG nova.compute.manager [req-eb70a3e1-d32a-42d3-a264-0fc0029a1999 req-98b9e04d-d4ca-4919-8ea7-e5c72c0000de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing instance network info cache due to event network-changed-9f78664f-38e8-4b5a-88ff-6a8cfef3f939. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.652 187156 DEBUG oslo_concurrency.lockutils [req-eb70a3e1-d32a-42d3-a264-0fc0029a1999 req-98b9e04d-d4ca-4919-8ea7-e5c72c0000de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.652 187156 DEBUG oslo_concurrency.lockutils [req-eb70a3e1-d32a-42d3-a264-0fc0029a1999 req-98b9e04d-d4ca-4919-8ea7-e5c72c0000de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.652 187156 DEBUG nova.network.neutron [req-eb70a3e1-d32a-42d3-a264-0fc0029a1999 req-98b9e04d-d4ca-4919-8ea7-e5c72c0000de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Refreshing network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.753 187156 DEBUG oslo_concurrency.lockutils [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.754 187156 DEBUG oslo_concurrency.lockutils [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.754 187156 DEBUG oslo_concurrency.lockutils [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.754 187156 DEBUG oslo_concurrency.lockutils [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.755 187156 DEBUG oslo_concurrency.lockutils [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.766 187156 INFO nova.compute.manager [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Terminating instance#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.775 187156 DEBUG nova.compute.manager [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:49:20 np0005539504 kernel: tap9f78664f-38 (unregistering): left promiscuous mode
Nov 29 02:49:20 np0005539504 NetworkManager[55210]: <info>  [1764402560.7994] device (tap9f78664f-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.812 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:20Z|00697|binding|INFO|Releasing lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 from this chassis (sb_readonly=0)
Nov 29 02:49:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:20Z|00698|binding|INFO|Setting lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 down in Southbound
Nov 29 02:49:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:20Z|00699|binding|INFO|Removing iface tap9f78664f-38 ovn-installed in OVS
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.814 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:20.821 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:d5:42 10.100.0.7'], port_security=['fa:16:3e:95:d5:42 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '8', 'neutron:security_group_ids': '55ec2953-d749-48e9-8cb3-cdbf368690fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8074c60c-bc9e-40bb-8493-fc40fe113e9f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=9f78664f-38e8-4b5a-88ff-6a8cfef3f939) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:49:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:20.823 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 in datapath cfd1ce3c-e516-46ef-8712-573fe4de8313 unbound from our chassis#033[00m
Nov 29 02:49:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:20.825 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfd1ce3c-e516-46ef-8712-573fe4de8313, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:49:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:20.826 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[374a6ba0-e0f3-4ce3-8db8-d6caacb697b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:20.828 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 namespace which is not needed anymore#033[00m
Nov 29 02:49:20 np0005539504 nova_compute[187152]: 2025-11-29 07:49:20.831 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:20 np0005539504 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000a9.scope: Deactivated successfully.
Nov 29 02:49:20 np0005539504 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000a9.scope: Consumed 14.988s CPU time.
Nov 29 02:49:20 np0005539504 systemd-machined[153423]: Machine qemu-87-instance-000000a9 terminated.
Nov 29 02:49:20 np0005539504 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[248618]: [NOTICE]   (248622) : haproxy version is 2.8.14-c23fe91
Nov 29 02:49:20 np0005539504 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[248618]: [NOTICE]   (248622) : path to executable is /usr/sbin/haproxy
Nov 29 02:49:20 np0005539504 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[248618]: [WARNING]  (248622) : Exiting Master process...
Nov 29 02:49:20 np0005539504 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[248618]: [ALERT]    (248622) : Current worker (248624) exited with code 143 (Terminated)
Nov 29 02:49:20 np0005539504 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[248618]: [WARNING]  (248622) : All workers exited. Exiting... (0)
Nov 29 02:49:20 np0005539504 systemd[1]: libpod-a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19.scope: Deactivated successfully.
Nov 29 02:49:20 np0005539504 podman[249223]: 2025-11-29 07:49:20.988964683 +0000 UTC m=+0.054001578 container died a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:49:21 np0005539504 kernel: tap9f78664f-38: entered promiscuous mode
Nov 29 02:49:21 np0005539504 systemd-udevd[249203]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00700|binding|INFO|Claiming lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for this chassis.
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00701|binding|INFO|9f78664f-38e8-4b5a-88ff-6a8cfef3f939: Claiming fa:16:3e:95:d5:42 10.100.0.7
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.002 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:21 np0005539504 NetworkManager[55210]: <info>  [1764402561.0035] manager: (tap9f78664f-38): new Tun device (/org/freedesktop/NetworkManager/Devices/314)
Nov 29 02:49:21 np0005539504 kernel: tap9f78664f-38 (unregistering): left promiscuous mode
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.015 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:d5:42 10.100.0.7'], port_security=['fa:16:3e:95:d5:42 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '8', 'neutron:security_group_ids': '55ec2953-d749-48e9-8cb3-cdbf368690fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8074c60c-bc9e-40bb-8493-fc40fe113e9f, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=9f78664f-38e8-4b5a-88ff-6a8cfef3f939) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00702|binding|INFO|Setting lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 ovn-installed in OVS
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00703|binding|INFO|Setting lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 up in Southbound
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.023 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00704|binding|INFO|Releasing lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 from this chassis (sb_readonly=1)
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00705|if_status|INFO|Dropped 2 log messages in last 2752 seconds (most recently, 2752 seconds ago) due to excessive rate
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00706|if_status|INFO|Not setting lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 down as sb is readonly
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00707|binding|INFO|Removing iface tap9f78664f-38 ovn-installed in OVS
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00708|binding|INFO|Releasing lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 from this chassis (sb_readonly=0)
Nov 29 02:49:21 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:21Z|00709|binding|INFO|Setting lport 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 down in Southbound
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.036 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:d5:42 10.100.0.7'], port_security=['fa:16:3e:95:d5:42 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f0ee4a50-ced9-4062-8d91-0296cc909668', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '8', 'neutron:security_group_ids': '55ec2953-d749-48e9-8cb3-cdbf368690fc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8074c60c-bc9e-40bb-8493-fc40fe113e9f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=9f78664f-38e8-4b5a-88ff-6a8cfef3f939) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:49:21 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19-userdata-shm.mount: Deactivated successfully.
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.043 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:21 np0005539504 systemd[1]: var-lib-containers-storage-overlay-ea0edd2b94d1f1ac6c23be3444fd023ea59d62b90aff33329b42952da3aa5089-merged.mount: Deactivated successfully.
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.052 187156 INFO nova.virt.libvirt.driver [-] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Instance destroyed successfully.#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.053 187156 DEBUG nova.objects.instance [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid f0ee4a50-ced9-4062-8d91-0296cc909668 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:49:21 np0005539504 podman[249223]: 2025-11-29 07:49:21.058117631 +0000 UTC m=+0.123154526 container cleanup a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.061 187156 DEBUG nova.compute.manager [req-f2936c67-90f6-4fcd-a455-7fe8774fbe73 req-27d1d1aa-b98b-465d-be10-5ba7f2846e38 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-unplugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.062 187156 DEBUG oslo_concurrency.lockutils [req-f2936c67-90f6-4fcd-a455-7fe8774fbe73 req-27d1d1aa-b98b-465d-be10-5ba7f2846e38 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.062 187156 DEBUG oslo_concurrency.lockutils [req-f2936c67-90f6-4fcd-a455-7fe8774fbe73 req-27d1d1aa-b98b-465d-be10-5ba7f2846e38 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.062 187156 DEBUG oslo_concurrency.lockutils [req-f2936c67-90f6-4fcd-a455-7fe8774fbe73 req-27d1d1aa-b98b-465d-be10-5ba7f2846e38 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.063 187156 DEBUG nova.compute.manager [req-f2936c67-90f6-4fcd-a455-7fe8774fbe73 req-27d1d1aa-b98b-465d-be10-5ba7f2846e38 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] No waiting events found dispatching network-vif-unplugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.063 187156 DEBUG nova.compute.manager [req-f2936c67-90f6-4fcd-a455-7fe8774fbe73 req-27d1d1aa-b98b-465d-be10-5ba7f2846e38 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-unplugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:49:21 np0005539504 systemd[1]: libpod-conmon-a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19.scope: Deactivated successfully.
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.073 187156 DEBUG nova.virt.libvirt.vif [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:48:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1803258578',display_name='tempest-TestNetworkBasicOps-server-1803258578',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1803258578',id=169,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHQFgudm9Fpu3rBil/k2VkBHgHTj3fslbCsk41lFkMhvsi/C//pKxMH46aCINXEl/8zMWUm+/D3mt2EQV5xHuzB03FYfe3kSmhBufzsVePZca2PKydsIMvRpwaWeJ1Ka3w==',key_name='tempest-TestNetworkBasicOps-1348264139',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:48:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-l0bda32k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:48:20Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=f0ee4a50-ced9-4062-8d91-0296cc909668,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.074 187156 DEBUG nova.network.os_vif_util [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.075 187156 DEBUG nova.network.os_vif_util [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:d5:42,bridge_name='br-int',has_traffic_filtering=True,id=9f78664f-38e8-4b5a-88ff-6a8cfef3f939,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f78664f-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.076 187156 DEBUG os_vif [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:d5:42,bridge_name='br-int',has_traffic_filtering=True,id=9f78664f-38e8-4b5a-88ff-6a8cfef3f939,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f78664f-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.079 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.080 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f78664f-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.082 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.083 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.086 187156 INFO os_vif [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:d5:42,bridge_name='br-int',has_traffic_filtering=True,id=9f78664f-38e8-4b5a-88ff-6a8cfef3f939,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f78664f-38')#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.086 187156 INFO nova.virt.libvirt.driver [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Deleting instance files /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668_del#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.087 187156 INFO nova.virt.libvirt.driver [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Deletion of /var/lib/nova/instances/f0ee4a50-ced9-4062-8d91-0296cc909668_del complete#033[00m
Nov 29 02:49:21 np0005539504 podman[249265]: 2025-11-29 07:49:21.135681004 +0000 UTC m=+0.046143406 container remove a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.141 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[b11e7cd1-47dd-4fc0-ad7b-c85eeac19f5b]: (4, ('Sat Nov 29 07:49:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 (a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19)\na80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19\nSat Nov 29 07:49:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 (a80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19)\na80d3dc5f4731832e46b0e0b91d759ad8daba32848209bc618155b608c4cff19\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.142 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4175a7b2-76c9-4bf0-95aa-b96910718bd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.143 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfd1ce3c-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.145 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:21 np0005539504 kernel: tapcfd1ce3c-e0: left promiscuous mode
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.157 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.160 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[754bf4c2-0134-4304-98b2-ce432159c3a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.169 187156 INFO nova.compute.manager [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.170 187156 DEBUG oslo.service.loopingcall [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.170 187156 DEBUG nova.compute.manager [-] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.170 187156 DEBUG nova.network.neutron [-] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.182 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c324840a-7302-43f3-81cd-9c4e9075275a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.183 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e64e8d-51e4-4230-ad33-f6ab617bfee6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.200 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ed53b90c-083c-4321-907c-375779ccd0a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 796331, 'reachable_time': 39367, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249280, 'error': None, 'target': 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.202 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:49:21 np0005539504 systemd[1]: run-netns-ovnmeta\x2dcfd1ce3c\x2de516\x2d46ef\x2d8712\x2d573fe4de8313.mount: Deactivated successfully.
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.202 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[ee0cf710-d6a0-4153-9075-94d54dc7ad30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.204 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 in datapath cfd1ce3c-e516-46ef-8712-573fe4de8313 unbound from our chassis#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.205 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfd1ce3c-e516-46ef-8712-573fe4de8313, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.205 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e41b853f-4e72-4f33-b047-ffbd5098669b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.206 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939 in datapath cfd1ce3c-e516-46ef-8712-573fe4de8313 unbound from our chassis#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.207 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfd1ce3c-e516-46ef-8712-573fe4de8313, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:49:21 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:21.207 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0cf091-f529-48fa-ab41-a8ec4e480615]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.694 187156 DEBUG nova.network.neutron [-] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.713 187156 INFO nova.compute.manager [-] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Took 0.54 seconds to deallocate network for instance.#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.782 187156 DEBUG oslo_concurrency.lockutils [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.783 187156 DEBUG oslo_concurrency.lockutils [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.853 187156 DEBUG nova.compute.provider_tree [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.881 187156 DEBUG nova.scheduler.client.report [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.915 187156 DEBUG oslo_concurrency.lockutils [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:21 np0005539504 nova_compute[187152]: 2025-11-29 07:49:21.949 187156 INFO nova.scheduler.client.report [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance f0ee4a50-ced9-4062-8d91-0296cc909668#033[00m
Nov 29 02:49:22 np0005539504 nova_compute[187152]: 2025-11-29 07:49:22.031 187156 DEBUG oslo_concurrency.lockutils [None req-0359c870-e3ba-45ab-a0d2-8a3f17f19ade 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:22 np0005539504 nova_compute[187152]: 2025-11-29 07:49:22.302 187156 DEBUG nova.network.neutron [req-eb70a3e1-d32a-42d3-a264-0fc0029a1999 req-98b9e04d-d4ca-4919-8ea7-e5c72c0000de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updated VIF entry in instance network info cache for port 9f78664f-38e8-4b5a-88ff-6a8cfef3f939. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:49:22 np0005539504 nova_compute[187152]: 2025-11-29 07:49:22.303 187156 DEBUG nova.network.neutron [req-eb70a3e1-d32a-42d3-a264-0fc0029a1999 req-98b9e04d-d4ca-4919-8ea7-e5c72c0000de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Updating instance_info_cache with network_info: [{"id": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "address": "fa:16:3e:95:d5:42", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f78664f-38", "ovs_interfaceid": "9f78664f-38e8-4b5a-88ff-6a8cfef3f939", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:49:22 np0005539504 nova_compute[187152]: 2025-11-29 07:49:22.326 187156 DEBUG oslo_concurrency.lockutils [req-eb70a3e1-d32a-42d3-a264-0fc0029a1999 req-98b9e04d-d4ca-4919-8ea7-e5c72c0000de 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f0ee4a50-ced9-4062-8d91-0296cc909668" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:49:22 np0005539504 nova_compute[187152]: 2025-11-29 07:49:22.762 187156 DEBUG nova.compute.manager [req-9307d0cd-4044-4ae0-bea3-28f2c16cf4eb req-73423256-fad0-4af6-92e9-95d3a8abf771 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-deleted-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:22 np0005539504 nova_compute[187152]: 2025-11-29 07:49:22.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.075 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.168 187156 DEBUG nova.compute.manager [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.168 187156 DEBUG oslo_concurrency.lockutils [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.169 187156 DEBUG oslo_concurrency.lockutils [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.169 187156 DEBUG oslo_concurrency.lockutils [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.170 187156 DEBUG nova.compute.manager [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] No waiting events found dispatching network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.170 187156 WARNING nova.compute.manager [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received unexpected event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.170 187156 DEBUG nova.compute.manager [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.171 187156 DEBUG oslo_concurrency.lockutils [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.171 187156 DEBUG oslo_concurrency.lockutils [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.172 187156 DEBUG oslo_concurrency.lockutils [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f0ee4a50-ced9-4062-8d91-0296cc909668-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.172 187156 DEBUG nova.compute.manager [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] No waiting events found dispatching network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.173 187156 WARNING nova.compute.manager [req-8f7a12fd-482d-4998-b60c-53dd52988423 req-dce76648-e568-4477-b936-6ea8b87a8e9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Received unexpected event network-vif-plugged-9f78664f-38e8-4b5a-88ff-6a8cfef3f939 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:49:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:23.486 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:23.487 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:23.487 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:23 np0005539504 nova_compute[187152]: 2025-11-29 07:49:23.671 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:25 np0005539504 nova_compute[187152]: 2025-11-29 07:49:25.538 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:25 np0005539504 nova_compute[187152]: 2025-11-29 07:49:25.721 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:26 np0005539504 nova_compute[187152]: 2025-11-29 07:49:26.120 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:28 np0005539504 nova_compute[187152]: 2025-11-29 07:49:28.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:29 np0005539504 podman[249284]: 2025-11-29 07:49:29.718507903 +0000 UTC m=+0.050235107 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 02:49:29 np0005539504 podman[249283]: 2025-11-29 07:49:29.721471253 +0000 UTC m=+0.057728589 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64)
Nov 29 02:49:29 np0005539504 podman[249282]: 2025-11-29 07:49:29.744241348 +0000 UTC m=+0.080046482 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:49:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:30.676 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:49:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:30.677 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:49:30 np0005539504 nova_compute[187152]: 2025-11-29 07:49:30.730 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:30 np0005539504 nova_compute[187152]: 2025-11-29 07:49:30.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:30 np0005539504 nova_compute[187152]: 2025-11-29 07:49:30.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:30 np0005539504 nova_compute[187152]: 2025-11-29 07:49:30.936 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:49:31 np0005539504 nova_compute[187152]: 2025-11-29 07:49:31.122 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:31 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:31.678 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:33 np0005539504 nova_compute[187152]: 2025-11-29 07:49:33.078 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:34 np0005539504 podman[249344]: 2025-11-29 07:49:34.725226645 +0000 UTC m=+0.067417890 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:49:34 np0005539504 podman[249345]: 2025-11-29 07:49:34.751730491 +0000 UTC m=+0.094047470 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller)
Nov 29 02:49:34 np0005539504 nova_compute[187152]: 2025-11-29 07:49:34.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:34 np0005539504 nova_compute[187152]: 2025-11-29 07:49:34.981 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:34 np0005539504 nova_compute[187152]: 2025-11-29 07:49:34.983 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:34 np0005539504 nova_compute[187152]: 2025-11-29 07:49:34.983 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:34 np0005539504 nova_compute[187152]: 2025-11-29 07:49:34.984 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.146 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.148 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5703MB free_disk=73.00629806518555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.148 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.148 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.452 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.453 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.475 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.489 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.508 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:49:35 np0005539504 nova_compute[187152]: 2025-11-29 07:49:35.509 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:36 np0005539504 nova_compute[187152]: 2025-11-29 07:49:36.051 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402561.0498962, f0ee4a50-ced9-4062-8d91-0296cc909668 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:49:36 np0005539504 nova_compute[187152]: 2025-11-29 07:49:36.052 187156 INFO nova.compute.manager [-] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:49:36 np0005539504 nova_compute[187152]: 2025-11-29 07:49:36.073 187156 DEBUG nova.compute.manager [None req-b961a250-73ef-47b9-a819-8b0aa6fae25c - - - - - -] [instance: f0ee4a50-ced9-4062-8d91-0296cc909668] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:36 np0005539504 nova_compute[187152]: 2025-11-29 07:49:36.124 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:38 np0005539504 nova_compute[187152]: 2025-11-29 07:49:38.080 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:38 np0005539504 nova_compute[187152]: 2025-11-29 07:49:38.509 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:38 np0005539504 nova_compute[187152]: 2025-11-29 07:49:38.510 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:49:38 np0005539504 nova_compute[187152]: 2025-11-29 07:49:38.510 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:49:38 np0005539504 nova_compute[187152]: 2025-11-29 07:49:38.534 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:49:38 np0005539504 nova_compute[187152]: 2025-11-29 07:49:38.534 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:39 np0005539504 nova_compute[187152]: 2025-11-29 07:49:39.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:41 np0005539504 nova_compute[187152]: 2025-11-29 07:49:41.127 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:42 np0005539504 podman[249397]: 2025-11-29 07:49:42.7593107 +0000 UTC m=+0.097399251 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:49:43 np0005539504 nova_compute[187152]: 2025-11-29 07:49:43.082 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.443 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "72880d73-5d71-455b-9281-b171fb1d024f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.443 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.459 187156 DEBUG nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.554 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.555 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.560 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.560 187156 INFO nova.compute.claims [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.672 187156 DEBUG nova.compute.provider_tree [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.687 187156 DEBUG nova.scheduler.client.report [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.704 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.705 187156 DEBUG nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.757 187156 DEBUG nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.757 187156 DEBUG nova.network.neutron [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.776 187156 INFO nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.796 187156 DEBUG nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.923 187156 DEBUG nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.925 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.925 187156 INFO nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Creating image(s)#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.926 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.927 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.928 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:44 np0005539504 nova_compute[187152]: 2025-11-29 07:49:44.953 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.015 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.016 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.017 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.028 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.066 187156 DEBUG nova.policy [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.085 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.086 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.120 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.121 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.122 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.178 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.179 187156 DEBUG nova.virt.disk.api [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.179 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.233 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.234 187156 DEBUG nova.virt.disk.api [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.234 187156 DEBUG nova.objects.instance [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 72880d73-5d71-455b-9281-b171fb1d024f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.284 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.284 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Ensure instance console log exists: /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.285 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.285 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.285 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:45 np0005539504 nova_compute[187152]: 2025-11-29 07:49:45.956 187156 DEBUG nova.network.neutron [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Successfully created port: 16299523-f464-4264-a7c7-83d865feeef5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:49:46 np0005539504 nova_compute[187152]: 2025-11-29 07:49:46.133 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:46 np0005539504 nova_compute[187152]: 2025-11-29 07:49:46.943 187156 DEBUG nova.network.neutron [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Successfully updated port: 16299523-f464-4264-a7c7-83d865feeef5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:49:46 np0005539504 nova_compute[187152]: 2025-11-29 07:49:46.970 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:49:46 np0005539504 nova_compute[187152]: 2025-11-29 07:49:46.971 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:49:46 np0005539504 nova_compute[187152]: 2025-11-29 07:49:46.971 187156 DEBUG nova.network.neutron [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:49:47 np0005539504 nova_compute[187152]: 2025-11-29 07:49:47.062 187156 DEBUG nova.compute.manager [req-a1c45119-9265-4256-9f13-8de572e66de8 req-a318b45c-0b02-4e16-a8cd-b759ebb2ab2d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received event network-changed-16299523-f464-4264-a7c7-83d865feeef5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:47 np0005539504 nova_compute[187152]: 2025-11-29 07:49:47.063 187156 DEBUG nova.compute.manager [req-a1c45119-9265-4256-9f13-8de572e66de8 req-a318b45c-0b02-4e16-a8cd-b759ebb2ab2d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Refreshing instance network info cache due to event network-changed-16299523-f464-4264-a7c7-83d865feeef5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:49:47 np0005539504 nova_compute[187152]: 2025-11-29 07:49:47.063 187156 DEBUG oslo_concurrency.lockutils [req-a1c45119-9265-4256-9f13-8de572e66de8 req-a318b45c-0b02-4e16-a8cd-b759ebb2ab2d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:49:47 np0005539504 nova_compute[187152]: 2025-11-29 07:49:47.136 187156 DEBUG nova.network.neutron [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:49:47 np0005539504 nova_compute[187152]: 2025-11-29 07:49:47.914 187156 DEBUG nova.network.neutron [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Updating instance_info_cache with network_info: [{"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.086 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.118 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.119 187156 DEBUG nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Instance network_info: |[{"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.119 187156 DEBUG oslo_concurrency.lockutils [req-a1c45119-9265-4256-9f13-8de572e66de8 req-a318b45c-0b02-4e16-a8cd-b759ebb2ab2d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.119 187156 DEBUG nova.network.neutron [req-a1c45119-9265-4256-9f13-8de572e66de8 req-a318b45c-0b02-4e16-a8cd-b759ebb2ab2d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Refreshing network info cache for port 16299523-f464-4264-a7c7-83d865feeef5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.123 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Start _get_guest_xml network_info=[{"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.127 187156 WARNING nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.133 187156 DEBUG nova.virt.libvirt.host [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.135 187156 DEBUG nova.virt.libvirt.host [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.150 187156 DEBUG nova.virt.libvirt.host [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.152 187156 DEBUG nova.virt.libvirt.host [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.153 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.153 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.154 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.154 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.155 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.155 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.155 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.155 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.156 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.156 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.156 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.157 187156 DEBUG nova.virt.hardware [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.162 187156 DEBUG nova.virt.libvirt.vif [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:49:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2135744245',display_name='tempest-TestNetworkBasicOps-server-2135744245',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2135744245',id=174,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKG7zaIO0Y0Pjj6AHTqeR3duYBB0twuZjMKL8D62oeLhfNkWHHgjzaa3DkEv+i5/8a4USUSDMH9L33olE2Lsp2Y3MxwBDCp6Ba7Jg+FTu7hQTGlHhW25NWEhBtHoxEzHPQ==',key_name='tempest-TestNetworkBasicOps-351323551',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-3s1iw0gn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:49:44Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=72880d73-5d71-455b-9281-b171fb1d024f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.162 187156 DEBUG nova.network.os_vif_util [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.163 187156 DEBUG nova.network.os_vif_util [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:5c:93,bridge_name='br-int',has_traffic_filtering=True,id=16299523-f464-4264-a7c7-83d865feeef5,network=Network(ecf9dfa5-3869-4794-b3ab-c2616cd59392),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16299523-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.164 187156 DEBUG nova.objects.instance [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 72880d73-5d71-455b-9281-b171fb1d024f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.304 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <uuid>72880d73-5d71-455b-9281-b171fb1d024f</uuid>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <name>instance-000000ae</name>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestNetworkBasicOps-server-2135744245</nova:name>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:49:48</nova:creationTime>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:        <nova:port uuid="16299523-f464-4264-a7c7-83d865feeef5">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <entry name="serial">72880d73-5d71-455b-9281-b171fb1d024f</entry>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <entry name="uuid">72880d73-5d71-455b-9281-b171fb1d024f</entry>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk.config"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:24:5c:93"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <target dev="tap16299523-f4"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/console.log" append="off"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:49:48 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:49:48 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:49:48 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:49:48 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.307 187156 DEBUG nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Preparing to wait for external event network-vif-plugged-16299523-f464-4264-a7c7-83d865feeef5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.308 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "72880d73-5d71-455b-9281-b171fb1d024f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.309 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.309 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.311 187156 DEBUG nova.virt.libvirt.vif [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:49:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2135744245',display_name='tempest-TestNetworkBasicOps-server-2135744245',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2135744245',id=174,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKG7zaIO0Y0Pjj6AHTqeR3duYBB0twuZjMKL8D62oeLhfNkWHHgjzaa3DkEv+i5/8a4USUSDMH9L33olE2Lsp2Y3MxwBDCp6Ba7Jg+FTu7hQTGlHhW25NWEhBtHoxEzHPQ==',key_name='tempest-TestNetworkBasicOps-351323551',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-3s1iw0gn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:49:44Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=72880d73-5d71-455b-9281-b171fb1d024f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.311 187156 DEBUG nova.network.os_vif_util [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.313 187156 DEBUG nova.network.os_vif_util [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:5c:93,bridge_name='br-int',has_traffic_filtering=True,id=16299523-f464-4264-a7c7-83d865feeef5,network=Network(ecf9dfa5-3869-4794-b3ab-c2616cd59392),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16299523-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.313 187156 DEBUG os_vif [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:5c:93,bridge_name='br-int',has_traffic_filtering=True,id=16299523-f464-4264-a7c7-83d865feeef5,network=Network(ecf9dfa5-3869-4794-b3ab-c2616cd59392),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16299523-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.314 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.315 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.316 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.322 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.322 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16299523-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.323 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap16299523-f4, col_values=(('external_ids', {'iface-id': '16299523-f464-4264-a7c7-83d865feeef5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:5c:93', 'vm-uuid': '72880d73-5d71-455b-9281-b171fb1d024f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.326 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:48 np0005539504 NetworkManager[55210]: <info>  [1764402588.3278] manager: (tap16299523-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.330 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.334 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.335 187156 INFO os_vif [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:5c:93,bridge_name='br-int',has_traffic_filtering=True,id=16299523-f464-4264-a7c7-83d865feeef5,network=Network(ecf9dfa5-3869-4794-b3ab-c2616cd59392),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16299523-f4')#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.528 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.529 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.529 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:24:5c:93, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.529 187156 INFO nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Using config drive#033[00m
Nov 29 02:49:48 np0005539504 podman[249435]: 2025-11-29 07:49:48.71791141 +0000 UTC m=+0.061789219 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:49:48 np0005539504 nova_compute[187152]: 2025-11-29 07:49:48.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:50 np0005539504 nova_compute[187152]: 2025-11-29 07:49:50.820 187156 INFO nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Creating config drive at /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk.config#033[00m
Nov 29 02:49:50 np0005539504 nova_compute[187152]: 2025-11-29 07:49:50.832 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9xe30f6t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:50 np0005539504 nova_compute[187152]: 2025-11-29 07:49:50.959 187156 DEBUG oslo_concurrency.processutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9xe30f6t" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:51 np0005539504 kernel: tap16299523-f4: entered promiscuous mode
Nov 29 02:49:51 np0005539504 NetworkManager[55210]: <info>  [1764402591.0178] manager: (tap16299523-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/316)
Nov 29 02:49:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:51Z|00710|binding|INFO|Claiming lport 16299523-f464-4264-a7c7-83d865feeef5 for this chassis.
Nov 29 02:49:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:51Z|00711|binding|INFO|16299523-f464-4264-a7c7-83d865feeef5: Claiming fa:16:3e:24:5c:93 10.100.0.6
Nov 29 02:49:51 np0005539504 nova_compute[187152]: 2025-11-29 07:49:51.019 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:51 np0005539504 nova_compute[187152]: 2025-11-29 07:49:51.022 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.032 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:5c:93 10.100.0.6'], port_security=['fa:16:3e:24:5c:93 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '72880d73-5d71-455b-9281-b171fb1d024f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ecf9dfa5-3869-4794-b3ab-c2616cd59392', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '07fde1dd-2f30-472c-9b6e-371dfcee3d55', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=603e5ea6-e1b6-46cc-88d5-ab46cc34ab55, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=16299523-f464-4264-a7c7-83d865feeef5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.034 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 16299523-f464-4264-a7c7-83d865feeef5 in datapath ecf9dfa5-3869-4794-b3ab-c2616cd59392 bound to our chassis#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.036 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ecf9dfa5-3869-4794-b3ab-c2616cd59392#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.046 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[57722d43-8b40-402f-bd46-8036769db482]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.047 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapecf9dfa5-31 in ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:49:51 np0005539504 systemd-udevd[249472]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.048 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapecf9dfa5-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.048 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a8dcd71a-2798-4189-9b88-4539e4aa2e32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.049 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[3fd9df1f-4ef6-4e90-a989-9586c8d7e300]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 NetworkManager[55210]: <info>  [1764402591.0619] device (tap16299523-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:49:51 np0005539504 NetworkManager[55210]: <info>  [1764402591.0631] device (tap16299523-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.062 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[8253314f-f5a7-4a47-9bc7-8467b0942e06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 systemd-machined[153423]: New machine qemu-89-instance-000000ae.
Nov 29 02:49:51 np0005539504 nova_compute[187152]: 2025-11-29 07:49:51.078 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:51Z|00712|binding|INFO|Setting lport 16299523-f464-4264-a7c7-83d865feeef5 ovn-installed in OVS
Nov 29 02:49:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:51Z|00713|binding|INFO|Setting lport 16299523-f464-4264-a7c7-83d865feeef5 up in Southbound
Nov 29 02:49:51 np0005539504 systemd[1]: Started Virtual Machine qemu-89-instance-000000ae.
Nov 29 02:49:51 np0005539504 nova_compute[187152]: 2025-11-29 07:49:51.084 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.086 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[892ab4e4-6c37-427d-8783-b126977b656f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.115 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d727fd-0c3f-43d0-838d-aa6a4c00b3e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.120 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd9a58f-3a4b-4199-b4ea-e9e57b9ffdde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 NetworkManager[55210]: <info>  [1764402591.1219] manager: (tapecf9dfa5-30): new Veth device (/org/freedesktop/NetworkManager/Devices/317)
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.150 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[bec75e11-e22c-48fd-8c9b-a54f2f675a60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.154 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[257e6b2f-21a5-4b07-95e7-39be0f048fdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 NetworkManager[55210]: <info>  [1764402591.1825] device (tapecf9dfa5-30): carrier: link connected
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.189 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[16ce19d2-4295-49f8-930a-f763a3e3ce3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.209 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8b3ac1-e981-4f37-adaa-0ecb233e18fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapecf9dfa5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:4e:75'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 212], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 805406, 'reachable_time': 20861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249506, 'error': None, 'target': 'ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.223 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7a3d5861-be47-4f06-a2e3-905ef4500991]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea2:4e75'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 805406, 'tstamp': 805406}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249507, 'error': None, 'target': 'ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.246 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[381838ab-7bc9-47fb-8888-aeea7ee7f11e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapecf9dfa5-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:4e:75'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 212], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 805406, 'reachable_time': 20861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249508, 'error': None, 'target': 'ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.282 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c4dc7c82-8c27-47c8-9668-8241c1b63cc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.345 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a2e951-464e-4ff7-afd5-f49bb84bd9c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.349 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapecf9dfa5-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.349 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.350 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapecf9dfa5-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:51 np0005539504 NetworkManager[55210]: <info>  [1764402591.3527] manager: (tapecf9dfa5-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/318)
Nov 29 02:49:51 np0005539504 nova_compute[187152]: 2025-11-29 07:49:51.352 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:51 np0005539504 kernel: tapecf9dfa5-30: entered promiscuous mode
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.354 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapecf9dfa5-30, col_values=(('external_ids', {'iface-id': 'd0d4f8b4-3c4f-461b-a755-141d05ad5573'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:51 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:51Z|00714|binding|INFO|Releasing lport d0d4f8b4-3c4f-461b-a755-141d05ad5573 from this chassis (sb_readonly=0)
Nov 29 02:49:51 np0005539504 nova_compute[187152]: 2025-11-29 07:49:51.367 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.367 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ecf9dfa5-3869-4794-b3ab-c2616cd59392.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ecf9dfa5-3869-4794-b3ab-c2616cd59392.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.368 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[497f284d-12d4-4d41-be8b-90ad92f6f424]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.369 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-ecf9dfa5-3869-4794-b3ab-c2616cd59392
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/ecf9dfa5-3869-4794-b3ab-c2616cd59392.pid.haproxy
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID ecf9dfa5-3869-4794-b3ab-c2616cd59392
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:49:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:49:51.369 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392', 'env', 'PROCESS_TAG=haproxy-ecf9dfa5-3869-4794-b3ab-c2616cd59392', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ecf9dfa5-3869-4794-b3ab-c2616cd59392.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:49:51 np0005539504 podman[249538]: 2025-11-29 07:49:51.691494761 +0000 UTC m=+0.021223304 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:49:52 np0005539504 podman[249538]: 2025-11-29 07:49:52.214053639 +0000 UTC m=+0.543782172 container create 80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 02:49:52 np0005539504 nova_compute[187152]: 2025-11-29 07:49:52.236 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402592.2357917, 72880d73-5d71-455b-9281-b171fb1d024f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:49:52 np0005539504 nova_compute[187152]: 2025-11-29 07:49:52.237 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] VM Started (Lifecycle Event)#033[00m
Nov 29 02:49:52 np0005539504 systemd[1]: Started libpod-conmon-80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4.scope.
Nov 29 02:49:52 np0005539504 nova_compute[187152]: 2025-11-29 07:49:52.260 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:52 np0005539504 nova_compute[187152]: 2025-11-29 07:49:52.264 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402592.2359984, 72880d73-5d71-455b-9281-b171fb1d024f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:49:52 np0005539504 nova_compute[187152]: 2025-11-29 07:49:52.265 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:49:52 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:49:52 np0005539504 nova_compute[187152]: 2025-11-29 07:49:52.283 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:52 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b593f0a33a5e858c09f11822dd24b1fc728fcda038d4e157277ea6f25c6231a9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:49:52 np0005539504 nova_compute[187152]: 2025-11-29 07:49:52.293 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:49:52 np0005539504 podman[249538]: 2025-11-29 07:49:52.298229102 +0000 UTC m=+0.627957665 container init 80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:49:52 np0005539504 podman[249538]: 2025-11-29 07:49:52.30372561 +0000 UTC m=+0.633454163 container start 80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:49:52 np0005539504 nova_compute[187152]: 2025-11-29 07:49:52.311 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:49:52 np0005539504 neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392[249560]: [NOTICE]   (249564) : New worker (249566) forked
Nov 29 02:49:52 np0005539504 neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392[249560]: [NOTICE]   (249564) : Loading success.
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.000 187156 DEBUG nova.compute.manager [req-308c1d9e-29c4-4dd8-81a6-0ee332066a7b req-550be019-6cd0-47f1-a405-99c0db6e1d8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received event network-vif-plugged-16299523-f464-4264-a7c7-83d865feeef5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.000 187156 DEBUG oslo_concurrency.lockutils [req-308c1d9e-29c4-4dd8-81a6-0ee332066a7b req-550be019-6cd0-47f1-a405-99c0db6e1d8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "72880d73-5d71-455b-9281-b171fb1d024f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.001 187156 DEBUG oslo_concurrency.lockutils [req-308c1d9e-29c4-4dd8-81a6-0ee332066a7b req-550be019-6cd0-47f1-a405-99c0db6e1d8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.001 187156 DEBUG oslo_concurrency.lockutils [req-308c1d9e-29c4-4dd8-81a6-0ee332066a7b req-550be019-6cd0-47f1-a405-99c0db6e1d8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.001 187156 DEBUG nova.compute.manager [req-308c1d9e-29c4-4dd8-81a6-0ee332066a7b req-550be019-6cd0-47f1-a405-99c0db6e1d8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Processing event network-vif-plugged-16299523-f464-4264-a7c7-83d865feeef5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.002 187156 DEBUG nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.005 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402593.0053575, 72880d73-5d71-455b-9281-b171fb1d024f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.005 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.008 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.011 187156 INFO nova.virt.libvirt.driver [-] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Instance spawned successfully.#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.012 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.024 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.030 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.032 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.033 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.033 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.033 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.034 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.034 187156 DEBUG nova.virt.libvirt.driver [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.061 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.121 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.145 187156 INFO nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Took 8.22 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.146 187156 DEBUG nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.170 187156 DEBUG nova.network.neutron [req-a1c45119-9265-4256-9f13-8de572e66de8 req-a318b45c-0b02-4e16-a8cd-b759ebb2ab2d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Updated VIF entry in instance network info cache for port 16299523-f464-4264-a7c7-83d865feeef5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.171 187156 DEBUG nova.network.neutron [req-a1c45119-9265-4256-9f13-8de572e66de8 req-a318b45c-0b02-4e16-a8cd-b759ebb2ab2d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Updating instance_info_cache with network_info: [{"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.202 187156 DEBUG oslo_concurrency.lockutils [req-a1c45119-9265-4256-9f13-8de572e66de8 req-a318b45c-0b02-4e16-a8cd-b759ebb2ab2d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.251 187156 INFO nova.compute.manager [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Took 8.73 seconds to build instance.#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.273 187156 DEBUG oslo_concurrency.lockutils [None req-1df0567e-4f78-4889-a094-c91700fb279b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:53 np0005539504 nova_compute[187152]: 2025-11-29 07:49:53.326 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:55 np0005539504 nova_compute[187152]: 2025-11-29 07:49:55.201 187156 DEBUG nova.compute.manager [req-dc73678e-57d5-41e4-b0c7-1166a4428d9c req-4947d0d7-f61a-4cfa-a671-c93fbc69f685 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received event network-vif-plugged-16299523-f464-4264-a7c7-83d865feeef5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:55 np0005539504 nova_compute[187152]: 2025-11-29 07:49:55.201 187156 DEBUG oslo_concurrency.lockutils [req-dc73678e-57d5-41e4-b0c7-1166a4428d9c req-4947d0d7-f61a-4cfa-a671-c93fbc69f685 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "72880d73-5d71-455b-9281-b171fb1d024f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:55 np0005539504 nova_compute[187152]: 2025-11-29 07:49:55.201 187156 DEBUG oslo_concurrency.lockutils [req-dc73678e-57d5-41e4-b0c7-1166a4428d9c req-4947d0d7-f61a-4cfa-a671-c93fbc69f685 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:55 np0005539504 nova_compute[187152]: 2025-11-29 07:49:55.202 187156 DEBUG oslo_concurrency.lockutils [req-dc73678e-57d5-41e4-b0c7-1166a4428d9c req-4947d0d7-f61a-4cfa-a671-c93fbc69f685 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:55 np0005539504 nova_compute[187152]: 2025-11-29 07:49:55.202 187156 DEBUG nova.compute.manager [req-dc73678e-57d5-41e4-b0c7-1166a4428d9c req-4947d0d7-f61a-4cfa-a671-c93fbc69f685 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] No waiting events found dispatching network-vif-plugged-16299523-f464-4264-a7c7-83d865feeef5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:49:55 np0005539504 nova_compute[187152]: 2025-11-29 07:49:55.202 187156 WARNING nova.compute.manager [req-dc73678e-57d5-41e4-b0c7-1166a4428d9c req-4947d0d7-f61a-4cfa-a671-c93fbc69f685 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received unexpected event network-vif-plugged-16299523-f464-4264-a7c7-83d865feeef5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:49:57 np0005539504 nova_compute[187152]: 2025-11-29 07:49:57.932 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:57 np0005539504 NetworkManager[55210]: <info>  [1764402597.9371] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Nov 29 02:49:57 np0005539504 NetworkManager[55210]: <info>  [1764402597.9384] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Nov 29 02:49:58 np0005539504 nova_compute[187152]: 2025-11-29 07:49:58.122 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:58 np0005539504 nova_compute[187152]: 2025-11-29 07:49:58.126 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:58 np0005539504 ovn_controller[95182]: 2025-11-29T07:49:58Z|00715|binding|INFO|Releasing lport d0d4f8b4-3c4f-461b-a755-141d05ad5573 from this chassis (sb_readonly=0)
Nov 29 02:49:58 np0005539504 nova_compute[187152]: 2025-11-29 07:49:58.145 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:58 np0005539504 nova_compute[187152]: 2025-11-29 07:49:58.327 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:58 np0005539504 nova_compute[187152]: 2025-11-29 07:49:58.552 187156 DEBUG nova.compute.manager [req-c6110c97-862c-43e2-bdcc-87d14cb760cb req-eca0af04-9794-4c31-b873-0572fc761a6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received event network-changed-16299523-f464-4264-a7c7-83d865feeef5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:58 np0005539504 nova_compute[187152]: 2025-11-29 07:49:58.553 187156 DEBUG nova.compute.manager [req-c6110c97-862c-43e2-bdcc-87d14cb760cb req-eca0af04-9794-4c31-b873-0572fc761a6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Refreshing instance network info cache due to event network-changed-16299523-f464-4264-a7c7-83d865feeef5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:49:58 np0005539504 nova_compute[187152]: 2025-11-29 07:49:58.553 187156 DEBUG oslo_concurrency.lockutils [req-c6110c97-862c-43e2-bdcc-87d14cb760cb req-eca0af04-9794-4c31-b873-0572fc761a6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:49:58 np0005539504 nova_compute[187152]: 2025-11-29 07:49:58.553 187156 DEBUG oslo_concurrency.lockutils [req-c6110c97-862c-43e2-bdcc-87d14cb760cb req-eca0af04-9794-4c31-b873-0572fc761a6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:49:58 np0005539504 nova_compute[187152]: 2025-11-29 07:49:58.554 187156 DEBUG nova.network.neutron [req-c6110c97-862c-43e2-bdcc-87d14cb760cb req-eca0af04-9794-4c31-b873-0572fc761a6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Refreshing network info cache for port 16299523-f464-4264-a7c7-83d865feeef5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:50:00 np0005539504 nova_compute[187152]: 2025-11-29 07:50:00.138 187156 DEBUG nova.network.neutron [req-c6110c97-862c-43e2-bdcc-87d14cb760cb req-eca0af04-9794-4c31-b873-0572fc761a6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Updated VIF entry in instance network info cache for port 16299523-f464-4264-a7c7-83d865feeef5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:50:00 np0005539504 nova_compute[187152]: 2025-11-29 07:50:00.139 187156 DEBUG nova.network.neutron [req-c6110c97-862c-43e2-bdcc-87d14cb760cb req-eca0af04-9794-4c31-b873-0572fc761a6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Updating instance_info_cache with network_info: [{"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:50:00 np0005539504 nova_compute[187152]: 2025-11-29 07:50:00.164 187156 DEBUG oslo_concurrency.lockutils [req-c6110c97-862c-43e2-bdcc-87d14cb760cb req-eca0af04-9794-4c31-b873-0572fc761a6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:50:00 np0005539504 podman[249578]: 2025-11-29 07:50:00.710679441 +0000 UTC m=+0.049196769 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:50:00 np0005539504 podman[249577]: 2025-11-29 07:50:00.715979474 +0000 UTC m=+0.059467467 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:50:00 np0005539504 podman[249576]: 2025-11-29 07:50:00.739747846 +0000 UTC m=+0.084025370 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:50:03 np0005539504 nova_compute[187152]: 2025-11-29 07:50:03.126 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:03 np0005539504 nova_compute[187152]: 2025-11-29 07:50:03.332 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:05 np0005539504 podman[249654]: 2025-11-29 07:50:05.725439618 +0000 UTC m=+0.063615638 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:50:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:05Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:5c:93 10.100.0.6
Nov 29 02:50:05 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:05Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:5c:93 10.100.0.6
Nov 29 02:50:05 np0005539504 podman[249655]: 2025-11-29 07:50:05.800155796 +0000 UTC m=+0.119214989 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:50:08 np0005539504 nova_compute[187152]: 2025-11-29 07:50:08.165 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:08 np0005539504 nova_compute[187152]: 2025-11-29 07:50:08.334 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:13 np0005539504 nova_compute[187152]: 2025-11-29 07:50:13.144 187156 INFO nova.compute.manager [None req-2aa4cd9f-f0ff-45ea-ad3a-db5380efa4f6 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Get console output#033[00m
Nov 29 02:50:13 np0005539504 nova_compute[187152]: 2025-11-29 07:50:13.152 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:50:13 np0005539504 nova_compute[187152]: 2025-11-29 07:50:13.166 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:13 np0005539504 nova_compute[187152]: 2025-11-29 07:50:13.336 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:13 np0005539504 podman[249702]: 2025-11-29 07:50:13.712861946 +0000 UTC m=+0.059537888 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:50:14 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:14Z|00716|binding|INFO|Releasing lport d0d4f8b4-3c4f-461b-a755-141d05ad5573 from this chassis (sb_readonly=0)
Nov 29 02:50:14 np0005539504 nova_compute[187152]: 2025-11-29 07:50:14.879 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:15 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:15Z|00717|binding|INFO|Releasing lport d0d4f8b4-3c4f-461b-a755-141d05ad5573 from this chassis (sb_readonly=0)
Nov 29 02:50:15 np0005539504 nova_compute[187152]: 2025-11-29 07:50:15.090 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:17 np0005539504 nova_compute[187152]: 2025-11-29 07:50:17.229 187156 INFO nova.compute.manager [None req-44254892-8e24-4bf1-8b1d-d129bc531ee2 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Get console output#033[00m
Nov 29 02:50:17 np0005539504 nova_compute[187152]: 2025-11-29 07:50:17.235 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:50:18 np0005539504 nova_compute[187152]: 2025-11-29 07:50:18.200 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:18 np0005539504 NetworkManager[55210]: <info>  [1764402618.2011] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Nov 29 02:50:18 np0005539504 NetworkManager[55210]: <info>  [1764402618.2016] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Nov 29 02:50:18 np0005539504 nova_compute[187152]: 2025-11-29 07:50:18.338 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:18 np0005539504 nova_compute[187152]: 2025-11-29 07:50:18.361 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:18 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:18Z|00718|binding|INFO|Releasing lport d0d4f8b4-3c4f-461b-a755-141d05ad5573 from this chassis (sb_readonly=0)
Nov 29 02:50:18 np0005539504 nova_compute[187152]: 2025-11-29 07:50:18.386 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:18 np0005539504 nova_compute[187152]: 2025-11-29 07:50:18.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:18 np0005539504 nova_compute[187152]: 2025-11-29 07:50:18.947 187156 INFO nova.compute.manager [None req-0a56505f-b177-426c-b896-a40f8f5507ef 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Get console output#033[00m
Nov 29 02:50:18 np0005539504 nova_compute[187152]: 2025-11-29 07:50:18.952 213702 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:50:19 np0005539504 podman[249726]: 2025-11-29 07:50:19.780329137 +0000 UTC m=+0.110872815 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd)
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.013 187156 DEBUG nova.compute.manager [req-ff015c80-e7e8-46b0-a7ad-f5752d18629e req-b491a3f4-677d-4639-9306-60bde9054bf8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received event network-changed-16299523-f464-4264-a7c7-83d865feeef5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.013 187156 DEBUG nova.compute.manager [req-ff015c80-e7e8-46b0-a7ad-f5752d18629e req-b491a3f4-677d-4639-9306-60bde9054bf8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Refreshing instance network info cache due to event network-changed-16299523-f464-4264-a7c7-83d865feeef5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.014 187156 DEBUG oslo_concurrency.lockutils [req-ff015c80-e7e8-46b0-a7ad-f5752d18629e req-b491a3f4-677d-4639-9306-60bde9054bf8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.014 187156 DEBUG oslo_concurrency.lockutils [req-ff015c80-e7e8-46b0-a7ad-f5752d18629e req-b491a3f4-677d-4639-9306-60bde9054bf8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.014 187156 DEBUG nova.network.neutron [req-ff015c80-e7e8-46b0-a7ad-f5752d18629e req-b491a3f4-677d-4639-9306-60bde9054bf8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Refreshing network info cache for port 16299523-f464-4264-a7c7-83d865feeef5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.117 187156 DEBUG oslo_concurrency.lockutils [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "72880d73-5d71-455b-9281-b171fb1d024f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.117 187156 DEBUG oslo_concurrency.lockutils [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.118 187156 DEBUG oslo_concurrency.lockutils [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "72880d73-5d71-455b-9281-b171fb1d024f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.118 187156 DEBUG oslo_concurrency.lockutils [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.118 187156 DEBUG oslo_concurrency.lockutils [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.132 187156 INFO nova.compute.manager [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Terminating instance#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.143 187156 DEBUG nova.compute.manager [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:50:20 np0005539504 kernel: tap16299523-f4 (unregistering): left promiscuous mode
Nov 29 02:50:20 np0005539504 NetworkManager[55210]: <info>  [1764402620.1681] device (tap16299523-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.173 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:20Z|00719|binding|INFO|Releasing lport 16299523-f464-4264-a7c7-83d865feeef5 from this chassis (sb_readonly=0)
Nov 29 02:50:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:20Z|00720|binding|INFO|Setting lport 16299523-f464-4264-a7c7-83d865feeef5 down in Southbound
Nov 29 02:50:20 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:20Z|00721|binding|INFO|Removing iface tap16299523-f4 ovn-installed in OVS
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.174 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.181 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:5c:93 10.100.0.6'], port_security=['fa:16:3e:24:5c:93 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '72880d73-5d71-455b-9281-b171fb1d024f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ecf9dfa5-3869-4794-b3ab-c2616cd59392', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '07fde1dd-2f30-472c-9b6e-371dfcee3d55', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=603e5ea6-e1b6-46cc-88d5-ab46cc34ab55, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=16299523-f464-4264-a7c7-83d865feeef5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.183 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 16299523-f464-4264-a7c7-83d865feeef5 in datapath ecf9dfa5-3869-4794-b3ab-c2616cd59392 unbound from our chassis#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.185 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ecf9dfa5-3869-4794-b3ab-c2616cd59392, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.187 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[53bda687-3ed1-4515-b452-e8af29807452]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.188 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392 namespace which is not needed anymore#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.191 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539504 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Nov 29 02:50:20 np0005539504 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000ae.scope: Consumed 14.775s CPU time.
Nov 29 02:50:20 np0005539504 systemd-machined[153423]: Machine qemu-89-instance-000000ae terminated.
Nov 29 02:50:20 np0005539504 neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392[249560]: [NOTICE]   (249564) : haproxy version is 2.8.14-c23fe91
Nov 29 02:50:20 np0005539504 neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392[249560]: [NOTICE]   (249564) : path to executable is /usr/sbin/haproxy
Nov 29 02:50:20 np0005539504 neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392[249560]: [WARNING]  (249564) : Exiting Master process...
Nov 29 02:50:20 np0005539504 neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392[249560]: [ALERT]    (249564) : Current worker (249566) exited with code 143 (Terminated)
Nov 29 02:50:20 np0005539504 neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392[249560]: [WARNING]  (249564) : All workers exited. Exiting... (0)
Nov 29 02:50:20 np0005539504 systemd[1]: libpod-80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4.scope: Deactivated successfully.
Nov 29 02:50:20 np0005539504 podman[249768]: 2025-11-29 07:50:20.342606807 +0000 UTC m=+0.043947247 container died 80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.366 187156 DEBUG nova.compute.manager [req-8427cf63-7d59-48bd-91be-5c0e72f1b833 req-d95bdaae-c572-4abb-b813-6977c092a064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received event network-vif-unplugged-16299523-f464-4264-a7c7-83d865feeef5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.367 187156 DEBUG oslo_concurrency.lockutils [req-8427cf63-7d59-48bd-91be-5c0e72f1b833 req-d95bdaae-c572-4abb-b813-6977c092a064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "72880d73-5d71-455b-9281-b171fb1d024f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.367 187156 DEBUG oslo_concurrency.lockutils [req-8427cf63-7d59-48bd-91be-5c0e72f1b833 req-d95bdaae-c572-4abb-b813-6977c092a064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.367 187156 DEBUG oslo_concurrency.lockutils [req-8427cf63-7d59-48bd-91be-5c0e72f1b833 req-d95bdaae-c572-4abb-b813-6977c092a064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.368 187156 DEBUG nova.compute.manager [req-8427cf63-7d59-48bd-91be-5c0e72f1b833 req-d95bdaae-c572-4abb-b813-6977c092a064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] No waiting events found dispatching network-vif-unplugged-16299523-f464-4264-a7c7-83d865feeef5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:50:20 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4-userdata-shm.mount: Deactivated successfully.
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.368 187156 DEBUG nova.compute.manager [req-8427cf63-7d59-48bd-91be-5c0e72f1b833 req-d95bdaae-c572-4abb-b813-6977c092a064 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received event network-vif-unplugged-16299523-f464-4264-a7c7-83d865feeef5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.369 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539504 systemd[1]: var-lib-containers-storage-overlay-b593f0a33a5e858c09f11822dd24b1fc728fcda038d4e157277ea6f25c6231a9-merged.mount: Deactivated successfully.
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.378 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539504 podman[249768]: 2025-11-29 07:50:20.384648992 +0000 UTC m=+0.085989422 container cleanup 80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 02:50:20 np0005539504 systemd[1]: libpod-conmon-80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4.scope: Deactivated successfully.
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.412 187156 INFO nova.virt.libvirt.driver [-] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Instance destroyed successfully.#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.413 187156 DEBUG nova.objects.instance [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 72880d73-5d71-455b-9281-b171fb1d024f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.428 187156 DEBUG nova.virt.libvirt.vif [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:49:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2135744245',display_name='tempest-TestNetworkBasicOps-server-2135744245',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2135744245',id=174,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKG7zaIO0Y0Pjj6AHTqeR3duYBB0twuZjMKL8D62oeLhfNkWHHgjzaa3DkEv+i5/8a4USUSDMH9L33olE2Lsp2Y3MxwBDCp6Ba7Jg+FTu7hQTGlHhW25NWEhBtHoxEzHPQ==',key_name='tempest-TestNetworkBasicOps-351323551',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:49:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-3s1iw0gn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:49:53Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=72880d73-5d71-455b-9281-b171fb1d024f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.429 187156 DEBUG nova.network.os_vif_util [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.430 187156 DEBUG nova.network.os_vif_util [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:5c:93,bridge_name='br-int',has_traffic_filtering=True,id=16299523-f464-4264-a7c7-83d865feeef5,network=Network(ecf9dfa5-3869-4794-b3ab-c2616cd59392),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16299523-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.430 187156 DEBUG os_vif [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:5c:93,bridge_name='br-int',has_traffic_filtering=True,id=16299523-f464-4264-a7c7-83d865feeef5,network=Network(ecf9dfa5-3869-4794-b3ab-c2616cd59392),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16299523-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.433 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.433 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16299523-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.436 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.441 187156 INFO os_vif [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:5c:93,bridge_name='br-int',has_traffic_filtering=True,id=16299523-f464-4264-a7c7-83d865feeef5,network=Network(ecf9dfa5-3869-4794-b3ab-c2616cd59392),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16299523-f4')#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.441 187156 INFO nova.virt.libvirt.driver [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Deleting instance files /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f_del#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.442 187156 INFO nova.virt.libvirt.driver [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Deletion of /var/lib/nova/instances/72880d73-5d71-455b-9281-b171fb1d024f_del complete#033[00m
Nov 29 02:50:20 np0005539504 podman[249806]: 2025-11-29 07:50:20.449342069 +0000 UTC m=+0.042397766 container remove 80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.454 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[fb31be16-f428-4bc4-b0fd-9d7fbaef5a87]: (4, ('Sat Nov 29 07:50:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392 (80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4)\n80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4\nSat Nov 29 07:50:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392 (80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4)\n80e673a5bff86672a86fe57f1b95c3f0377baa1f797bd8342aae2117f54757b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.455 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ce167522-84ee-4332-82aa-df4ef6d6a55c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.456 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapecf9dfa5-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.457 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539504 kernel: tapecf9dfa5-30: left promiscuous mode
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.468 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.472 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[1c63f47c-5eca-44d5-9b4b-85363f991464]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.488 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[81889076-217c-49f0-9e1e-cbc617123443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.490 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e378e7fd-fd87-4a5b-891b-8d4bfdd2bc11]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.506 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[e68a2438-223a-44e0-9df0-b9222629dfee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 805399, 'reachable_time': 32714, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249826, 'error': None, 'target': 'ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.510 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ecf9dfa5-3869-4794-b3ab-c2616cd59392 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:50:20 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:20.510 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[390faca3-05e9-4c21-9165-a7e296ec29f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:20 np0005539504 systemd[1]: run-netns-ovnmeta\x2decf9dfa5\x2d3869\x2d4794\x2db3ab\x2dc2616cd59392.mount: Deactivated successfully.
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.523 187156 INFO nova.compute.manager [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.523 187156 DEBUG oslo.service.loopingcall [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.524 187156 DEBUG nova.compute.manager [-] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:50:20 np0005539504 nova_compute[187152]: 2025-11-29 07:50:20.524 187156 DEBUG nova.network.neutron [-] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.043 187156 DEBUG nova.network.neutron [req-ff015c80-e7e8-46b0-a7ad-f5752d18629e req-b491a3f4-677d-4639-9306-60bde9054bf8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Updated VIF entry in instance network info cache for port 16299523-f464-4264-a7c7-83d865feeef5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.045 187156 DEBUG nova.network.neutron [req-ff015c80-e7e8-46b0-a7ad-f5752d18629e req-b491a3f4-677d-4639-9306-60bde9054bf8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Updating instance_info_cache with network_info: [{"id": "16299523-f464-4264-a7c7-83d865feeef5", "address": "fa:16:3e:24:5c:93", "network": {"id": "ecf9dfa5-3869-4794-b3ab-c2616cd59392", "bridge": "br-int", "label": "tempest-network-smoke--556648500", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16299523-f4", "ovs_interfaceid": "16299523-f464-4264-a7c7-83d865feeef5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.093 187156 DEBUG oslo_concurrency.lockutils [req-ff015c80-e7e8-46b0-a7ad-f5752d18629e req-b491a3f4-677d-4639-9306-60bde9054bf8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-72880d73-5d71-455b-9281-b171fb1d024f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.447 187156 DEBUG nova.network.neutron [-] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.463 187156 DEBUG nova.compute.manager [req-ba2fdc2f-79b8-475f-9884-159fea004c6f req-23b406c0-28dd-43d1-89c7-32d52261fd3d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received event network-vif-plugged-16299523-f464-4264-a7c7-83d865feeef5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.463 187156 DEBUG oslo_concurrency.lockutils [req-ba2fdc2f-79b8-475f-9884-159fea004c6f req-23b406c0-28dd-43d1-89c7-32d52261fd3d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "72880d73-5d71-455b-9281-b171fb1d024f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.464 187156 DEBUG oslo_concurrency.lockutils [req-ba2fdc2f-79b8-475f-9884-159fea004c6f req-23b406c0-28dd-43d1-89c7-32d52261fd3d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.464 187156 DEBUG oslo_concurrency.lockutils [req-ba2fdc2f-79b8-475f-9884-159fea004c6f req-23b406c0-28dd-43d1-89c7-32d52261fd3d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.464 187156 DEBUG nova.compute.manager [req-ba2fdc2f-79b8-475f-9884-159fea004c6f req-23b406c0-28dd-43d1-89c7-32d52261fd3d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] No waiting events found dispatching network-vif-plugged-16299523-f464-4264-a7c7-83d865feeef5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.464 187156 WARNING nova.compute.manager [req-ba2fdc2f-79b8-475f-9884-159fea004c6f req-23b406c0-28dd-43d1-89c7-32d52261fd3d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received unexpected event network-vif-plugged-16299523-f464-4264-a7c7-83d865feeef5 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.465 187156 INFO nova.compute.manager [-] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Took 1.94 seconds to deallocate network for instance.#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.545 187156 DEBUG oslo_concurrency.lockutils [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.546 187156 DEBUG oslo_concurrency.lockutils [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.601 187156 DEBUG nova.compute.provider_tree [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.615 187156 DEBUG nova.scheduler.client.report [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.638 187156 DEBUG oslo_concurrency.lockutils [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.672 187156 INFO nova.scheduler.client.report [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 72880d73-5d71-455b-9281-b171fb1d024f#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.747 187156 DEBUG oslo_concurrency.lockutils [None req-a7df2a87-2fd1-4374-9f07-5bfe5ca7047e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "72880d73-5d71-455b-9281-b171fb1d024f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:22 np0005539504 nova_compute[187152]: 2025-11-29 07:50:22.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:23 np0005539504 nova_compute[187152]: 2025-11-29 07:50:23.365 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:23.487 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:23.487 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:23.487 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:24 np0005539504 nova_compute[187152]: 2025-11-29 07:50:24.443 187156 DEBUG nova.compute.manager [req-27edb84f-d961-4305-abdd-314410e60132 req-88a01974-d059-48e2-8854-4f9d31654007 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Received event network-vif-deleted-16299523-f464-4264-a7c7-83d865feeef5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:25 np0005539504 nova_compute[187152]: 2025-11-29 07:50:25.437 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:27 np0005539504 nova_compute[187152]: 2025-11-29 07:50:27.980 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:28 np0005539504 nova_compute[187152]: 2025-11-29 07:50:28.230 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:28 np0005539504 nova_compute[187152]: 2025-11-29 07:50:28.369 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:29.029 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:50:29 np0005539504 nova_compute[187152]: 2025-11-29 07:50:29.029 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:29 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:29.030 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:50:30 np0005539504 nova_compute[187152]: 2025-11-29 07:50:30.472 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:31 np0005539504 podman[249834]: 2025-11-29 07:50:31.731416363 +0000 UTC m=+0.066729432 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 29 02:50:31 np0005539504 podman[249833]: 2025-11-29 07:50:31.731377302 +0000 UTC m=+0.066783843 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:50:31 np0005539504 podman[249835]: 2025-11-29 07:50:31.736406588 +0000 UTC m=+0.064484312 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 02:50:32 np0005539504 nova_compute[187152]: 2025-11-29 07:50:32.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:32 np0005539504 nova_compute[187152]: 2025-11-29 07:50:32.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:32 np0005539504 nova_compute[187152]: 2025-11-29 07:50:32.936 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:50:33 np0005539504 nova_compute[187152]: 2025-11-29 07:50:33.371 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:35 np0005539504 nova_compute[187152]: 2025-11-29 07:50:35.410 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402620.4098961, 72880d73-5d71-455b-9281-b171fb1d024f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:50:35 np0005539504 nova_compute[187152]: 2025-11-29 07:50:35.411 187156 INFO nova.compute.manager [-] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:50:35 np0005539504 nova_compute[187152]: 2025-11-29 07:50:35.429 187156 DEBUG nova.compute.manager [None req-52c93723-dfcb-443e-8f55-d059acc1cc13 - - - - - -] [instance: 72880d73-5d71-455b-9281-b171fb1d024f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:50:35 np0005539504 nova_compute[187152]: 2025-11-29 07:50:35.475 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:36 np0005539504 podman[249893]: 2025-11-29 07:50:36.704706192 +0000 UTC m=+0.047738400 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:50:36 np0005539504 podman[249894]: 2025-11-29 07:50:36.744346133 +0000 UTC m=+0.082439567 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 02:50:36 np0005539504 nova_compute[187152]: 2025-11-29 07:50:36.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:36 np0005539504 nova_compute[187152]: 2025-11-29 07:50:36.960 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:36 np0005539504 nova_compute[187152]: 2025-11-29 07:50:36.960 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:36 np0005539504 nova_compute[187152]: 2025-11-29 07:50:36.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:36 np0005539504 nova_compute[187152]: 2025-11-29 07:50:36.961 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.115 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.116 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5708MB free_disk=73.00622177124023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.116 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.117 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.167 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.168 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.447 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.461 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.484 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:50:37 np0005539504 nova_compute[187152]: 2025-11-29 07:50:37.484 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:38 np0005539504 nova_compute[187152]: 2025-11-29 07:50:38.372 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:39 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:39.032 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.485 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.486 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.486 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.500 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.621 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.622 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.622 187156 INFO nova.compute.manager [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Unshelving#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.849 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.850 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.855 187156 DEBUG nova.objects.instance [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'pci_requests' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.871 187156 DEBUG nova.objects.instance [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'numa_topology' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.889 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.890 187156 INFO nova.compute.claims [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 29 02:50:39 np0005539504 nova_compute[187152]: 2025-11-29 07:50:39.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:40 np0005539504 nova_compute[187152]: 2025-11-29 07:50:40.014 187156 DEBUG nova.compute.provider_tree [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:50:40 np0005539504 nova_compute[187152]: 2025-11-29 07:50:40.033 187156 DEBUG nova.scheduler.client.report [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:50:40 np0005539504 nova_compute[187152]: 2025-11-29 07:50:40.060 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:40 np0005539504 nova_compute[187152]: 2025-11-29 07:50:40.479 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:40 np0005539504 nova_compute[187152]: 2025-11-29 07:50:40.815 187156 INFO nova.network.neutron [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating port 9ff644c0-307e-470e-add6-ceb7d6a15833 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:50:41 np0005539504 nova_compute[187152]: 2025-11-29 07:50:41.603 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:50:41 np0005539504 nova_compute[187152]: 2025-11-29 07:50:41.603 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquired lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:50:41 np0005539504 nova_compute[187152]: 2025-11-29 07:50:41.604 187156 DEBUG nova.network.neutron [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:50:41 np0005539504 nova_compute[187152]: 2025-11-29 07:50:41.730 187156 DEBUG nova.compute.manager [req-6d5fe86e-d39a-43f3-984b-b9cc1772c97a req-d2efdb08-910f-49c9-b604-2cc8b7c3eb4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:41 np0005539504 nova_compute[187152]: 2025-11-29 07:50:41.731 187156 DEBUG nova.compute.manager [req-6d5fe86e-d39a-43f3-984b-b9cc1772c97a req-d2efdb08-910f-49c9-b604-2cc8b7c3eb4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing instance network info cache due to event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:50:41 np0005539504 nova_compute[187152]: 2025-11-29 07:50:41.731 187156 DEBUG oslo_concurrency.lockutils [req-6d5fe86e-d39a-43f3-984b-b9cc1772c97a req-d2efdb08-910f-49c9-b604-2cc8b7c3eb4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:50:41 np0005539504 nova_compute[187152]: 2025-11-29 07:50:41.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:43 np0005539504 nova_compute[187152]: 2025-11-29 07:50:43.374 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.037 187156 DEBUG nova.network.neutron [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating instance_info_cache with network_info: [{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.136 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Releasing lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.138 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.138 187156 INFO nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Creating image(s)#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.139 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.140 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.140 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.141 187156 DEBUG nova.objects.instance [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'trusted_certs' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.142 187156 DEBUG oslo_concurrency.lockutils [req-6d5fe86e-d39a-43f3-984b-b9cc1772c97a req-d2efdb08-910f-49c9-b604-2cc8b7c3eb4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.143 187156 DEBUG nova.network.neutron [req-6d5fe86e-d39a-43f3-984b-b9cc1772c97a req-d2efdb08-910f-49c9-b604-2cc8b7c3eb4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.242 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:44 np0005539504 nova_compute[187152]: 2025-11-29 07:50:44.243 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:44 np0005539504 podman[249944]: 2025-11-29 07:50:44.746196658 +0000 UTC m=+0.093049924 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:50:46 np0005539504 nova_compute[187152]: 2025-11-29 07:50:46.394 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:50:47.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539504 nova_compute[187152]: 2025-11-29 07:50:48.376 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:48 np0005539504 nova_compute[187152]: 2025-11-29 07:50:48.490 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:48 np0005539504 nova_compute[187152]: 2025-11-29 07:50:48.549 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758.part --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:48 np0005539504 nova_compute[187152]: 2025-11-29 07:50:48.550 187156 DEBUG nova.virt.images [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] 46c68a90-c681-4942-bd16-74b36275e924 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 02:50:48 np0005539504 nova_compute[187152]: 2025-11-29 07:50:48.551 187156 DEBUG nova.privsep.utils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:50:48 np0005539504 nova_compute[187152]: 2025-11-29 07:50:48.551 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758.part /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.268 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758.part /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758.converted" returned: 0 in 0.717s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.282 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.361 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758.converted --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.363 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 5.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.379 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.434 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.435 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.436 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.448 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.502 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.504 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758,backing_fmt=raw /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.860 187156 DEBUG nova.network.neutron [req-6d5fe86e-d39a-43f3-984b-b9cc1772c97a req-d2efdb08-910f-49c9-b604-2cc8b7c3eb4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updated VIF entry in instance network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.861 187156 DEBUG nova.network.neutron [req-6d5fe86e-d39a-43f3-984b-b9cc1772c97a req-d2efdb08-910f-49c9-b604-2cc8b7c3eb4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating instance_info_cache with network_info: [{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.960 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758,backing_fmt=raw /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk 1073741824" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.960 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.961 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:49 np0005539504 nova_compute[187152]: 2025-11-29 07:50:49.996 187156 DEBUG oslo_concurrency.lockutils [req-6d5fe86e-d39a-43f3-984b-b9cc1772c97a req-d2efdb08-910f-49c9-b604-2cc8b7c3eb4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:50:50 np0005539504 nova_compute[187152]: 2025-11-29 07:50:50.015 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:50 np0005539504 nova_compute[187152]: 2025-11-29 07:50:50.016 187156 DEBUG nova.objects.instance [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'migration_context' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:50 np0005539504 nova_compute[187152]: 2025-11-29 07:50:50.081 187156 INFO nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Rebasing disk image.#033[00m
Nov 29 02:50:50 np0005539504 nova_compute[187152]: 2025-11-29 07:50:50.082 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:50 np0005539504 nova_compute[187152]: 2025-11-29 07:50:50.142 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:50 np0005539504 nova_compute[187152]: 2025-11-29 07:50:50.143 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:50 np0005539504 podman[249998]: 2025-11-29 07:50:50.755781954 +0000 UTC m=+0.081470372 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:50:50 np0005539504 nova_compute[187152]: 2025-11-29 07:50:50.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:51 np0005539504 nova_compute[187152]: 2025-11-29 07:50:51.443 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.307 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk" returned: 0 in 2.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.308 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.308 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Ensure instance console log exists: /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.309 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.309 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.310 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.312 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Start _get_guest_xml network_info=[{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='4fb379ffde31ecb1be65c08f7911adee',container_format='bare',created_at=2025-11-29T07:50:18Z,direct_url=<?>,disk_format='qcow2',id=46c68a90-c681-4942-bd16-74b36275e924,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-2098184144-shelved',owner='10520ccc4be44f138c8dd72b1d5edabe',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-29T07:50:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'boot_index': 0, 'encrypted': False, 'guest_format': None, 'size': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.317 187156 WARNING nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.324 187156 DEBUG nova.virt.libvirt.host [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.326 187156 DEBUG nova.virt.libvirt.host [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.329 187156 DEBUG nova.virt.libvirt.host [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.331 187156 DEBUG nova.virt.libvirt.host [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.333 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.334 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='4fb379ffde31ecb1be65c08f7911adee',container_format='bare',created_at=2025-11-29T07:50:18Z,direct_url=<?>,disk_format='qcow2',id=46c68a90-c681-4942-bd16-74b36275e924,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-2098184144-shelved',owner='10520ccc4be44f138c8dd72b1d5edabe',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-29T07:50:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.334 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.335 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.335 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.335 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.335 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.335 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.336 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.336 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.336 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.336 187156 DEBUG nova.virt.hardware [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.337 187156 DEBUG nova.objects.instance [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'vcpu_model' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.362 187156 DEBUG nova.virt.libvirt.vif [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:49:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2098184144',display_name='tempest-TestShelveInstance-server-2098184144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2098184144',id=175,image_ref='46c68a90-c681-4942-bd16-74b36275e924',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1313079058',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:49:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='10520ccc4be44f138c8dd72b1d5edabe',ramdisk_id='',reservation_id='r-48eolvdi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1546337875',owner_user_name='tempest-TestShelveInstance-1546337875-project-member',shelved_at='2025-11-29T07:50:28.067577',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='46c68a90-c681-4942-bd16-74b36275e924'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:50:39Z,user_data=None,user_id='607d794b09b34b829673198ba073234c',uuid=b97f1300-6668-4955-a425-98d44189860d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.363 187156 DEBUG nova.network.os_vif_util [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converting VIF {"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.364 187156 DEBUG nova.network.os_vif_util [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.365 187156 DEBUG nova.objects.instance [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'pci_devices' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.379 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <uuid>b97f1300-6668-4955-a425-98d44189860d</uuid>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <name>instance-000000af</name>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <memory>131072</memory>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <vcpu>1</vcpu>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <metadata>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <nova:name>tempest-TestShelveInstance-server-2098184144</nova:name>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <nova:creationTime>2025-11-29 07:50:52</nova:creationTime>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <nova:flavor name="m1.nano">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:        <nova:memory>128</nova:memory>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:        <nova:disk>1</nova:disk>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:        <nova:swap>0</nova:swap>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      </nova:flavor>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <nova:owner>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:        <nova:user uuid="607d794b09b34b829673198ba073234c">tempest-TestShelveInstance-1546337875-project-member</nova:user>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:        <nova:project uuid="10520ccc4be44f138c8dd72b1d5edabe">tempest-TestShelveInstance-1546337875</nova:project>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      </nova:owner>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <nova:root type="image" uuid="46c68a90-c681-4942-bd16-74b36275e924"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <nova:ports>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:        <nova:port uuid="9ff644c0-307e-470e-add6-ceb7d6a15833">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:        </nova:port>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      </nova:ports>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    </nova:instance>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  </metadata>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <sysinfo type="smbios">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <system>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <entry name="serial">b97f1300-6668-4955-a425-98d44189860d</entry>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <entry name="uuid">b97f1300-6668-4955-a425-98d44189860d</entry>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    </system>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  </sysinfo>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <os>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <boot dev="hd"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <smbios mode="sysinfo"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  </os>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <features>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <acpi/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <apic/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <vmcoreinfo/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  </features>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <clock offset="utc">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <timer name="hpet" present="no"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  </clock>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <cpu mode="custom" match="exact">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <model>Nehalem</model>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  </cpu>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  <devices>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <disk type="file" device="disk">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <target dev="vda" bus="virtio"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <disk type="file" device="cdrom">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <source file="/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.config"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <target dev="sda" bus="sata"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    </disk>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <interface type="ethernet">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <mac address="fa:16:3e:e5:2e:ef"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <mtu size="1442"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <target dev="tap9ff644c0-30"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    </interface>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <serial type="pty">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <log file="/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/console.log" append="off"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    </serial>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <video>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <model type="virtio"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    </video>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <input type="tablet" bus="usb"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <input type="keyboard" bus="usb"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <rng model="virtio">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    </rng>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <controller type="usb" index="0"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    <memballoon model="virtio">
Nov 29 02:50:52 np0005539504 nova_compute[187152]:      <stats period="10"/>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:    </memballoon>
Nov 29 02:50:52 np0005539504 nova_compute[187152]:  </devices>
Nov 29 02:50:52 np0005539504 nova_compute[187152]: </domain>
Nov 29 02:50:52 np0005539504 nova_compute[187152]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.381 187156 DEBUG nova.compute.manager [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Preparing to wait for external event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.381 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.382 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.382 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.382 187156 DEBUG nova.virt.libvirt.vif [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:49:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2098184144',display_name='tempest-TestShelveInstance-server-2098184144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2098184144',id=175,image_ref='46c68a90-c681-4942-bd16-74b36275e924',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1313079058',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:49:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='10520ccc4be44f138c8dd72b1d5edabe',ramdisk_id='',reservation_id='r-48eolvdi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1546337875',owner_user_name='tempest-TestShelveInstance-1546337875-project-member',shelved_at='2025-11-29T07:50:28.067577',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='46c68a90-c681-4942-bd16-74b36275e924'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:50:39Z,user_data=None,user_id='607d794b09b34b829673198ba073234c',uuid=b97f1300-6668-4955-a425-98d44189860d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.383 187156 DEBUG nova.network.os_vif_util [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converting VIF {"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.383 187156 DEBUG nova.network.os_vif_util [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.384 187156 DEBUG os_vif [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.384 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.385 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.385 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.389 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.390 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ff644c0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.390 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9ff644c0-30, col_values=(('external_ids', {'iface-id': '9ff644c0-307e-470e-add6-ceb7d6a15833', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:2e:ef', 'vm-uuid': 'b97f1300-6668-4955-a425-98d44189860d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.430 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:52 np0005539504 NetworkManager[55210]: <info>  [1764402652.4321] manager: (tap9ff644c0-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.433 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.437 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.438 187156 INFO os_vif [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30')#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.495 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.495 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.496 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] No VIF found with MAC fa:16:3e:e5:2e:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.496 187156 INFO nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Using config drive#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.527 187156 DEBUG nova.objects.instance [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'ec2_ids' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.564 187156 DEBUG nova.objects.instance [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'keypairs' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.977 187156 INFO nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Creating config drive at /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.config#033[00m
Nov 29 02:50:52 np0005539504 nova_compute[187152]: 2025-11-29 07:50:52.983 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa6ec7a3y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.113 187156 DEBUG oslo_concurrency.processutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa6ec7a3y" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:53 np0005539504 kernel: tap9ff644c0-30: entered promiscuous mode
Nov 29 02:50:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:53Z|00722|binding|INFO|Claiming lport 9ff644c0-307e-470e-add6-ceb7d6a15833 for this chassis.
Nov 29 02:50:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:53Z|00723|binding|INFO|9ff644c0-307e-470e-add6-ceb7d6a15833: Claiming fa:16:3e:e5:2e:ef 10.100.0.4
Nov 29 02:50:53 np0005539504 NetworkManager[55210]: <info>  [1764402653.2053] manager: (tap9ff644c0-30): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.204 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.215 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 NetworkManager[55210]: <info>  [1764402653.2217] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Nov 29 02:50:53 np0005539504 NetworkManager[55210]: <info>  [1764402653.2227] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.220 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 systemd-udevd[250039]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:50:53 np0005539504 systemd-machined[153423]: New machine qemu-90-instance-000000af.
Nov 29 02:50:53 np0005539504 NetworkManager[55210]: <info>  [1764402653.2479] device (tap9ff644c0-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:50:53 np0005539504 NetworkManager[55210]: <info>  [1764402653.2494] device (tap9ff644c0-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:50:53 np0005539504 systemd[1]: Started Virtual Machine qemu-90-instance-000000af.
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.417 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.432 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.555 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:2e:ef 10.100.0.4'], port_security=['fa:16:3e:e5:2e:ef 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b97f1300-6668-4955-a425-98d44189860d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10520ccc4be44f138c8dd72b1d5edabe', 'neutron:revision_number': '7', 'neutron:security_group_ids': '6d0d2b51-2297-4c99-81c6-529d5b2de4c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ef9cade-d410-4695-ad11-b5fe7020e3e8, chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=9ff644c0-307e-470e-add6-ceb7d6a15833) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.556 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 9ff644c0-307e-470e-add6-ceb7d6a15833 in datapath d7cbfb39-b4f8-4082-be26-e925bf6de50f bound to our chassis#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.558 104164 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7cbfb39-b4f8-4082-be26-e925bf6de50f#033[00m
Nov 29 02:50:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:53Z|00724|binding|INFO|Setting lport 9ff644c0-307e-470e-add6-ceb7d6a15833 ovn-installed in OVS
Nov 29 02:50:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:53Z|00725|binding|INFO|Setting lport 9ff644c0-307e-470e-add6-ceb7d6a15833 up in Southbound
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.573 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.574 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[51aae2bf-8ab2-4c8d-8567-35bef433c0f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.575 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7cbfb39-b1 in ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.577 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.579 214051 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7cbfb39-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.580 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a399f34b-d09b-461a-86bb-e8a4814bd286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.581 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ce6e682a-3c7a-407e-a492-0f7dbad3c4d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.599 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[12bfb4a2-a9bc-48c1-9356-603b020ac010]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.619 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[074ad1e1-808b-41e5-85c4-2798a561dbf2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.663 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[a8374a6f-90a9-4b9a-8d02-b35af8bb8c8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.667 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[f9381f4e-8383-4f99-9b83-1d460a421b6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 NetworkManager[55210]: <info>  [1764402653.6695] manager: (tapd7cbfb39-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.702 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[f286a73b-3a03-4112-b3db-9cbb80aed55a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.705 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e2cccd-f7cd-4deb-8590-93f841a40586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 NetworkManager[55210]: <info>  [1764402653.7330] device (tapd7cbfb39-b0): carrier: link connected
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.741 214065 DEBUG oslo.privsep.daemon [-] privsep: reply[b87417d3-d475-444a-830d-645f034fb02f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.762 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[62d5ba01-253b-4673-aef6-c319f2c3af2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7cbfb39-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:03:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 811661, 'reachable_time': 30114, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250075, 'error': None, 'target': 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.779 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[722fb32f-f3e1-402e-8e46-d9cfe3908632]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedb:395'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 811661, 'tstamp': 811661}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250076, 'error': None, 'target': 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.793 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[ba472960-8668-4c14-a5d6-f5ab77ddff8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7cbfb39-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:03:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 811661, 'reachable_time': 30114, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250077, 'error': None, 'target': 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.822 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a6a927-4b76-46ac-a2c5-c27e2d120492]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.877 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[c799716b-bd2e-4eac-955d-aa0f6e04f097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.879 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7cbfb39-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.879 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.879 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7cbfb39-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:53 np0005539504 NetworkManager[55210]: <info>  [1764402653.8823] manager: (tapd7cbfb39-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Nov 29 02:50:53 np0005539504 kernel: tapd7cbfb39-b0: entered promiscuous mode
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.882 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.885 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7cbfb39-b0, col_values=(('external_ids', {'iface-id': '963b66d7-328f-4f14-a5c3-5bc0702a4524'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.886 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:50:53Z|00726|binding|INFO|Releasing lport 963b66d7-328f-4f14-a5c3-5bc0702a4524 from this chassis (sb_readonly=0)
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.887 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.887 104164 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7cbfb39-b4f8-4082-be26-e925bf6de50f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7cbfb39-b4f8-4082-be26-e925bf6de50f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.897 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[0318edc8-f220-469d-8ce9-262648b189ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:53 np0005539504 nova_compute[187152]: 2025-11-29 07:50:53.899 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.898 104164 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: global
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    log         /dev/log local0 debug
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    log-tag     haproxy-metadata-proxy-d7cbfb39-b4f8-4082-be26-e925bf6de50f
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    user        root
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    group       root
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    maxconn     1024
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    pidfile     /var/lib/neutron/external/pids/d7cbfb39-b4f8-4082-be26-e925bf6de50f.pid.haproxy
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    daemon
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: defaults
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    log global
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    mode http
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    option httplog
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    option dontlognull
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    option http-server-close
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    option forwardfor
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    retries                 3
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    timeout http-request    30s
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    timeout connect         30s
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    timeout client          32s
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    timeout server          32s
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    timeout http-keep-alive 30s
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: listen listener
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    bind 169.254.169.254:80
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]:    http-request add-header X-OVN-Network-ID d7cbfb39-b4f8-4082-be26-e925bf6de50f
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:50:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:50:53.900 104164 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'env', 'PROCESS_TAG=haproxy-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7cbfb39-b4f8-4082-be26-e925bf6de50f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.045 187156 DEBUG nova.compute.manager [req-f7c914d5-08a4-4f0e-8da0-b0c9ae94abd8 req-bee61f5b-656d-4414-9506-9f53f8ee47cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.046 187156 DEBUG oslo_concurrency.lockutils [req-f7c914d5-08a4-4f0e-8da0-b0c9ae94abd8 req-bee61f5b-656d-4414-9506-9f53f8ee47cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.047 187156 DEBUG oslo_concurrency.lockutils [req-f7c914d5-08a4-4f0e-8da0-b0c9ae94abd8 req-bee61f5b-656d-4414-9506-9f53f8ee47cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.047 187156 DEBUG oslo_concurrency.lockutils [req-f7c914d5-08a4-4f0e-8da0-b0c9ae94abd8 req-bee61f5b-656d-4414-9506-9f53f8ee47cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.052 187156 DEBUG nova.compute.manager [req-f7c914d5-08a4-4f0e-8da0-b0c9ae94abd8 req-bee61f5b-656d-4414-9506-9f53f8ee47cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Processing event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.091 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402654.0910926, b97f1300-6668-4955-a425-98d44189860d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.094 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] VM Started (Lifecycle Event)#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.099 187156 DEBUG nova.compute.manager [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.103 187156 DEBUG nova.virt.libvirt.driver [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.107 187156 INFO nova.virt.libvirt.driver [-] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance spawned successfully.#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.245 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.248 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.378 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.380 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402654.0920854, b97f1300-6668-4955-a425-98d44189860d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.380 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.435 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.439 187156 DEBUG nova.virt.driver [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] Emitting event <LifecycleEvent: 1764402654.1025202, b97f1300-6668-4955-a425-98d44189860d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.439 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:50:54 np0005539504 podman[250116]: 2025-11-29 07:50:54.399282871 +0000 UTC m=+0.034653267 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.584 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.588 187156 DEBUG nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:50:54 np0005539504 podman[250116]: 2025-11-29 07:50:54.784597894 +0000 UTC m=+0.419968270 container create b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.805 187156 INFO nova.compute.manager [None req-463a19d6-5116-4d0e-8095-204f3d2b7744 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:50:54 np0005539504 systemd[1]: Started libpod-conmon-b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac.scope.
Nov 29 02:50:54 np0005539504 systemd[1]: Started libcrun container.
Nov 29 02:50:54 np0005539504 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/821ce725d9c312fbfe35f8a081e90182ca84797a5f12def76f56cd0f351d9e21/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:50:54 np0005539504 nova_compute[187152]: 2025-11-29 07:50:54.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:54 np0005539504 podman[250116]: 2025-11-29 07:50:54.940111122 +0000 UTC m=+0.575481548 container init b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:50:54 np0005539504 podman[250116]: 2025-11-29 07:50:54.947902983 +0000 UTC m=+0.583273379 container start b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:50:54 np0005539504 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[250132]: [NOTICE]   (250136) : New worker (250138) forked
Nov 29 02:50:54 np0005539504 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[250132]: [NOTICE]   (250136) : Loading success.
Nov 29 02:50:55 np0005539504 nova_compute[187152]: 2025-11-29 07:50:55.081 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:55 np0005539504 nova_compute[187152]: 2025-11-29 07:50:55.584 187156 DEBUG nova.compute.manager [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:50:56 np0005539504 nova_compute[187152]: 2025-11-29 07:50:56.312 187156 DEBUG nova.compute.manager [req-64cb3d26-47f4-4b3c-9509-88bd348ecad3 req-738fbd92-aa3d-45aa-9b8e-d9bc79edb7b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:56 np0005539504 nova_compute[187152]: 2025-11-29 07:50:56.314 187156 DEBUG oslo_concurrency.lockutils [req-64cb3d26-47f4-4b3c-9509-88bd348ecad3 req-738fbd92-aa3d-45aa-9b8e-d9bc79edb7b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:56 np0005539504 nova_compute[187152]: 2025-11-29 07:50:56.314 187156 DEBUG oslo_concurrency.lockutils [req-64cb3d26-47f4-4b3c-9509-88bd348ecad3 req-738fbd92-aa3d-45aa-9b8e-d9bc79edb7b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:56 np0005539504 nova_compute[187152]: 2025-11-29 07:50:56.315 187156 DEBUG oslo_concurrency.lockutils [req-64cb3d26-47f4-4b3c-9509-88bd348ecad3 req-738fbd92-aa3d-45aa-9b8e-d9bc79edb7b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:56 np0005539504 nova_compute[187152]: 2025-11-29 07:50:56.315 187156 DEBUG nova.compute.manager [req-64cb3d26-47f4-4b3c-9509-88bd348ecad3 req-738fbd92-aa3d-45aa-9b8e-d9bc79edb7b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] No waiting events found dispatching network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:50:56 np0005539504 nova_compute[187152]: 2025-11-29 07:50:56.315 187156 WARNING nova.compute.manager [req-64cb3d26-47f4-4b3c-9509-88bd348ecad3 req-738fbd92-aa3d-45aa-9b8e-d9bc79edb7b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received unexpected event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:50:56 np0005539504 nova_compute[187152]: 2025-11-29 07:50:56.754 187156 DEBUG oslo_concurrency.lockutils [None req-343dd300-e59b-43bf-8447-e568ebef4547 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 17.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:57 np0005539504 nova_compute[187152]: 2025-11-29 07:50:57.432 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:58 np0005539504 nova_compute[187152]: 2025-11-29 07:50:58.419 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:02 np0005539504 nova_compute[187152]: 2025-11-29 07:51:02.465 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:02 np0005539504 podman[250147]: 2025-11-29 07:51:02.71427634 +0000 UTC m=+0.057281418 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:51:02 np0005539504 podman[250149]: 2025-11-29 07:51:02.71430199 +0000 UTC m=+0.049469247 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:51:02 np0005539504 podman[250148]: 2025-11-29 07:51:02.745102942 +0000 UTC m=+0.085681445 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 29 02:51:03 np0005539504 nova_compute[187152]: 2025-11-29 07:51:03.424 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:07 np0005539504 nova_compute[187152]: 2025-11-29 07:51:07.470 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:07 np0005539504 podman[250216]: 2025-11-29 07:51:07.710732943 +0000 UTC m=+0.053005502 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:51:07 np0005539504 ovn_controller[95182]: 2025-11-29T07:51:07Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:2e:ef 10.100.0.4
Nov 29 02:51:07 np0005539504 podman[250217]: 2025-11-29 07:51:07.761278728 +0000 UTC m=+0.099182609 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:51:08 np0005539504 nova_compute[187152]: 2025-11-29 07:51:08.425 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:12 np0005539504 nova_compute[187152]: 2025-11-29 07:51:12.473 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:13 np0005539504 nova_compute[187152]: 2025-11-29 07:51:13.426 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:15 np0005539504 podman[250261]: 2025-11-29 07:51:15.747318759 +0000 UTC m=+0.090306129 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 02:51:17 np0005539504 nova_compute[187152]: 2025-11-29 07:51:17.475 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.300 187156 DEBUG nova.compute.manager [req-da7bfdac-7610-475d-aabd-5329faaa84c8 req-5d609321-6835-4727-8400-ab43822e3399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.301 187156 DEBUG nova.compute.manager [req-da7bfdac-7610-475d-aabd-5329faaa84c8 req-5d609321-6835-4727-8400-ab43822e3399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing instance network info cache due to event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.301 187156 DEBUG oslo_concurrency.lockutils [req-da7bfdac-7610-475d-aabd-5329faaa84c8 req-5d609321-6835-4727-8400-ab43822e3399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.301 187156 DEBUG oslo_concurrency.lockutils [req-da7bfdac-7610-475d-aabd-5329faaa84c8 req-5d609321-6835-4727-8400-ab43822e3399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.302 187156 DEBUG nova.network.neutron [req-da7bfdac-7610-475d-aabd-5329faaa84c8 req-5d609321-6835-4727-8400-ab43822e3399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.391 187156 DEBUG oslo_concurrency.lockutils [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.392 187156 DEBUG oslo_concurrency.lockutils [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.393 187156 DEBUG oslo_concurrency.lockutils [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.393 187156 DEBUG oslo_concurrency.lockutils [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.393 187156 DEBUG oslo_concurrency.lockutils [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.404 187156 INFO nova.compute.manager [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Terminating instance#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.414 187156 DEBUG nova.compute.manager [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.429 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 kernel: tap9ff644c0-30 (unregistering): left promiscuous mode
Nov 29 02:51:18 np0005539504 NetworkManager[55210]: <info>  [1764402678.4410] device (tap9ff644c0-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.447 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 ovn_controller[95182]: 2025-11-29T07:51:18Z|00727|binding|INFO|Releasing lport 9ff644c0-307e-470e-add6-ceb7d6a15833 from this chassis (sb_readonly=0)
Nov 29 02:51:18 np0005539504 ovn_controller[95182]: 2025-11-29T07:51:18Z|00728|binding|INFO|Setting lport 9ff644c0-307e-470e-add6-ceb7d6a15833 down in Southbound
Nov 29 02:51:18 np0005539504 ovn_controller[95182]: 2025-11-29T07:51:18Z|00729|binding|INFO|Removing iface tap9ff644c0-30 ovn-installed in OVS
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.450 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.463 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:2e:ef 10.100.0.4'], port_security=['fa:16:3e:e5:2e:ef 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b97f1300-6668-4955-a425-98d44189860d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10520ccc4be44f138c8dd72b1d5edabe', 'neutron:revision_number': '9', 'neutron:security_group_ids': '6d0d2b51-2297-4c99-81c6-529d5b2de4c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ef9cade-d410-4695-ad11-b5fe7020e3e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>], logical_port=9ff644c0-307e-470e-add6-ceb7d6a15833) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2a86368d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.465 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.465 104164 INFO neutron.agent.ovn.metadata.agent [-] Port 9ff644c0-307e-470e-add6-ceb7d6a15833 in datapath d7cbfb39-b4f8-4082-be26-e925bf6de50f unbound from our chassis#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.467 104164 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7cbfb39-b4f8-4082-be26-e925bf6de50f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.469 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[7f435eb8-342c-4662-906e-b4a84bcdac8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.469 104164 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f namespace which is not needed anymore#033[00m
Nov 29 02:51:18 np0005539504 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000af.scope: Deactivated successfully.
Nov 29 02:51:18 np0005539504 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000af.scope: Consumed 15.077s CPU time.
Nov 29 02:51:18 np0005539504 systemd-machined[153423]: Machine qemu-90-instance-000000af terminated.
Nov 29 02:51:18 np0005539504 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[250132]: [NOTICE]   (250136) : haproxy version is 2.8.14-c23fe91
Nov 29 02:51:18 np0005539504 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[250132]: [NOTICE]   (250136) : path to executable is /usr/sbin/haproxy
Nov 29 02:51:18 np0005539504 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[250132]: [WARNING]  (250136) : Exiting Master process...
Nov 29 02:51:18 np0005539504 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[250132]: [WARNING]  (250136) : Exiting Master process...
Nov 29 02:51:18 np0005539504 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[250132]: [ALERT]    (250136) : Current worker (250138) exited with code 143 (Terminated)
Nov 29 02:51:18 np0005539504 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[250132]: [WARNING]  (250136) : All workers exited. Exiting... (0)
Nov 29 02:51:18 np0005539504 systemd[1]: libpod-b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac.scope: Deactivated successfully.
Nov 29 02:51:18 np0005539504 podman[250305]: 2025-11-29 07:51:18.636578065 +0000 UTC m=+0.057276716 container died b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:51:18 np0005539504 systemd[1]: var-lib-containers-storage-overlay-821ce725d9c312fbfe35f8a081e90182ca84797a5f12def76f56cd0f351d9e21-merged.mount: Deactivated successfully.
Nov 29 02:51:18 np0005539504 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac-userdata-shm.mount: Deactivated successfully.
Nov 29 02:51:18 np0005539504 podman[250305]: 2025-11-29 07:51:18.684040387 +0000 UTC m=+0.104739048 container cleanup b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.694 187156 INFO nova.virt.libvirt.driver [-] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance destroyed successfully.#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.695 187156 DEBUG nova.objects.instance [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'resources' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:51:18 np0005539504 systemd[1]: libpod-conmon-b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac.scope: Deactivated successfully.
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.716 187156 DEBUG nova.virt.libvirt.vif [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:49:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2098184144',display_name='tempest-TestShelveInstance-server-2098184144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2098184144',id=175,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOibuoQlHAfiaCYj4CIpZ6qJyfUZB3vV71n9+dsfX5nOIOEpheW32pCW+5Jb9gHkOLzyMwIqxCRE794nWAzInQpsnJoEx4IsVlc2/LJryddECLsGRYjobhhHnV47L4pCvg==',key_name='tempest-TestShelveInstance-1313079058',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:50:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='10520ccc4be44f138c8dd72b1d5edabe',ramdisk_id='',reservation_id='r-48eolvdi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1546337875',owner_user_name='tempest-TestShelveInstance-1546337875-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:50:56Z,user_data=None,user_id='607d794b09b34b829673198ba073234c',uuid=b97f1300-6668-4955-a425-98d44189860d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.716 187156 DEBUG nova.network.os_vif_util [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converting VIF {"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.717 187156 DEBUG nova.network.os_vif_util [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.718 187156 DEBUG os_vif [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.721 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.721 187156 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ff644c0-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:51:18 np0005539504 podman[250352]: 2025-11-29 07:51:18.742026303 +0000 UTC m=+0.035817139 container remove b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.769 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.772 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[978fa93b-b054-4975-bd8c-2c75cba78cb5]: (4, ('Sat Nov 29 07:51:18 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f (b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac)\nb0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac\nSat Nov 29 07:51:18 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f (b0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac)\nb0b8077404968ba7a52b51a870a025e61033954f1153d2038ce8e284ef8e13ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.773 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.774 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[4c71b714-693d-44e4-8f17-ce7d750f7615]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.775 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7cbfb39-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.775 187156 INFO os_vif [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30')#033[00m
Nov 29 02:51:18 np0005539504 kernel: tapd7cbfb39-b0: left promiscuous mode
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.776 187156 INFO nova.virt.libvirt.driver [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Deleting instance files /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d_del#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.783 187156 INFO nova.virt.libvirt.driver [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Deletion of /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d_del complete#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.786 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.788 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.789 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.793 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[73ecadbb-ab44-4ce5-aa61-34506f436736]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.812 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[08511f6b-a9af-4164-925b-4f429c455c34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.813 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[5bc937b6-fd49-44ef-8a74-661d06489199]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.831 214051 DEBUG oslo.privsep.daemon [-] privsep: reply[18686ca5-886d-43b2-8637-5bbf6e9d91db]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 811653, 'reachable_time': 31258, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250367, 'error': None, 'target': 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.834 104274 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:51:18 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:18.834 104274 DEBUG oslo.privsep.daemon [-] privsep: reply[f576a510-0ba2-4159-a6de-32b3a31b8788]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:51:18 np0005539504 systemd[1]: run-netns-ovnmeta\x2dd7cbfb39\x2db4f8\x2d4082\x2dbe26\x2de925bf6de50f.mount: Deactivated successfully.
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.855 187156 INFO nova.compute.manager [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.855 187156 DEBUG oslo.service.loopingcall [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.855 187156 DEBUG nova.compute.manager [-] [instance: b97f1300-6668-4955-a425-98d44189860d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.856 187156 DEBUG nova.network.neutron [-] [instance: b97f1300-6668-4955-a425-98d44189860d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.892 187156 DEBUG nova.compute.manager [req-e7a053a5-e415-4267-91b0-14afad2f8816 req-d7dce8bc-c1c9-46f6-ab43-cda06c0c501a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-unplugged-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.892 187156 DEBUG oslo_concurrency.lockutils [req-e7a053a5-e415-4267-91b0-14afad2f8816 req-d7dce8bc-c1c9-46f6-ab43-cda06c0c501a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.892 187156 DEBUG oslo_concurrency.lockutils [req-e7a053a5-e415-4267-91b0-14afad2f8816 req-d7dce8bc-c1c9-46f6-ab43-cda06c0c501a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.893 187156 DEBUG oslo_concurrency.lockutils [req-e7a053a5-e415-4267-91b0-14afad2f8816 req-d7dce8bc-c1c9-46f6-ab43-cda06c0c501a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.893 187156 DEBUG nova.compute.manager [req-e7a053a5-e415-4267-91b0-14afad2f8816 req-d7dce8bc-c1c9-46f6-ab43-cda06c0c501a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] No waiting events found dispatching network-vif-unplugged-9ff644c0-307e-470e-add6-ceb7d6a15833 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.893 187156 DEBUG nova.compute.manager [req-e7a053a5-e415-4267-91b0-14afad2f8816 req-d7dce8bc-c1c9-46f6-ab43-cda06c0c501a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-unplugged-9ff644c0-307e-470e-add6-ceb7d6a15833 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:51:18 np0005539504 nova_compute[187152]: 2025-11-29 07:51:18.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:21 np0005539504 podman[250368]: 2025-11-29 07:51:21.75512489 +0000 UTC m=+0.102023356 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.426 187156 DEBUG nova.compute.manager [req-93cf4ad3-d77f-4bfe-bd3e-786248d9b2de req-2af04802-2c05-4fa0-8bcd-25bd32036915 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.427 187156 DEBUG oslo_concurrency.lockutils [req-93cf4ad3-d77f-4bfe-bd3e-786248d9b2de req-2af04802-2c05-4fa0-8bcd-25bd32036915 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.427 187156 DEBUG oslo_concurrency.lockutils [req-93cf4ad3-d77f-4bfe-bd3e-786248d9b2de req-2af04802-2c05-4fa0-8bcd-25bd32036915 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.427 187156 DEBUG oslo_concurrency.lockutils [req-93cf4ad3-d77f-4bfe-bd3e-786248d9b2de req-2af04802-2c05-4fa0-8bcd-25bd32036915 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.427 187156 DEBUG nova.compute.manager [req-93cf4ad3-d77f-4bfe-bd3e-786248d9b2de req-2af04802-2c05-4fa0-8bcd-25bd32036915 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] No waiting events found dispatching network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.428 187156 WARNING nova.compute.manager [req-93cf4ad3-d77f-4bfe-bd3e-786248d9b2de req-2af04802-2c05-4fa0-8bcd-25bd32036915 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received unexpected event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.464 187156 DEBUG nova.network.neutron [-] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.487 187156 INFO nova.compute.manager [-] [instance: b97f1300-6668-4955-a425-98d44189860d] Took 3.63 seconds to deallocate network for instance.#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.572 187156 DEBUG oslo_concurrency.lockutils [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.573 187156 DEBUG oslo_concurrency.lockutils [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.720 187156 DEBUG nova.compute.provider_tree [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.734 187156 DEBUG nova.scheduler.client.report [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.752 187156 DEBUG oslo_concurrency.lockutils [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.772 187156 DEBUG nova.network.neutron [req-da7bfdac-7610-475d-aabd-5329faaa84c8 req-5d609321-6835-4727-8400-ab43822e3399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updated VIF entry in instance network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.772 187156 DEBUG nova.network.neutron [req-da7bfdac-7610-475d-aabd-5329faaa84c8 req-5d609321-6835-4727-8400-ab43822e3399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating instance_info_cache with network_info: [{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.778 187156 INFO nova.scheduler.client.report [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Deleted allocations for instance b97f1300-6668-4955-a425-98d44189860d#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.793 187156 DEBUG oslo_concurrency.lockutils [req-da7bfdac-7610-475d-aabd-5329faaa84c8 req-5d609321-6835-4727-8400-ab43822e3399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.866 187156 DEBUG oslo_concurrency.lockutils [None req-985e302d-0925-442c-ab37-a96f83d61d86 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:22 np0005539504 nova_compute[187152]: 2025-11-29 07:51:22.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:23 np0005539504 nova_compute[187152]: 2025-11-29 07:51:23.015 187156 DEBUG nova.compute.manager [req-0d3eb4a1-1918-438a-aba1-54b426b08efa req-91da4a2a-e7a8-4661-a442-d37c4b1aa4ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-deleted-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:51:23 np0005539504 nova_compute[187152]: 2025-11-29 07:51:23.015 187156 INFO nova.compute.manager [req-0d3eb4a1-1918-438a-aba1-54b426b08efa req-91da4a2a-e7a8-4661-a442-d37c4b1aa4ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Neutron deleted interface 9ff644c0-307e-470e-add6-ceb7d6a15833; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:51:23 np0005539504 nova_compute[187152]: 2025-11-29 07:51:23.016 187156 DEBUG nova.network.neutron [req-0d3eb4a1-1918-438a-aba1-54b426b08efa req-91da4a2a-e7a8-4661-a442-d37c4b1aa4ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Nov 29 02:51:23 np0005539504 nova_compute[187152]: 2025-11-29 07:51:23.018 187156 DEBUG nova.compute.manager [req-0d3eb4a1-1918-438a-aba1-54b426b08efa req-91da4a2a-e7a8-4661-a442-d37c4b1aa4ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Detach interface failed, port_id=9ff644c0-307e-470e-add6-ceb7d6a15833, reason: Instance b97f1300-6668-4955-a425-98d44189860d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:51:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:23.488 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:23.489 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:23.489 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:23 np0005539504 nova_compute[187152]: 2025-11-29 07:51:23.587 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:23 np0005539504 nova_compute[187152]: 2025-11-29 07:51:23.768 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:28 np0005539504 nova_compute[187152]: 2025-11-29 07:51:28.590 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:28 np0005539504 nova_compute[187152]: 2025-11-29 07:51:28.770 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:28 np0005539504 nova_compute[187152]: 2025-11-29 07:51:28.973 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:29 np0005539504 nova_compute[187152]: 2025-11-29 07:51:29.234 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:32 np0005539504 nova_compute[187152]: 2025-11-29 07:51:32.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:33 np0005539504 nova_compute[187152]: 2025-11-29 07:51:33.604 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:33 np0005539504 nova_compute[187152]: 2025-11-29 07:51:33.691 187156 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402678.6902645, b97f1300-6668-4955-a425-98d44189860d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:51:33 np0005539504 nova_compute[187152]: 2025-11-29 07:51:33.692 187156 INFO nova.compute.manager [-] [instance: b97f1300-6668-4955-a425-98d44189860d] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:51:33 np0005539504 podman[250392]: 2025-11-29 07:51:33.71266363 +0000 UTC m=+0.056225709 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:51:33 np0005539504 nova_compute[187152]: 2025-11-29 07:51:33.716 187156 DEBUG nova.compute.manager [None req-6d988bf0-54ba-4c31-a22f-076746da8730 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:51:33 np0005539504 podman[250394]: 2025-11-29 07:51:33.752462084 +0000 UTC m=+0.077753039 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 29 02:51:33 np0005539504 podman[250393]: 2025-11-29 07:51:33.752730842 +0000 UTC m=+0.080957136 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 29 02:51:33 np0005539504 nova_compute[187152]: 2025-11-29 07:51:33.772 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:33 np0005539504 nova_compute[187152]: 2025-11-29 07:51:33.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:33 np0005539504 nova_compute[187152]: 2025-11-29 07:51:33.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:51:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:34.806 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:51:34 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:34.806 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:51:34 np0005539504 nova_compute[187152]: 2025-11-29 07:51:34.807 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:51:36.808 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:51:36 np0005539504 nova_compute[187152]: 2025-11-29 07:51:36.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:36 np0005539504 nova_compute[187152]: 2025-11-29 07:51:36.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:36 np0005539504 nova_compute[187152]: 2025-11-29 07:51:36.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:36 np0005539504 nova_compute[187152]: 2025-11-29 07:51:36.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:36 np0005539504 nova_compute[187152]: 2025-11-29 07:51:36.963 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.121 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.122 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5728MB free_disk=72.93882751464844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.122 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.122 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.182 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.183 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.197 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.236 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.237 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.254 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.274 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.295 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.307 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.327 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:51:37 np0005539504 nova_compute[187152]: 2025-11-29 07:51:37.328 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:38 np0005539504 nova_compute[187152]: 2025-11-29 07:51:38.606 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:38 np0005539504 podman[250460]: 2025-11-29 07:51:38.712459934 +0000 UTC m=+0.060490023 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:51:38 np0005539504 podman[250461]: 2025-11-29 07:51:38.73746737 +0000 UTC m=+0.081936974 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:51:38 np0005539504 nova_compute[187152]: 2025-11-29 07:51:38.774 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:41 np0005539504 nova_compute[187152]: 2025-11-29 07:51:41.328 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:41 np0005539504 nova_compute[187152]: 2025-11-29 07:51:41.328 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:51:41 np0005539504 nova_compute[187152]: 2025-11-29 07:51:41.328 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:51:41 np0005539504 nova_compute[187152]: 2025-11-29 07:51:41.345 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:51:41 np0005539504 nova_compute[187152]: 2025-11-29 07:51:41.345 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:43 np0005539504 nova_compute[187152]: 2025-11-29 07:51:43.608 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:43 np0005539504 nova_compute[187152]: 2025-11-29 07:51:43.776 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:43 np0005539504 nova_compute[187152]: 2025-11-29 07:51:43.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:46 np0005539504 podman[250506]: 2025-11-29 07:51:46.745008267 +0000 UTC m=+0.067967137 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:51:48 np0005539504 nova_compute[187152]: 2025-11-29 07:51:48.610 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:48 np0005539504 nova_compute[187152]: 2025-11-29 07:51:48.812 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:52 np0005539504 podman[250528]: 2025-11-29 07:51:52.735291232 +0000 UTC m=+0.074957164 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:51:52 np0005539504 nova_compute[187152]: 2025-11-29 07:51:52.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:53 np0005539504 nova_compute[187152]: 2025-11-29 07:51:53.612 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:53 np0005539504 nova_compute[187152]: 2025-11-29 07:51:53.814 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:58 np0005539504 nova_compute[187152]: 2025-11-29 07:51:58.614 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:58 np0005539504 nova_compute[187152]: 2025-11-29 07:51:58.816 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:03 np0005539504 nova_compute[187152]: 2025-11-29 07:52:03.615 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:03 np0005539504 ovn_controller[95182]: 2025-11-29T07:52:03Z|00730|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 02:52:03 np0005539504 nova_compute[187152]: 2025-11-29 07:52:03.866 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:04 np0005539504 podman[250549]: 2025-11-29 07:52:04.705264308 +0000 UTC m=+0.051282315 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:52:04 np0005539504 podman[250550]: 2025-11-29 07:52:04.742665618 +0000 UTC m=+0.073183767 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, managed_by=edpm_ansible, release=1755695350, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 29 02:52:04 np0005539504 podman[250556]: 2025-11-29 07:52:04.78347731 +0000 UTC m=+0.105125769 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 29 02:52:08 np0005539504 nova_compute[187152]: 2025-11-29 07:52:08.617 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:08 np0005539504 nova_compute[187152]: 2025-11-29 07:52:08.869 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:09 np0005539504 podman[250615]: 2025-11-29 07:52:09.718770473 +0000 UTC m=+0.057025881 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:52:09 np0005539504 podman[250616]: 2025-11-29 07:52:09.778487826 +0000 UTC m=+0.116348842 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 02:52:13 np0005539504 nova_compute[187152]: 2025-11-29 07:52:13.619 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:13 np0005539504 nova_compute[187152]: 2025-11-29 07:52:13.870 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:17 np0005539504 podman[250671]: 2025-11-29 07:52:17.762083686 +0000 UTC m=+0.097114794 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 29 02:52:18 np0005539504 nova_compute[187152]: 2025-11-29 07:52:18.621 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:18 np0005539504 nova_compute[187152]: 2025-11-29 07:52:18.871 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:18 np0005539504 nova_compute[187152]: 2025-11-29 07:52:18.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:52:23.489 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:52:23.489 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:52:23.490 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:23 np0005539504 nova_compute[187152]: 2025-11-29 07:52:23.622 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:23 np0005539504 podman[250693]: 2025-11-29 07:52:23.74567704 +0000 UTC m=+0.079606901 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:52:23 np0005539504 nova_compute[187152]: 2025-11-29 07:52:23.874 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:23 np0005539504 nova_compute[187152]: 2025-11-29 07:52:23.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:28 np0005539504 nova_compute[187152]: 2025-11-29 07:52:28.624 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:28 np0005539504 nova_compute[187152]: 2025-11-29 07:52:28.876 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:33 np0005539504 nova_compute[187152]: 2025-11-29 07:52:33.626 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:33 np0005539504 nova_compute[187152]: 2025-11-29 07:52:33.877 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:34 np0005539504 nova_compute[187152]: 2025-11-29 07:52:34.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:34 np0005539504 nova_compute[187152]: 2025-11-29 07:52:34.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:34 np0005539504 nova_compute[187152]: 2025-11-29 07:52:34.936 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:52:35 np0005539504 podman[250718]: 2025-11-29 07:52:35.738251826 +0000 UTC m=+0.068961983 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 29 02:52:35 np0005539504 podman[250719]: 2025-11-29 07:52:35.744770112 +0000 UTC m=+0.068099230 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:52:35 np0005539504 podman[250717]: 2025-11-29 07:52:35.750796704 +0000 UTC m=+0.084443680 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:52:36 np0005539504 nova_compute[187152]: 2025-11-29 07:52:36.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:36 np0005539504 nova_compute[187152]: 2025-11-29 07:52:36.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:36 np0005539504 nova_compute[187152]: 2025-11-29 07:52:36.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:36 np0005539504 nova_compute[187152]: 2025-11-29 07:52:36.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:36 np0005539504 nova_compute[187152]: 2025-11-29 07:52:36.962 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.114 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.115 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5712MB free_disk=72.94081497192383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.115 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.115 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.169 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.169 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.194 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.211 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.212 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:52:37 np0005539504 nova_compute[187152]: 2025-11-29 07:52:37.212 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:38 np0005539504 nova_compute[187152]: 2025-11-29 07:52:38.626 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:38 np0005539504 nova_compute[187152]: 2025-11-29 07:52:38.878 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:40 np0005539504 podman[250778]: 2025-11-29 07:52:40.762537064 +0000 UTC m=+0.103733422 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:52:40 np0005539504 podman[250779]: 2025-11-29 07:52:40.792691647 +0000 UTC m=+0.115964371 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:52:41 np0005539504 nova_compute[187152]: 2025-11-29 07:52:41.213 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:41 np0005539504 nova_compute[187152]: 2025-11-29 07:52:41.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:41 np0005539504 nova_compute[187152]: 2025-11-29 07:52:41.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:52:41 np0005539504 nova_compute[187152]: 2025-11-29 07:52:41.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:52:42 np0005539504 nova_compute[187152]: 2025-11-29 07:52:42.010 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:52:43 np0005539504 nova_compute[187152]: 2025-11-29 07:52:43.628 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:43 np0005539504 nova_compute[187152]: 2025-11-29 07:52:43.880 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:44 np0005539504 nova_compute[187152]: 2025-11-29 07:52:44.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:52:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:52:48 np0005539504 nova_compute[187152]: 2025-11-29 07:52:48.630 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:48 np0005539504 podman[250830]: 2025-11-29 07:52:48.732908846 +0000 UTC m=+0.071822790 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Nov 29 02:52:48 np0005539504 nova_compute[187152]: 2025-11-29 07:52:48.881 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:52:51.770 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:52:51 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:52:51.770 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:52:51 np0005539504 nova_compute[187152]: 2025-11-29 07:52:51.771 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:53 np0005539504 nova_compute[187152]: 2025-11-29 07:52:53.630 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:53 np0005539504 nova_compute[187152]: 2025-11-29 07:52:53.883 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:53 np0005539504 ovn_controller[95182]: 2025-11-29T07:52:53Z|00731|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Nov 29 02:52:53 np0005539504 nova_compute[187152]: 2025-11-29 07:52:53.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:54 np0005539504 podman[250848]: 2025-11-29 07:52:54.751534056 +0000 UTC m=+0.084325667 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:52:54 np0005539504 nova_compute[187152]: 2025-11-29 07:52:54.947 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:52:55.772 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:52:56 np0005539504 nova_compute[187152]: 2025-11-29 07:52:56.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:58 np0005539504 nova_compute[187152]: 2025-11-29 07:52:58.631 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:58 np0005539504 nova_compute[187152]: 2025-11-29 07:52:58.884 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:03 np0005539504 nova_compute[187152]: 2025-11-29 07:53:03.631 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:03 np0005539504 nova_compute[187152]: 2025-11-29 07:53:03.886 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:05 np0005539504 nova_compute[187152]: 2025-11-29 07:53:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:05 np0005539504 nova_compute[187152]: 2025-11-29 07:53:05.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:53:06 np0005539504 podman[250868]: 2025-11-29 07:53:06.711683947 +0000 UTC m=+0.056596449 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:53:06 np0005539504 podman[250870]: 2025-11-29 07:53:06.723419673 +0000 UTC m=+0.058288974 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:53:06 np0005539504 podman[250869]: 2025-11-29 07:53:06.729507358 +0000 UTC m=+0.065837128 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git)
Nov 29 02:53:08 np0005539504 nova_compute[187152]: 2025-11-29 07:53:08.633 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:08 np0005539504 nova_compute[187152]: 2025-11-29 07:53:08.887 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:11 np0005539504 podman[250926]: 2025-11-29 07:53:11.70400437 +0000 UTC m=+0.051809190 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:53:11 np0005539504 podman[250927]: 2025-11-29 07:53:11.747680189 +0000 UTC m=+0.089706453 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:53:13 np0005539504 nova_compute[187152]: 2025-11-29 07:53:13.634 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:13 np0005539504 nova_compute[187152]: 2025-11-29 07:53:13.889 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:18 np0005539504 nova_compute[187152]: 2025-11-29 07:53:18.635 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:18 np0005539504 nova_compute[187152]: 2025-11-29 07:53:18.938 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:19 np0005539504 nova_compute[187152]: 2025-11-29 07:53:19.112 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:19 np0005539504 podman[250975]: 2025-11-29 07:53:19.747568629 +0000 UTC m=+0.087830131 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:53:20 np0005539504 nova_compute[187152]: 2025-11-29 07:53:20.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:20 np0005539504 nova_compute[187152]: 2025-11-29 07:53:20.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:53:21 np0005539504 nova_compute[187152]: 2025-11-29 07:53:21.264 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:53:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:53:23.490 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:53:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:53:23.490 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:53:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:53:23.490 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:53:23 np0005539504 nova_compute[187152]: 2025-11-29 07:53:23.636 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:23 np0005539504 nova_compute[187152]: 2025-11-29 07:53:23.940 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:25 np0005539504 podman[250999]: 2025-11-29 07:53:25.722766138 +0000 UTC m=+0.064345598 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:53:26 np0005539504 nova_compute[187152]: 2025-11-29 07:53:26.263 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:28 np0005539504 nova_compute[187152]: 2025-11-29 07:53:28.637 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:28 np0005539504 nova_compute[187152]: 2025-11-29 07:53:28.942 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:33 np0005539504 nova_compute[187152]: 2025-11-29 07:53:33.639 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:33 np0005539504 nova_compute[187152]: 2025-11-29 07:53:33.944 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:34 np0005539504 nova_compute[187152]: 2025-11-29 07:53:34.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:34 np0005539504 nova_compute[187152]: 2025-11-29 07:53:34.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:53:35 np0005539504 nova_compute[187152]: 2025-11-29 07:53:35.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:37 np0005539504 podman[251026]: 2025-11-29 07:53:37.701401427 +0000 UTC m=+0.046669291 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:53:37 np0005539504 podman[251028]: 2025-11-29 07:53:37.717684877 +0000 UTC m=+0.053193667 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:53:37 np0005539504 podman[251027]: 2025-11-29 07:53:37.730161384 +0000 UTC m=+0.063893156 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git)
Nov 29 02:53:37 np0005539504 nova_compute[187152]: 2025-11-29 07:53:37.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:37 np0005539504 nova_compute[187152]: 2025-11-29 07:53:37.990 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:53:37 np0005539504 nova_compute[187152]: 2025-11-29 07:53:37.991 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:53:37 np0005539504 nova_compute[187152]: 2025-11-29 07:53:37.991 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:53:37 np0005539504 nova_compute[187152]: 2025-11-29 07:53:37.992 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.127 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.128 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5735MB free_disk=72.94081497192383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.129 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.129 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.494 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.494 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.531 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.641 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.648 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.650 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.650 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:53:38 np0005539504 nova_compute[187152]: 2025-11-29 07:53:38.946 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:41 np0005539504 nova_compute[187152]: 2025-11-29 07:53:41.651 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:41 np0005539504 nova_compute[187152]: 2025-11-29 07:53:41.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:41 np0005539504 nova_compute[187152]: 2025-11-29 07:53:41.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:53:41 np0005539504 nova_compute[187152]: 2025-11-29 07:53:41.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:53:41 np0005539504 nova_compute[187152]: 2025-11-29 07:53:41.983 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:53:42 np0005539504 podman[251091]: 2025-11-29 07:53:42.699927618 +0000 UTC m=+0.046441675 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:53:42 np0005539504 podman[251092]: 2025-11-29 07:53:42.737625276 +0000 UTC m=+0.080714140 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 02:53:43 np0005539504 nova_compute[187152]: 2025-11-29 07:53:43.643 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:43 np0005539504 nova_compute[187152]: 2025-11-29 07:53:43.947 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:44 np0005539504 nova_compute[187152]: 2025-11-29 07:53:44.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:48 np0005539504 nova_compute[187152]: 2025-11-29 07:53:48.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:48 np0005539504 nova_compute[187152]: 2025-11-29 07:53:48.949 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:50 np0005539504 podman[251142]: 2025-11-29 07:53:50.725079861 +0000 UTC m=+0.071872852 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm)
Nov 29 02:53:53 np0005539504 nova_compute[187152]: 2025-11-29 07:53:53.646 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:53 np0005539504 nova_compute[187152]: 2025-11-29 07:53:53.951 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:54 np0005539504 ovn_controller[95182]: 2025-11-29T07:53:54Z|00732|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 02:53:54 np0005539504 nova_compute[187152]: 2025-11-29 07:53:54.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:56 np0005539504 podman[251163]: 2025-11-29 07:53:56.702278633 +0000 UTC m=+0.050391852 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:53:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:53:57.038 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:53:57 np0005539504 nova_compute[187152]: 2025-11-29 07:53:57.039 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:57 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:53:57.039 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:53:58 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:53:58.044 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:53:58 np0005539504 nova_compute[187152]: 2025-11-29 07:53:58.648 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:58 np0005539504 nova_compute[187152]: 2025-11-29 07:53:58.957 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:03 np0005539504 nova_compute[187152]: 2025-11-29 07:54:03.650 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:03 np0005539504 nova_compute[187152]: 2025-11-29 07:54:03.959 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:04 np0005539504 nova_compute[187152]: 2025-11-29 07:54:04.052 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:08 np0005539504 nova_compute[187152]: 2025-11-29 07:54:08.657 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:08 np0005539504 podman[251192]: 2025-11-29 07:54:08.724072456 +0000 UTC m=+0.048871020 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 02:54:08 np0005539504 podman[251187]: 2025-11-29 07:54:08.726963304 +0000 UTC m=+0.058162950 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:54:08 np0005539504 podman[251186]: 2025-11-29 07:54:08.751201558 +0000 UTC m=+0.088746936 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:54:08 np0005539504 nova_compute[187152]: 2025-11-29 07:54:08.961 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:13 np0005539504 nova_compute[187152]: 2025-11-29 07:54:13.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:13 np0005539504 podman[251250]: 2025-11-29 07:54:13.721845375 +0000 UTC m=+0.061500151 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:54:13 np0005539504 podman[251251]: 2025-11-29 07:54:13.766223503 +0000 UTC m=+0.098456939 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:54:13 np0005539504 nova_compute[187152]: 2025-11-29 07:54:13.963 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:18 np0005539504 nova_compute[187152]: 2025-11-29 07:54:18.655 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:18 np0005539504 nova_compute[187152]: 2025-11-29 07:54:18.964 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:20 np0005539504 nova_compute[187152]: 2025-11-29 07:54:20.967 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:21 np0005539504 podman[251298]: 2025-11-29 07:54:21.760950325 +0000 UTC m=+0.090915336 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 02:54:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:54:23.491 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:54:23.492 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:54:23.492 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:23 np0005539504 nova_compute[187152]: 2025-11-29 07:54:23.657 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:23 np0005539504 nova_compute[187152]: 2025-11-29 07:54:23.966 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:26 np0005539504 nova_compute[187152]: 2025-11-29 07:54:26.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:27 np0005539504 podman[251319]: 2025-11-29 07:54:27.706582706 +0000 UTC m=+0.052883669 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 02:54:28 np0005539504 nova_compute[187152]: 2025-11-29 07:54:28.659 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:28 np0005539504 nova_compute[187152]: 2025-11-29 07:54:28.969 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:33 np0005539504 nova_compute[187152]: 2025-11-29 07:54:33.664 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:33 np0005539504 nova_compute[187152]: 2025-11-29 07:54:33.970 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:54:36.680 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:54:36 np0005539504 nova_compute[187152]: 2025-11-29 07:54:36.681 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:36 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:54:36.681 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:54:36 np0005539504 nova_compute[187152]: 2025-11-29 07:54:36.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:36 np0005539504 nova_compute[187152]: 2025-11-29 07:54:36.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:54:37 np0005539504 nova_compute[187152]: 2025-11-29 07:54:37.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:37 np0005539504 nova_compute[187152]: 2025-11-29 07:54:37.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.073 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.073 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.073 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.074 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.207 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.208 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5728MB free_disk=72.941162109375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.208 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.208 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.377 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.378 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.404 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.417 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.419 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.419 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.666 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:38 np0005539504 nova_compute[187152]: 2025-11-29 07:54:38.972 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:39 np0005539504 podman[251343]: 2025-11-29 07:54:39.698221174 +0000 UTC m=+0.042776665 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:54:39 np0005539504 podman[251344]: 2025-11-29 07:54:39.717193947 +0000 UTC m=+0.055965762 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vcs-type=git, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible)
Nov 29 02:54:39 np0005539504 podman[251345]: 2025-11-29 07:54:39.71730036 +0000 UTC m=+0.051816630 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:54:41 np0005539504 nova_compute[187152]: 2025-11-29 07:54:41.420 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:41 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:54:41.684 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:43 np0005539504 nova_compute[187152]: 2025-11-29 07:54:43.667 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:43 np0005539504 nova_compute[187152]: 2025-11-29 07:54:43.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:43 np0005539504 nova_compute[187152]: 2025-11-29 07:54:43.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:54:43 np0005539504 nova_compute[187152]: 2025-11-29 07:54:43.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:54:43 np0005539504 nova_compute[187152]: 2025-11-29 07:54:43.966 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:54:43 np0005539504 nova_compute[187152]: 2025-11-29 07:54:43.973 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:44 np0005539504 podman[251403]: 2025-11-29 07:54:44.708001529 +0000 UTC m=+0.044373950 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:54:44 np0005539504 podman[251404]: 2025-11-29 07:54:44.785499331 +0000 UTC m=+0.116551338 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:54:46 np0005539504 nova_compute[187152]: 2025-11-29 07:54:46.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:54:48 np0005539504 nova_compute[187152]: 2025-11-29 07:54:48.668 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:48 np0005539504 nova_compute[187152]: 2025-11-29 07:54:48.974 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:52 np0005539504 podman[251450]: 2025-11-29 07:54:52.734542699 +0000 UTC m=+0.078094859 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:54:53 np0005539504 nova_compute[187152]: 2025-11-29 07:54:53.669 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:54 np0005539504 nova_compute[187152]: 2025-11-29 07:54:54.015 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:54 np0005539504 nova_compute[187152]: 2025-11-29 07:54:54.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:58 np0005539504 nova_compute[187152]: 2025-11-29 07:54:58.672 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:58 np0005539504 podman[251474]: 2025-11-29 07:54:58.72723025 +0000 UTC m=+0.070960577 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible)
Nov 29 02:54:59 np0005539504 nova_compute[187152]: 2025-11-29 07:54:59.017 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:01 np0005539504 nova_compute[187152]: 2025-11-29 07:55:01.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:03 np0005539504 nova_compute[187152]: 2025-11-29 07:55:03.673 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:04 np0005539504 nova_compute[187152]: 2025-11-29 07:55:04.018 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:08 np0005539504 nova_compute[187152]: 2025-11-29 07:55:08.676 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:09 np0005539504 nova_compute[187152]: 2025-11-29 07:55:09.052 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:10 np0005539504 podman[251498]: 2025-11-29 07:55:10.770258106 +0000 UTC m=+0.062108007 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter)
Nov 29 02:55:10 np0005539504 podman[251497]: 2025-11-29 07:55:10.770454641 +0000 UTC m=+0.066746351 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:55:10 np0005539504 podman[251499]: 2025-11-29 07:55:10.788210131 +0000 UTC m=+0.072264342 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:55:13 np0005539504 nova_compute[187152]: 2025-11-29 07:55:13.678 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:14 np0005539504 nova_compute[187152]: 2025-11-29 07:55:14.055 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:15 np0005539504 podman[251555]: 2025-11-29 07:55:15.71329023 +0000 UTC m=+0.057606247 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:55:15 np0005539504 podman[251556]: 2025-11-29 07:55:15.766342111 +0000 UTC m=+0.108168911 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:55:18 np0005539504 nova_compute[187152]: 2025-11-29 07:55:18.679 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:19 np0005539504 nova_compute[187152]: 2025-11-29 07:55:19.058 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:21 np0005539504 nova_compute[187152]: 2025-11-29 07:55:21.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:55:23.492 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:55:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:55:23.493 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:55:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:55:23.493 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:55:23 np0005539504 nova_compute[187152]: 2025-11-29 07:55:23.681 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:23 np0005539504 podman[251602]: 2025-11-29 07:55:23.742626665 +0000 UTC m=+0.074045960 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:55:24 np0005539504 nova_compute[187152]: 2025-11-29 07:55:24.097 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:28 np0005539504 nova_compute[187152]: 2025-11-29 07:55:28.715 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:28 np0005539504 nova_compute[187152]: 2025-11-29 07:55:28.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:29 np0005539504 nova_compute[187152]: 2025-11-29 07:55:29.100 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:29 np0005539504 podman[251624]: 2025-11-29 07:55:29.704581235 +0000 UTC m=+0.053923757 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 02:55:33 np0005539504 nova_compute[187152]: 2025-11-29 07:55:33.717 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:34 np0005539504 nova_compute[187152]: 2025-11-29 07:55:34.100 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:34 np0005539504 nova_compute[187152]: 2025-11-29 07:55:34.765 187156 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 2.79 sec#033[00m
Nov 29 02:55:36 np0005539504 nova_compute[187152]: 2025-11-29 07:55:36.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:36 np0005539504 nova_compute[187152]: 2025-11-29 07:55:36.936 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:55:37 np0005539504 nova_compute[187152]: 2025-11-29 07:55:37.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:38 np0005539504 nova_compute[187152]: 2025-11-29 07:55:38.719 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:38 np0005539504 nova_compute[187152]: 2025-11-29 07:55:38.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:39 np0005539504 nova_compute[187152]: 2025-11-29 07:55:39.102 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.013 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.013 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.013 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.013 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.213 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.214 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5729MB free_disk=72.9412841796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.214 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.214 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.312 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.313 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.469 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.490 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.492 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 02:55:40 np0005539504 nova_compute[187152]: 2025-11-29 07:55:40.492 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:55:41 np0005539504 podman[251646]: 2025-11-29 07:55:41.187691516 +0000 UTC m=+0.062639052 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:55:41 np0005539504 podman[251648]: 2025-11-29 07:55:41.188382774 +0000 UTC m=+0.054711238 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:55:41 np0005539504 podman[251647]: 2025-11-29 07:55:41.190148342 +0000 UTC m=+0.059620380 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 29 02:55:42 np0005539504 nova_compute[187152]: 2025-11-29 07:55:42.493 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:55:43 np0005539504 nova_compute[187152]: 2025-11-29 07:55:43.721 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:43 np0005539504 nova_compute[187152]: 2025-11-29 07:55:43.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:55:43 np0005539504 nova_compute[187152]: 2025-11-29 07:55:43.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 02:55:43 np0005539504 nova_compute[187152]: 2025-11-29 07:55:43.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 02:55:43 np0005539504 nova_compute[187152]: 2025-11-29 07:55:43.958 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 02:55:44 np0005539504 nova_compute[187152]: 2025-11-29 07:55:44.104 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:55:45.047 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:55:45 np0005539504 nova_compute[187152]: 2025-11-29 07:55:45.047 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:45 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:55:45.048 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 02:55:46 np0005539504 podman[251710]: 2025-11-29 07:55:46.23709563 +0000 UTC m=+0.045015516 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:55:46 np0005539504 podman[251711]: 2025-11-29 07:55:46.286276147 +0000 UTC m=+0.085280773 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:55:47 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:55:47.050 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:55:47 np0005539504 nova_compute[187152]: 2025-11-29 07:55:47.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:55:48 np0005539504 nova_compute[187152]: 2025-11-29 07:55:48.724 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:49 np0005539504 nova_compute[187152]: 2025-11-29 07:55:49.105 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:53 np0005539504 nova_compute[187152]: 2025-11-29 07:55:53.725 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:54 np0005539504 nova_compute[187152]: 2025-11-29 07:55:54.107 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:54 np0005539504 podman[251759]: 2025-11-29 07:55:54.705956443 +0000 UTC m=+0.053123056 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, 
org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:55:54 np0005539504 nova_compute[187152]: 2025-11-29 07:55:54.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:55:58 np0005539504 nova_compute[187152]: 2025-11-29 07:55:58.763 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:59 np0005539504 nova_compute[187152]: 2025-11-29 07:55:59.108 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:56:00 np0005539504 podman[251781]: 2025-11-29 07:56:00.720554185 +0000 UTC m=+0.064671367 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:56:03 np0005539504 nova_compute[187152]: 2025-11-29 07:56:03.817 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:56:04 np0005539504 nova_compute[187152]: 2025-11-29 07:56:04.110 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:56:08 np0005539504 nova_compute[187152]: 2025-11-29 07:56:08.904 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:56:09 np0005539504 nova_compute[187152]: 2025-11-29 07:56:09.112 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:56:11 np0005539504 podman[251800]: 2025-11-29 07:56:11.731314104 +0000 UTC m=+0.076185228 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:56:11 np0005539504 podman[251801]: 2025-11-29 07:56:11.748691913 +0000 UTC m=+0.087904664 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Nov 29 02:56:11 np0005539504 podman[251802]: 2025-11-29 07:56:11.748503488 +0000 UTC m=+0.076762283 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 02:56:13 np0005539504 nova_compute[187152]: 2025-11-29 07:56:13.905 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:56:14 np0005539504 nova_compute[187152]: 2025-11-29 07:56:14.113 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:56:16 np0005539504 podman[251863]: 2025-11-29 07:56:16.719340181 +0000 UTC m=+0.062977651 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:56:16 np0005539504 podman[251864]: 2025-11-29 07:56:16.790281606 +0000 UTC m=+0.126630019 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:56:18 np0005539504 nova_compute[187152]: 2025-11-29 07:56:18.908 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:19 np0005539504 nova_compute[187152]: 2025-11-29 07:56:19.156 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:21 np0005539504 nova_compute[187152]: 2025-11-29 07:56:21.330 187156 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.57 sec#033[00m
Nov 29 02:56:22 np0005539504 nova_compute[187152]: 2025-11-29 07:56:22.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:56:23.494 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:56:23.495 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:56:23.495 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:23 np0005539504 nova_compute[187152]: 2025-11-29 07:56:23.912 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:24 np0005539504 nova_compute[187152]: 2025-11-29 07:56:24.159 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:56:24.571 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:56:24 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:56:24.572 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:56:24 np0005539504 nova_compute[187152]: 2025-11-29 07:56:24.716 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:25 np0005539504 podman[251917]: 2025-11-29 07:56:25.721477252 +0000 UTC m=+0.065197431 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:56:28 np0005539504 nova_compute[187152]: 2025-11-29 07:56:28.913 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:28 np0005539504 nova_compute[187152]: 2025-11-29 07:56:28.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:29 np0005539504 nova_compute[187152]: 2025-11-29 07:56:29.160 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:31 np0005539504 podman[251938]: 2025-11-29 07:56:31.708192343 +0000 UTC m=+0.055445219 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 02:56:33 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:56:33.575 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:56:33 np0005539504 nova_compute[187152]: 2025-11-29 07:56:33.916 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:34 np0005539504 nova_compute[187152]: 2025-11-29 07:56:34.163 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:37 np0005539504 nova_compute[187152]: 2025-11-29 07:56:37.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:37 np0005539504 nova_compute[187152]: 2025-11-29 07:56:37.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:37 np0005539504 nova_compute[187152]: 2025-11-29 07:56:37.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:56:38 np0005539504 nova_compute[187152]: 2025-11-29 07:56:38.917 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:39 np0005539504 nova_compute[187152]: 2025-11-29 07:56:39.164 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:39 np0005539504 nova_compute[187152]: 2025-11-29 07:56:39.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:39 np0005539504 nova_compute[187152]: 2025-11-29 07:56:39.965 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:39 np0005539504 nova_compute[187152]: 2025-11-29 07:56:39.965 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:39 np0005539504 nova_compute[187152]: 2025-11-29 07:56:39.966 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:39 np0005539504 nova_compute[187152]: 2025-11-29 07:56:39.966 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.121 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.122 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5740MB free_disk=72.9412841796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.122 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.122 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.308 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.309 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.343 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.427 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.428 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.462 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.491 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.533 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.553 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.555 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:56:40 np0005539504 nova_compute[187152]: 2025-11-29 07:56:40.555 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.433s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:42 np0005539504 podman[251958]: 2025-11-29 07:56:42.715613393 +0000 UTC m=+0.059389365 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:56:42 np0005539504 podman[251959]: 2025-11-29 07:56:42.724072301 +0000 UTC m=+0.063982978 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Nov 29 02:56:42 np0005539504 podman[251960]: 2025-11-29 07:56:42.748313826 +0000 UTC m=+0.082738935 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:56:43 np0005539504 nova_compute[187152]: 2025-11-29 07:56:43.556 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:43 np0005539504 nova_compute[187152]: 2025-11-29 07:56:43.920 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:43 np0005539504 nova_compute[187152]: 2025-11-29 07:56:43.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:43 np0005539504 nova_compute[187152]: 2025-11-29 07:56:43.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:56:43 np0005539504 nova_compute[187152]: 2025-11-29 07:56:43.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:56:43 np0005539504 nova_compute[187152]: 2025-11-29 07:56:43.952 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:56:44 np0005539504 nova_compute[187152]: 2025-11-29 07:56:44.166 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:47 np0005539504 podman[252019]: 2025-11-29 07:56:47.703474765 +0000 UTC m=+0.048721687 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:56:47 np0005539504 podman[252020]: 2025-11-29 07:56:47.743268659 +0000 UTC m=+0.087286167 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:56:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539504 nova_compute[187152]: 2025-11-29 07:56:48.922 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:49 np0005539504 nova_compute[187152]: 2025-11-29 07:56:49.168 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:49 np0005539504 nova_compute[187152]: 2025-11-29 07:56:49.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:53 np0005539504 nova_compute[187152]: 2025-11-29 07:56:53.924 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:54 np0005539504 nova_compute[187152]: 2025-11-29 07:56:54.168 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:55 np0005539504 nova_compute[187152]: 2025-11-29 07:56:55.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:56 np0005539504 podman[252069]: 2025-11-29 07:56:56.713158858 +0000 UTC m=+0.058144431 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125)
Nov 29 02:56:58 np0005539504 nova_compute[187152]: 2025-11-29 07:56:58.926 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:59 np0005539504 nova_compute[187152]: 2025-11-29 07:56:59.170 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:02 np0005539504 podman[252091]: 2025-11-29 07:57:02.699088305 +0000 UTC m=+0.048286044 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 02:57:03 np0005539504 nova_compute[187152]: 2025-11-29 07:57:03.927 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:04 np0005539504 nova_compute[187152]: 2025-11-29 07:57:04.171 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:04 np0005539504 nova_compute[187152]: 2025-11-29 07:57:04.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:08 np0005539504 nova_compute[187152]: 2025-11-29 07:57:08.931 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:09 np0005539504 nova_compute[187152]: 2025-11-29 07:57:09.173 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:13 np0005539504 podman[252113]: 2025-11-29 07:57:13.717231722 +0000 UTC m=+0.058262314 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Nov 29 02:57:13 np0005539504 podman[252112]: 2025-11-29 07:57:13.718559468 +0000 UTC m=+0.054029109 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:57:13 np0005539504 podman[252114]: 2025-11-29 07:57:13.72234307 +0000 UTC m=+0.056892337 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:57:13 np0005539504 nova_compute[187152]: 2025-11-29 07:57:13.938 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:14 np0005539504 nova_compute[187152]: 2025-11-29 07:57:14.174 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:18 np0005539504 podman[252174]: 2025-11-29 07:57:18.699268048 +0000 UTC m=+0.047057682 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:57:18 np0005539504 podman[252175]: 2025-11-29 07:57:18.736232905 +0000 UTC m=+0.081414799 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:57:18 np0005539504 nova_compute[187152]: 2025-11-29 07:57:18.938 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:19 np0005539504 nova_compute[187152]: 2025-11-29 07:57:19.174 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:57:22.594 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:57:22 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:57:22.594 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:57:22 np0005539504 nova_compute[187152]: 2025-11-29 07:57:22.595 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:57:23.495 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:57:23.496 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:57:23.496 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:23 np0005539504 nova_compute[187152]: 2025-11-29 07:57:23.940 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:24 np0005539504 nova_compute[187152]: 2025-11-29 07:57:24.176 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:24 np0005539504 nova_compute[187152]: 2025-11-29 07:57:24.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:26 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:57:26.596 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:57:27 np0005539504 podman[252222]: 2025-11-29 07:57:27.704119231 +0000 UTC m=+0.053080383 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:57:28 np0005539504 nova_compute[187152]: 2025-11-29 07:57:28.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:28 np0005539504 nova_compute[187152]: 2025-11-29 07:57:28.969 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:29 np0005539504 nova_compute[187152]: 2025-11-29 07:57:29.178 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:33 np0005539504 podman[252244]: 2025-11-29 07:57:33.707427229 +0000 UTC m=+0.051244055 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 02:57:33 np0005539504 nova_compute[187152]: 2025-11-29 07:57:33.971 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:34 np0005539504 nova_compute[187152]: 2025-11-29 07:57:34.179 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:37 np0005539504 nova_compute[187152]: 2025-11-29 07:57:37.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:37 np0005539504 nova_compute[187152]: 2025-11-29 07:57:37.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:57:38 np0005539504 nova_compute[187152]: 2025-11-29 07:57:38.972 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:39 np0005539504 nova_compute[187152]: 2025-11-29 07:57:39.181 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:39 np0005539504 nova_compute[187152]: 2025-11-29 07:57:39.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:40 np0005539504 nova_compute[187152]: 2025-11-29 07:57:40.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:40 np0005539504 nova_compute[187152]: 2025-11-29 07:57:40.981 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:40 np0005539504 nova_compute[187152]: 2025-11-29 07:57:40.981 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:40 np0005539504 nova_compute[187152]: 2025-11-29 07:57:40.982 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:40 np0005539504 nova_compute[187152]: 2025-11-29 07:57:40.982 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.120 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.121 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5738MB free_disk=72.94166946411133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.121 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.122 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.198 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.199 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.223 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.243 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.244 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:57:41 np0005539504 nova_compute[187152]: 2025-11-29 07:57:41.244 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:43 np0005539504 nova_compute[187152]: 2025-11-29 07:57:43.973 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:44 np0005539504 nova_compute[187152]: 2025-11-29 07:57:44.181 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:44 np0005539504 nova_compute[187152]: 2025-11-29 07:57:44.243 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:44 np0005539504 nova_compute[187152]: 2025-11-29 07:57:44.244 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:57:44 np0005539504 nova_compute[187152]: 2025-11-29 07:57:44.244 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:57:44 np0005539504 nova_compute[187152]: 2025-11-29 07:57:44.360 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:57:44 np0005539504 nova_compute[187152]: 2025-11-29 07:57:44.361 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:44 np0005539504 podman[252269]: 2025-11-29 07:57:44.71015751 +0000 UTC m=+0.050841893 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vcs-type=git, version=9.6, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:57:44 np0005539504 podman[252270]: 2025-11-29 07:57:44.710217942 +0000 UTC m=+0.046681691 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:57:44 np0005539504 podman[252268]: 2025-11-29 07:57:44.712134163 +0000 UTC m=+0.056280350 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:57:48 np0005539504 nova_compute[187152]: 2025-11-29 07:57:48.976 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:49 np0005539504 nova_compute[187152]: 2025-11-29 07:57:49.214 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:49 np0005539504 podman[252332]: 2025-11-29 07:57:49.702116322 +0000 UTC m=+0.047388231 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:57:49 np0005539504 podman[252333]: 2025-11-29 07:57:49.740358304 +0000 UTC m=+0.080438282 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 02:57:50 np0005539504 nova_compute[187152]: 2025-11-29 07:57:50.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:53 np0005539504 nova_compute[187152]: 2025-11-29 07:57:53.978 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:54 np0005539504 nova_compute[187152]: 2025-11-29 07:57:54.215 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:56 np0005539504 nova_compute[187152]: 2025-11-29 07:57:56.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:58 np0005539504 podman[252386]: 2025-11-29 07:57:58.720168822 +0000 UTC m=+0.064459312 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:57:58 np0005539504 nova_compute[187152]: 2025-11-29 07:57:58.979 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:59 np0005539504 nova_compute[187152]: 2025-11-29 07:57:59.259 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:01 np0005539504 nova_compute[187152]: 2025-11-29 07:58:01.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:03 np0005539504 nova_compute[187152]: 2025-11-29 07:58:03.981 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:04 np0005539504 nova_compute[187152]: 2025-11-29 07:58:04.260 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:04 np0005539504 podman[252406]: 2025-11-29 07:58:04.712230875 +0000 UTC m=+0.049101556 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 29 02:58:08 np0005539504 nova_compute[187152]: 2025-11-29 07:58:08.982 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:09 np0005539504 nova_compute[187152]: 2025-11-29 07:58:09.262 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:13 np0005539504 nova_compute[187152]: 2025-11-29 07:58:13.986 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:14 np0005539504 nova_compute[187152]: 2025-11-29 07:58:14.264 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:15 np0005539504 podman[252428]: 2025-11-29 07:58:15.722027955 +0000 UTC m=+0.065449338 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:58:15 np0005539504 podman[252430]: 2025-11-29 07:58:15.733462884 +0000 UTC m=+0.055485810 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 29 02:58:15 np0005539504 podman[252429]: 2025-11-29 07:58:15.754154313 +0000 UTC m=+0.072274143 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:58:16 np0005539504 nova_compute[187152]: 2025-11-29 07:58:16.949 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:16 np0005539504 nova_compute[187152]: 2025-11-29 07:58:16.949 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:58:18 np0005539504 nova_compute[187152]: 2025-11-29 07:58:18.987 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:19 np0005539504 nova_compute[187152]: 2025-11-29 07:58:19.265 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:20 np0005539504 podman[252489]: 2025-11-29 07:58:20.728109029 +0000 UTC m=+0.069909258 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:58:20 np0005539504 podman[252490]: 2025-11-29 07:58:20.768723786 +0000 UTC m=+0.094960934 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:58:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:58:23.497 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:58:23.497 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:58:23.498 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:23 np0005539504 nova_compute[187152]: 2025-11-29 07:58:23.990 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:24 np0005539504 nova_compute[187152]: 2025-11-29 07:58:24.270 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:26 np0005539504 nova_compute[187152]: 2025-11-29 07:58:26.964 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:58:28.780 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:58:28 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:58:28.781 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:58:28 np0005539504 nova_compute[187152]: 2025-11-29 07:58:28.780 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:28 np0005539504 nova_compute[187152]: 2025-11-29 07:58:28.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:28 np0005539504 nova_compute[187152]: 2025-11-29 07:58:28.939 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:28 np0005539504 nova_compute[187152]: 2025-11-29 07:58:28.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:58:28 np0005539504 nova_compute[187152]: 2025-11-29 07:58:28.991 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:29 np0005539504 nova_compute[187152]: 2025-11-29 07:58:29.014 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:58:29 np0005539504 nova_compute[187152]: 2025-11-29 07:58:29.270 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:29 np0005539504 podman[252542]: 2025-11-29 07:58:29.7117172 +0000 UTC m=+0.056592879 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:58:30 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:58:30.783 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:34 np0005539504 nova_compute[187152]: 2025-11-29 07:58:34.026 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:34 np0005539504 nova_compute[187152]: 2025-11-29 07:58:34.271 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:35 np0005539504 podman[252564]: 2025-11-29 07:58:35.729298292 +0000 UTC m=+0.067005730 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:58:39 np0005539504 nova_compute[187152]: 2025-11-29 07:58:39.013 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:39 np0005539504 nova_compute[187152]: 2025-11-29 07:58:39.014 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:58:39 np0005539504 nova_compute[187152]: 2025-11-29 07:58:39.027 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:39 np0005539504 nova_compute[187152]: 2025-11-29 07:58:39.273 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:40 np0005539504 nova_compute[187152]: 2025-11-29 07:58:40.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:40 np0005539504 nova_compute[187152]: 2025-11-29 07:58:40.967 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:40 np0005539504 nova_compute[187152]: 2025-11-29 07:58:40.968 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:40 np0005539504 nova_compute[187152]: 2025-11-29 07:58:40.968 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:40 np0005539504 nova_compute[187152]: 2025-11-29 07:58:40.968 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.141 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.142 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5733MB free_disk=72.9415283203125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.142 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.143 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.214 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.215 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.237 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.253 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.255 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:58:41 np0005539504 nova_compute[187152]: 2025-11-29 07:58:41.256 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:42 np0005539504 nova_compute[187152]: 2025-11-29 07:58:42.251 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:43 np0005539504 nova_compute[187152]: 2025-11-29 07:58:43.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:43 np0005539504 nova_compute[187152]: 2025-11-29 07:58:43.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:58:43 np0005539504 nova_compute[187152]: 2025-11-29 07:58:43.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:58:43 np0005539504 nova_compute[187152]: 2025-11-29 07:58:43.953 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:58:44 np0005539504 nova_compute[187152]: 2025-11-29 07:58:44.030 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:44 np0005539504 nova_compute[187152]: 2025-11-29 07:58:44.274 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:44 np0005539504 nova_compute[187152]: 2025-11-29 07:58:44.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:46 np0005539504 podman[252588]: 2025-11-29 07:58:46.729195908 +0000 UTC m=+0.059838287 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:58:46 np0005539504 podman[252594]: 2025-11-29 07:58:46.732040074 +0000 UTC m=+0.047455632 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:58:46 np0005539504 podman[252587]: 2025-11-29 07:58:46.740269067 +0000 UTC m=+0.078261154 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 07:58:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:58:49 np0005539504 nova_compute[187152]: 2025-11-29 07:58:49.032 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:49 np0005539504 nova_compute[187152]: 2025-11-29 07:58:49.276 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:51 np0005539504 podman[252651]: 2025-11-29 07:58:51.738155779 +0000 UTC m=+0.069524558 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:58:51 np0005539504 podman[252652]: 2025-11-29 07:58:51.74338164 +0000 UTC m=+0.078827800 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 02:58:51 np0005539504 nova_compute[187152]: 2025-11-29 07:58:51.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:54 np0005539504 nova_compute[187152]: 2025-11-29 07:58:54.040 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:54 np0005539504 nova_compute[187152]: 2025-11-29 07:58:54.277 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:56 np0005539504 nova_compute[187152]: 2025-11-29 07:58:56.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:59 np0005539504 nova_compute[187152]: 2025-11-29 07:58:59.049 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:59 np0005539504 nova_compute[187152]: 2025-11-29 07:58:59.279 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:00 np0005539504 podman[252701]: 2025-11-29 07:59:00.760915427 +0000 UTC m=+0.098333446 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 02:59:04 np0005539504 nova_compute[187152]: 2025-11-29 07:59:04.054 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:04 np0005539504 nova_compute[187152]: 2025-11-29 07:59:04.279 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:06 np0005539504 podman[252723]: 2025-11-29 07:59:06.714317247 +0000 UTC m=+0.055851750 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:59:09 np0005539504 nova_compute[187152]: 2025-11-29 07:59:09.059 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:09 np0005539504 nova_compute[187152]: 2025-11-29 07:59:09.282 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:09 np0005539504 nova_compute[187152]: 2025-11-29 07:59:09.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:14 np0005539504 nova_compute[187152]: 2025-11-29 07:59:14.062 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:14 np0005539504 nova_compute[187152]: 2025-11-29 07:59:14.283 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:17 np0005539504 podman[252745]: 2025-11-29 07:59:17.746744559 +0000 UTC m=+0.082295983 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:59:17 np0005539504 podman[252743]: 2025-11-29 07:59:17.74674721 +0000 UTC m=+0.088802580 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:59:17 np0005539504 podman[252744]: 2025-11-29 07:59:17.754824577 +0000 UTC m=+0.095153619 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:59:19 np0005539504 nova_compute[187152]: 2025-11-29 07:59:19.064 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:19 np0005539504 nova_compute[187152]: 2025-11-29 07:59:19.286 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:22 np0005539504 podman[252806]: 2025-11-29 07:59:22.708403993 +0000 UTC m=+0.053628138 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:59:22 np0005539504 podman[252807]: 2025-11-29 07:59:22.79122188 +0000 UTC m=+0.130556236 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 02:59:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:59:23.498 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:59:23.498 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:59:23.499 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:24 np0005539504 nova_compute[187152]: 2025-11-29 07:59:24.067 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:24 np0005539504 nova_compute[187152]: 2025-11-29 07:59:24.286 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:26 np0005539504 nova_compute[187152]: 2025-11-29 07:59:26.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:29 np0005539504 nova_compute[187152]: 2025-11-29 07:59:29.068 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:29 np0005539504 nova_compute[187152]: 2025-11-29 07:59:29.288 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:30 np0005539504 nova_compute[187152]: 2025-11-29 07:59:30.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:31 np0005539504 podman[252856]: 2025-11-29 07:59:31.709393002 +0000 UTC m=+0.053174556 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:59:34 np0005539504 nova_compute[187152]: 2025-11-29 07:59:34.072 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:34 np0005539504 nova_compute[187152]: 2025-11-29 07:59:34.288 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:37 np0005539504 podman[252874]: 2025-11-29 07:59:37.740293496 +0000 UTC m=+0.079626721 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:59:37 np0005539504 nova_compute[187152]: 2025-11-29 07:59:37.852 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:59:37.852 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:59:37 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:59:37.853 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:59:39 np0005539504 nova_compute[187152]: 2025-11-29 07:59:39.074 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:39 np0005539504 nova_compute[187152]: 2025-11-29 07:59:39.290 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:39 np0005539504 nova_compute[187152]: 2025-11-29 07:59:39.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:39 np0005539504 nova_compute[187152]: 2025-11-29 07:59:39.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:59:42 np0005539504 nova_compute[187152]: 2025-11-29 07:59:42.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:42 np0005539504 nova_compute[187152]: 2025-11-29 07:59:42.963 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:42 np0005539504 nova_compute[187152]: 2025-11-29 07:59:42.964 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:42 np0005539504 nova_compute[187152]: 2025-11-29 07:59:42.964 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:42 np0005539504 nova_compute[187152]: 2025-11-29 07:59:42.964 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.097 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.098 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5753MB free_disk=72.94150924682617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.098 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.099 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.228 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.229 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.250 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.263 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.265 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:59:43 np0005539504 nova_compute[187152]: 2025-11-29 07:59:43.265 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:44 np0005539504 nova_compute[187152]: 2025-11-29 07:59:44.075 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:44 np0005539504 nova_compute[187152]: 2025-11-29 07:59:44.260 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:44 np0005539504 nova_compute[187152]: 2025-11-29 07:59:44.303 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:44 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 07:59:44.855 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:59:44 np0005539504 nova_compute[187152]: 2025-11-29 07:59:44.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:45 np0005539504 nova_compute[187152]: 2025-11-29 07:59:45.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:45 np0005539504 nova_compute[187152]: 2025-11-29 07:59:45.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:59:45 np0005539504 nova_compute[187152]: 2025-11-29 07:59:45.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:59:45 np0005539504 nova_compute[187152]: 2025-11-29 07:59:45.954 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:59:48 np0005539504 podman[252898]: 2025-11-29 07:59:48.711310571 +0000 UTC m=+0.049011734 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 02:59:48 np0005539504 podman[252897]: 2025-11-29 07:59:48.718552637 +0000 UTC m=+0.060189916 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Nov 29 02:59:48 np0005539504 podman[252896]: 2025-11-29 07:59:48.732569295 +0000 UTC m=+0.077640176 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:59:49 np0005539504 nova_compute[187152]: 2025-11-29 07:59:49.077 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:49 np0005539504 nova_compute[187152]: 2025-11-29 07:59:49.303 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:51 np0005539504 nova_compute[187152]: 2025-11-29 07:59:51.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:53 np0005539504 podman[252959]: 2025-11-29 07:59:53.705414873 +0000 UTC m=+0.051906002 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:59:53 np0005539504 podman[252960]: 2025-11-29 07:59:53.795198907 +0000 UTC m=+0.137447672 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true)
Nov 29 02:59:54 np0005539504 nova_compute[187152]: 2025-11-29 07:59:54.080 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:54 np0005539504 nova_compute[187152]: 2025-11-29 07:59:54.306 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:58 np0005539504 nova_compute[187152]: 2025-11-29 07:59:58.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:59 np0005539504 nova_compute[187152]: 2025-11-29 07:59:59.081 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:59 np0005539504 nova_compute[187152]: 2025-11-29 07:59:59.307 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:02 np0005539504 podman[253009]: 2025-11-29 08:00:02.743483284 +0000 UTC m=+0.085758976 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:00:04 np0005539504 nova_compute[187152]: 2025-11-29 08:00:04.083 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:04 np0005539504 nova_compute[187152]: 2025-11-29 08:00:04.308 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:08 np0005539504 podman[253029]: 2025-11-29 08:00:08.726460642 +0000 UTC m=+0.067568995 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 03:00:09 np0005539504 nova_compute[187152]: 2025-11-29 08:00:09.084 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:09 np0005539504 nova_compute[187152]: 2025-11-29 08:00:09.309 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:14 np0005539504 nova_compute[187152]: 2025-11-29 08:00:14.088 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:14 np0005539504 nova_compute[187152]: 2025-11-29 08:00:14.311 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:19 np0005539504 nova_compute[187152]: 2025-11-29 08:00:19.091 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:19 np0005539504 nova_compute[187152]: 2025-11-29 08:00:19.313 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:19 np0005539504 podman[253050]: 2025-11-29 08:00:19.712334196 +0000 UTC m=+0.054289446 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:00:19 np0005539504 podman[253051]: 2025-11-29 08:00:19.725338788 +0000 UTC m=+0.062726285 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 29 03:00:19 np0005539504 podman[253052]: 2025-11-29 08:00:19.738283937 +0000 UTC m=+0.065126890 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 03:00:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:00:23.499 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:00:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:00:23.500 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:00:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:00:23.500 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:00:24 np0005539504 nova_compute[187152]: 2025-11-29 08:00:24.090 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:24 np0005539504 nova_compute[187152]: 2025-11-29 08:00:24.315 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:24 np0005539504 podman[253113]: 2025-11-29 08:00:24.72061454 +0000 UTC m=+0.071353318 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:00:25 np0005539504 podman[253114]: 2025-11-29 08:00:25.210206648 +0000 UTC m=+0.557417630 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 03:00:28 np0005539504 nova_compute[187152]: 2025-11-29 08:00:28.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:29 np0005539504 nova_compute[187152]: 2025-11-29 08:00:29.093 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:29 np0005539504 nova_compute[187152]: 2025-11-29 08:00:29.317 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:32 np0005539504 nova_compute[187152]: 2025-11-29 08:00:32.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:33 np0005539504 podman[253166]: 2025-11-29 08:00:33.703774306 +0000 UTC m=+0.049937139 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, 
org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 03:00:34 np0005539504 nova_compute[187152]: 2025-11-29 08:00:34.094 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:34 np0005539504 nova_compute[187152]: 2025-11-29 08:00:34.319 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:39 np0005539504 nova_compute[187152]: 2025-11-29 08:00:39.096 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:39 np0005539504 nova_compute[187152]: 2025-11-29 08:00:39.320 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:39 np0005539504 podman[253189]: 2025-11-29 08:00:39.50470344 +0000 UTC m=+0.081702347 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 03:00:39 np0005539504 nova_compute[187152]: 2025-11-29 08:00:39.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:39 np0005539504 nova_compute[187152]: 2025-11-29 08:00:39.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:00:43 np0005539504 nova_compute[187152]: 2025-11-29 08:00:43.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:43 np0005539504 nova_compute[187152]: 2025-11-29 08:00:43.935 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:43 np0005539504 nova_compute[187152]: 2025-11-29 08:00:43.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:00:43 np0005539504 nova_compute[187152]: 2025-11-29 08:00:43.961 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:00:43 np0005539504 nova_compute[187152]: 2025-11-29 08:00:43.962 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:00:43 np0005539504 nova_compute[187152]: 2025-11-29 08:00:43.962 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.097 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.138 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.139 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5748MB free_disk=72.94150924682617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.139 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.140 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.213 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.213 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.240 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.255 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.257 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.257 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:00:44 np0005539504 nova_compute[187152]: 2025-11-29 08:00:44.321 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:46 np0005539504 nova_compute[187152]: 2025-11-29 08:00:46.258 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:47 np0005539504 nova_compute[187152]: 2025-11-29 08:00:47.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:47 np0005539504 nova_compute[187152]: 2025-11-29 08:00:47.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:00:47 np0005539504 nova_compute[187152]: 2025-11-29 08:00:47.939 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:00:47 np0005539504 nova_compute[187152]: 2025-11-29 08:00:47.971 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:00:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:00:49 np0005539504 nova_compute[187152]: 2025-11-29 08:00:49.100 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:49 np0005539504 nova_compute[187152]: 2025-11-29 08:00:49.322 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:50 np0005539504 podman[253211]: 2025-11-29 08:00:50.723608908 +0000 UTC m=+0.064887962 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:00:50 np0005539504 podman[253209]: 2025-11-29 08:00:50.741868831 +0000 UTC m=+0.080858264 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:00:50 np0005539504 podman[253210]: 2025-11-29 08:00:50.742001865 +0000 UTC m=+0.077462982 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git)
Nov 29 03:00:51 np0005539504 nova_compute[187152]: 2025-11-29 08:00:51.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:00:53.189 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 03:00:53 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:00:53.189 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 03:00:53 np0005539504 nova_compute[187152]: 2025-11-29 08:00:53.190 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:54 np0005539504 nova_compute[187152]: 2025-11-29 08:00:54.104 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:54 np0005539504 nova_compute[187152]: 2025-11-29 08:00:54.348 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:55 np0005539504 podman[253273]: 2025-11-29 08:00:55.706118095 +0000 UTC m=+0.049911318 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:00:55 np0005539504 podman[253274]: 2025-11-29 08:00:55.755912649 +0000 UTC m=+0.092888798 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:00:59 np0005539504 nova_compute[187152]: 2025-11-29 08:00:59.107 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:59 np0005539504 nova_compute[187152]: 2025-11-29 08:00:59.349 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539504 nova_compute[187152]: 2025-11-29 08:01:00.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:01 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:01:01.192 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:01:04 np0005539504 nova_compute[187152]: 2025-11-29 08:01:04.109 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:04 np0005539504 nova_compute[187152]: 2025-11-29 08:01:04.354 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:04 np0005539504 podman[253338]: 2025-11-29 08:01:04.704141144 +0000 UTC m=+0.048938702 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 03:01:09 np0005539504 nova_compute[187152]: 2025-11-29 08:01:09.131 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:09 np0005539504 nova_compute[187152]: 2025-11-29 08:01:09.356 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:09 np0005539504 podman[253356]: 2025-11-29 08:01:09.71435539 +0000 UTC m=+0.056455715 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd)
Nov 29 03:01:13 np0005539504 nova_compute[187152]: 2025-11-29 08:01:13.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:14 np0005539504 nova_compute[187152]: 2025-11-29 08:01:14.133 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:14 np0005539504 nova_compute[187152]: 2025-11-29 08:01:14.357 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:19 np0005539504 nova_compute[187152]: 2025-11-29 08:01:19.135 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:19 np0005539504 nova_compute[187152]: 2025-11-29 08:01:19.420 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:21 np0005539504 podman[253376]: 2025-11-29 08:01:21.716943408 +0000 UTC m=+0.056040554 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:01:21 np0005539504 podman[253378]: 2025-11-29 08:01:21.727653066 +0000 UTC m=+0.057301018 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 03:01:21 np0005539504 podman[253377]: 2025-11-29 08:01:21.750474393 +0000 UTC m=+0.072926370 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc.)
Nov 29 03:01:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:01:23.500 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:01:23.501 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:01:23.501 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:24 np0005539504 nova_compute[187152]: 2025-11-29 08:01:24.136 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:24 np0005539504 nova_compute[187152]: 2025-11-29 08:01:24.421 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:26 np0005539504 podman[253440]: 2025-11-29 08:01:26.70112322 +0000 UTC m=+0.045523360 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:01:26 np0005539504 podman[253441]: 2025-11-29 08:01:26.752387454 +0000 UTC m=+0.091166102 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 03:01:28 np0005539504 nova_compute[187152]: 2025-11-29 08:01:28.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:29 np0005539504 nova_compute[187152]: 2025-11-29 08:01:29.138 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:29 np0005539504 nova_compute[187152]: 2025-11-29 08:01:29.490 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:32 np0005539504 nova_compute[187152]: 2025-11-29 08:01:32.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:34 np0005539504 nova_compute[187152]: 2025-11-29 08:01:34.140 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:34 np0005539504 nova_compute[187152]: 2025-11-29 08:01:34.492 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:35 np0005539504 podman[253488]: 2025-11-29 08:01:35.71023074 +0000 UTC m=+0.054256016 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 03:01:39 np0005539504 nova_compute[187152]: 2025-11-29 08:01:39.142 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:39 np0005539504 nova_compute[187152]: 2025-11-29 08:01:39.493 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:39 np0005539504 nova_compute[187152]: 2025-11-29 08:01:39.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:39 np0005539504 nova_compute[187152]: 2025-11-29 08:01:39.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:01:40 np0005539504 podman[253508]: 2025-11-29 08:01:40.707962637 +0000 UTC m=+0.050980077 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 29 03:01:44 np0005539504 nova_compute[187152]: 2025-11-29 08:01:44.144 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:44 np0005539504 nova_compute[187152]: 2025-11-29 08:01:44.495 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:44 np0005539504 nova_compute[187152]: 2025-11-29 08:01:44.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:44 np0005539504 nova_compute[187152]: 2025-11-29 08:01:44.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:44 np0005539504 nova_compute[187152]: 2025-11-29 08:01:44.970 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:44 np0005539504 nova_compute[187152]: 2025-11-29 08:01:44.971 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:44 np0005539504 nova_compute[187152]: 2025-11-29 08:01:44.971 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:44 np0005539504 nova_compute[187152]: 2025-11-29 08:01:44.971 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.129 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.131 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5756MB free_disk=72.94150924682617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.131 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.131 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.619 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.620 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.726 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.876 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.876 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.918 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 03:01:45 np0005539504 nova_compute[187152]: 2025-11-29 08:01:45.977 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 03:01:46 np0005539504 nova_compute[187152]: 2025-11-29 08:01:46.154 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:01:46 np0005539504 nova_compute[187152]: 2025-11-29 08:01:46.192 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:01:46 np0005539504 nova_compute[187152]: 2025-11-29 08:01:46.194 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:01:46 np0005539504 nova_compute[187152]: 2025-11-29 08:01:46.194 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:49 np0005539504 nova_compute[187152]: 2025-11-29 08:01:49.147 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:49 np0005539504 nova_compute[187152]: 2025-11-29 08:01:49.195 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:49 np0005539504 nova_compute[187152]: 2025-11-29 08:01:49.195 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:01:49 np0005539504 nova_compute[187152]: 2025-11-29 08:01:49.195 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:01:49 np0005539504 nova_compute[187152]: 2025-11-29 08:01:49.270 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:01:49 np0005539504 nova_compute[187152]: 2025-11-29 08:01:49.270 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:49 np0005539504 nova_compute[187152]: 2025-11-29 08:01:49.496 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:51 np0005539504 nova_compute[187152]: 2025-11-29 08:01:51.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:52 np0005539504 podman[253531]: 2025-11-29 08:01:52.710539662 +0000 UTC m=+0.052673623 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:01:52 np0005539504 podman[253533]: 2025-11-29 08:01:52.712877625 +0000 UTC m=+0.050473184 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 03:01:52 np0005539504 podman[253532]: 2025-11-29 08:01:52.750506401 +0000 UTC m=+0.088086640 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 03:01:54 np0005539504 nova_compute[187152]: 2025-11-29 08:01:54.148 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:54 np0005539504 nova_compute[187152]: 2025-11-29 08:01:54.497 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:01:55.259 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 03:01:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:01:55.260 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 03:01:55 np0005539504 nova_compute[187152]: 2025-11-29 08:01:55.261 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:56 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:01:56.262 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:01:57 np0005539504 podman[253595]: 2025-11-29 08:01:57.704610163 +0000 UTC m=+0.051552483 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:01:57 np0005539504 podman[253596]: 2025-11-29 08:01:57.736912575 +0000 UTC m=+0.081070190 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 03:01:59 np0005539504 nova_compute[187152]: 2025-11-29 08:01:59.151 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:59 np0005539504 nova_compute[187152]: 2025-11-29 08:01:59.499 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:01 np0005539504 nova_compute[187152]: 2025-11-29 08:02:01.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:04 np0005539504 nova_compute[187152]: 2025-11-29 08:02:04.152 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:04 np0005539504 nova_compute[187152]: 2025-11-29 08:02:04.503 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:06 np0005539504 podman[253646]: 2025-11-29 08:02:06.702027404 +0000 UTC m=+0.046944399 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:02:09 np0005539504 nova_compute[187152]: 2025-11-29 08:02:09.154 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:09 np0005539504 nova_compute[187152]: 2025-11-29 08:02:09.504 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:11 np0005539504 podman[253666]: 2025-11-29 08:02:11.721594074 +0000 UTC m=+0.068200842 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Nov 29 03:02:14 np0005539504 nova_compute[187152]: 2025-11-29 08:02:14.156 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:14 np0005539504 nova_compute[187152]: 2025-11-29 08:02:14.505 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:19 np0005539504 nova_compute[187152]: 2025-11-29 08:02:19.202 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:19 np0005539504 nova_compute[187152]: 2025-11-29 08:02:19.507 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:02:23.502 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:02:23.502 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:02:23.502 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:23 np0005539504 podman[253690]: 2025-11-29 08:02:23.707269001 +0000 UTC m=+0.049244020 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 03:02:23 np0005539504 podman[253692]: 2025-11-29 08:02:23.714654841 +0000 UTC m=+0.049551799 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 03:02:23 np0005539504 podman[253691]: 2025-11-29 08:02:23.720495188 +0000 UTC m=+0.054530273 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64)
Nov 29 03:02:24 np0005539504 nova_compute[187152]: 2025-11-29 08:02:24.204 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:24 np0005539504 nova_compute[187152]: 2025-11-29 08:02:24.509 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:28 np0005539504 podman[253752]: 2025-11-29 08:02:28.746590722 +0000 UTC m=+0.088802008 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:02:28 np0005539504 podman[253753]: 2025-11-29 08:02:28.772452931 +0000 UTC m=+0.114668927 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 03:02:28 np0005539504 nova_compute[187152]: 2025-11-29 08:02:28.938 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:29 np0005539504 nova_compute[187152]: 2025-11-29 08:02:29.252 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:29 np0005539504 nova_compute[187152]: 2025-11-29 08:02:29.510 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:32 np0005539504 nova_compute[187152]: 2025-11-29 08:02:32.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:34 np0005539504 nova_compute[187152]: 2025-11-29 08:02:34.254 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:34 np0005539504 nova_compute[187152]: 2025-11-29 08:02:34.512 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:37 np0005539504 podman[253800]: 2025-11-29 08:02:37.721664869 +0000 UTC m=+0.058706646 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.256 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.514 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.937 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.938 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.940 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.940 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.940 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.941 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.980 187156 DEBUG nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.981 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.982 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.982 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.983 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.983 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.984 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.984 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.985 187156 WARNING nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Unknown base file: /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.985 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Removable base files: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3 /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925 /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0 /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3 /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.986 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.986 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.986 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f719940c4cc1c4e5536b1cad4ff1056de3fb8ad3#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.987 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3fec8ad6044bc82e00912202f90e71c49ea3f925#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.987 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/1f6bbea7c12e7fd60429d8192b4eff988ab580c0#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.988 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/fa0e20a5a8d2535a22db09211fcaa6a093d1698f#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.988 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ff8b78e36cdce7b25ad93cd697b9b0303aca57f3#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.988 187156 INFO nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/4f0e6b2b01b060a9e6e0d9fbdbdaa554f51ee758#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.989 187156 DEBUG nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.989 187156 DEBUG nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Nov 29 03:02:39 np0005539504 nova_compute[187152]: 2025-11-29 08:02:39.990 187156 DEBUG nova.virt.libvirt.imagecache [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Nov 29 03:02:40 np0005539504 nova_compute[187152]: 2025-11-29 08:02:40.991 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:40 np0005539504 nova_compute[187152]: 2025-11-29 08:02:40.992 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:02:42 np0005539504 podman[253822]: 2025-11-29 08:02:42.702989195 +0000 UTC m=+0.052090957 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:02:44 np0005539504 nova_compute[187152]: 2025-11-29 08:02:44.256 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:44 np0005539504 nova_compute[187152]: 2025-11-29 08:02:44.515 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:46 np0005539504 nova_compute[187152]: 2025-11-29 08:02:46.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:46 np0005539504 nova_compute[187152]: 2025-11-29 08:02:46.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:46 np0005539504 nova_compute[187152]: 2025-11-29 08:02:46.963 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:46 np0005539504 nova_compute[187152]: 2025-11-29 08:02:46.964 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:46 np0005539504 nova_compute[187152]: 2025-11-29 08:02:46.964 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:46 np0005539504 nova_compute[187152]: 2025-11-29 08:02:46.964 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.100 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.101 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5758MB free_disk=72.940673828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.101 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.101 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.160 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.161 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.179 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.199 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.200 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:02:47 np0005539504 nova_compute[187152]: 2025-11-29 08:02:47.200 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:02:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:02:49 np0005539504 nova_compute[187152]: 2025-11-29 08:02:49.201 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:49 np0005539504 nova_compute[187152]: 2025-11-29 08:02:49.257 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:49 np0005539504 nova_compute[187152]: 2025-11-29 08:02:49.516 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:49 np0005539504 nova_compute[187152]: 2025-11-29 08:02:49.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:49 np0005539504 nova_compute[187152]: 2025-11-29 08:02:49.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:02:49 np0005539504 nova_compute[187152]: 2025-11-29 08:02:49.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:02:49 np0005539504 nova_compute[187152]: 2025-11-29 08:02:49.955 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:02:51 np0005539504 nova_compute[187152]: 2025-11-29 08:02:51.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:54 np0005539504 nova_compute[187152]: 2025-11-29 08:02:54.259 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:54 np0005539504 nova_compute[187152]: 2025-11-29 08:02:54.517 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:54 np0005539504 podman[253843]: 2025-11-29 08:02:54.737392 +0000 UTC m=+0.078156681 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, 
io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64)
Nov 29 03:02:54 np0005539504 podman[253844]: 2025-11-29 08:02:54.756545978 +0000 UTC m=+0.081014589 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 29 03:02:54 np0005539504 podman[253842]: 2025-11-29 08:02:54.768653084 +0000 UTC m=+0.102294843 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:02:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:02:55.772 104164 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 03:02:55 np0005539504 nova_compute[187152]: 2025-11-29 08:02:55.773 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:55 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:02:55.773 104164 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 03:02:59 np0005539504 nova_compute[187152]: 2025-11-29 08:02:59.261 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:59 np0005539504 nova_compute[187152]: 2025-11-29 08:02:59.519 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:59 np0005539504 podman[253905]: 2025-11-29 08:02:59.704328586 +0000 UTC m=+0.052065486 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:02:59 np0005539504 podman[253906]: 2025-11-29 08:02:59.768081768 +0000 UTC m=+0.109795635 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 03:03:02 np0005539504 nova_compute[187152]: 2025-11-29 08:03:02.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:04 np0005539504 nova_compute[187152]: 2025-11-29 08:03:04.263 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:04 np0005539504 nova_compute[187152]: 2025-11-29 08:03:04.520 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:04 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:03:04.775 104164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a43628b3-9efd-4940-9509-686038e16aeb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:03:08 np0005539504 podman[253956]: 2025-11-29 08:03:08.408178664 +0000 UTC m=+0.050960947 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 03:03:09 np0005539504 nova_compute[187152]: 2025-11-29 08:03:09.265 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:09 np0005539504 nova_compute[187152]: 2025-11-29 08:03:09.521 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:13 np0005539504 podman[253978]: 2025-11-29 08:03:13.706351553 +0000 UTC m=+0.051399138 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 03:03:13 np0005539504 nova_compute[187152]: 2025-11-29 08:03:13.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:14 np0005539504 nova_compute[187152]: 2025-11-29 08:03:14.069 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:14 np0005539504 nova_compute[187152]: 2025-11-29 08:03:14.265 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:14 np0005539504 nova_compute[187152]: 2025-11-29 08:03:14.523 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:19 np0005539504 nova_compute[187152]: 2025-11-29 08:03:19.306 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:19 np0005539504 nova_compute[187152]: 2025-11-29 08:03:19.525 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:22 np0005539504 nova_compute[187152]: 2025-11-29 08:03:22.725 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:22 np0005539504 nova_compute[187152]: 2025-11-29 08:03:22.726 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 03:03:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:03:23.503 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:03:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:03:23.504 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:03:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:03:23.504 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:03:24 np0005539504 nova_compute[187152]: 2025-11-29 08:03:24.308 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:24 np0005539504 nova_compute[187152]: 2025-11-29 08:03:24.526 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:25 np0005539504 podman[254000]: 2025-11-29 08:03:25.705078015 +0000 UTC m=+0.051445700 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 03:03:25 np0005539504 podman[254002]: 2025-11-29 08:03:25.719537575 +0000 UTC m=+0.059419256 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:03:25 np0005539504 podman[254001]: 2025-11-29 08:03:25.719674369 +0000 UTC m=+0.062654503 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350)
Nov 29 03:03:29 np0005539504 nova_compute[187152]: 2025-11-29 08:03:29.309 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:29 np0005539504 nova_compute[187152]: 2025-11-29 08:03:29.527 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:29 np0005539504 podman[254062]: 2025-11-29 08:03:29.782133656 +0000 UTC m=+0.049773394 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:03:29 np0005539504 podman[254086]: 2025-11-29 08:03:29.890125072 +0000 UTC m=+0.085009116 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:03:31 np0005539504 nova_compute[187152]: 2025-11-29 08:03:31.063 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:33 np0005539504 nova_compute[187152]: 2025-11-29 08:03:33.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:34 np0005539504 nova_compute[187152]: 2025-11-29 08:03:34.310 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:34 np0005539504 nova_compute[187152]: 2025-11-29 08:03:34.529 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:38 np0005539504 podman[254113]: 2025-11-29 08:03:38.708270635 +0000 UTC m=+0.055145831 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 29 03:03:38 np0005539504 nova_compute[187152]: 2025-11-29 08:03:38.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:38 np0005539504 nova_compute[187152]: 2025-11-29 08:03:38.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 03:03:38 np0005539504 nova_compute[187152]: 2025-11-29 08:03:38.963 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 03:03:39 np0005539504 nova_compute[187152]: 2025-11-29 08:03:39.313 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:39 np0005539504 nova_compute[187152]: 2025-11-29 08:03:39.531 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:41 np0005539504 nova_compute[187152]: 2025-11-29 08:03:41.964 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:41 np0005539504 nova_compute[187152]: 2025-11-29 08:03:41.965 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:03:44 np0005539504 nova_compute[187152]: 2025-11-29 08:03:44.313 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:44 np0005539504 nova_compute[187152]: 2025-11-29 08:03:44.532 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:44 np0005539504 podman[254133]: 2025-11-29 08:03:44.706430863 +0000 UTC m=+0.050837104 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 03:03:46 np0005539504 nova_compute[187152]: 2025-11-29 08:03:46.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:46 np0005539504 nova_compute[187152]: 2025-11-29 08:03:46.968 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:03:46 np0005539504 nova_compute[187152]: 2025-11-29 08:03:46.968 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:03:46 np0005539504 nova_compute[187152]: 2025-11-29 08:03:46.969 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:03:46 np0005539504 nova_compute[187152]: 2025-11-29 08:03:46.969 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.116 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.117 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5753MB free_disk=72.9404411315918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.117 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.118 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.195 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.195 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.216 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.229 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.230 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:03:47 np0005539504 nova_compute[187152]: 2025-11-29 08:03:47.230 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:03:48 np0005539504 nova_compute[187152]: 2025-11-29 08:03:48.226 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:48 np0005539504 nova_compute[187152]: 2025-11-29 08:03:48.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:49 np0005539504 nova_compute[187152]: 2025-11-29 08:03:49.316 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:49 np0005539504 nova_compute[187152]: 2025-11-29 08:03:49.533 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:51 np0005539504 nova_compute[187152]: 2025-11-29 08:03:51.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:51 np0005539504 nova_compute[187152]: 2025-11-29 08:03:51.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:03:51 np0005539504 nova_compute[187152]: 2025-11-29 08:03:51.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:03:51 np0005539504 nova_compute[187152]: 2025-11-29 08:03:51.951 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:03:52 np0005539504 nova_compute[187152]: 2025-11-29 08:03:52.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:54 np0005539504 nova_compute[187152]: 2025-11-29 08:03:54.318 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:54 np0005539504 nova_compute[187152]: 2025-11-29 08:03:54.534 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:56 np0005539504 podman[254153]: 2025-11-29 08:03:56.700858448 +0000 UTC m=+0.043902136 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:03:56 np0005539504 podman[254155]: 2025-11-29 08:03:56.709912743 +0000 UTC m=+0.046307682 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 03:03:56 np0005539504 podman[254154]: 2025-11-29 08:03:56.725200035 +0000 UTC m=+0.064386310 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Nov 29 03:03:59 np0005539504 nova_compute[187152]: 2025-11-29 08:03:59.348 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:59 np0005539504 nova_compute[187152]: 2025-11-29 08:03:59.535 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:00 np0005539504 podman[254215]: 2025-11-29 08:04:00.703120282 +0000 UTC m=+0.047727721 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 03:04:00 np0005539504 podman[254216]: 2025-11-29 08:04:00.743628574 +0000 UTC m=+0.085917940 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 03:04:04 np0005539504 nova_compute[187152]: 2025-11-29 08:04:04.350 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:04 np0005539504 nova_compute[187152]: 2025-11-29 08:04:04.537 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:04 np0005539504 nova_compute[187152]: 2025-11-29 08:04:04.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:09 np0005539504 nova_compute[187152]: 2025-11-29 08:04:09.351 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:09 np0005539504 nova_compute[187152]: 2025-11-29 08:04:09.538 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:09 np0005539504 podman[254265]: 2025-11-29 08:04:09.712228739 +0000 UTC m=+0.051971264 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 03:04:14 np0005539504 nova_compute[187152]: 2025-11-29 08:04:14.353 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:14 np0005539504 nova_compute[187152]: 2025-11-29 08:04:14.539 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:15 np0005539504 podman[254285]: 2025-11-29 08:04:15.707445537 +0000 UTC m=+0.050356310 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 29 03:04:17 np0005539504 nova_compute[187152]: 2025-11-29 08:04:17.053 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:19 np0005539504 nova_compute[187152]: 2025-11-29 08:04:19.354 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:19 np0005539504 nova_compute[187152]: 2025-11-29 08:04:19.542 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:04:23.504 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:04:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:04:23.505 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:04:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:04:23.505 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:04:24 np0005539504 nova_compute[187152]: 2025-11-29 08:04:24.355 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:24 np0005539504 nova_compute[187152]: 2025-11-29 08:04:24.543 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:27 np0005539504 podman[254306]: 2025-11-29 08:04:27.709255862 +0000 UTC m=+0.057217826 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:04:27 np0005539504 podman[254308]: 2025-11-29 08:04:27.718265235 +0000 UTC m=+0.058686045 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 03:04:27 np0005539504 podman[254307]: 2025-11-29 08:04:27.724177374 +0000 UTC m=+0.066600238 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 03:04:29 np0005539504 nova_compute[187152]: 2025-11-29 08:04:29.357 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:29 np0005539504 nova_compute[187152]: 2025-11-29 08:04:29.544 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:30 np0005539504 nova_compute[187152]: 2025-11-29 08:04:30.955 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:31 np0005539504 podman[254367]: 2025-11-29 08:04:31.734534257 +0000 UTC m=+0.078090949 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 03:04:31 np0005539504 podman[254368]: 2025-11-29 08:04:31.751278669 +0000 UTC m=+0.092054326 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 29 03:04:34 np0005539504 nova_compute[187152]: 2025-11-29 08:04:34.391 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:34 np0005539504 nova_compute[187152]: 2025-11-29 08:04:34.546 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:35 np0005539504 nova_compute[187152]: 2025-11-29 08:04:35.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:39 np0005539504 nova_compute[187152]: 2025-11-29 08:04:39.392 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:39 np0005539504 nova_compute[187152]: 2025-11-29 08:04:39.547 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:40 np0005539504 podman[254417]: 2025-11-29 08:04:40.714322274 +0000 UTC m=+0.060358201 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 03:04:43 np0005539504 nova_compute[187152]: 2025-11-29 08:04:43.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:43 np0005539504 nova_compute[187152]: 2025-11-29 08:04:43.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:04:44 np0005539504 nova_compute[187152]: 2025-11-29 08:04:44.393 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:44 np0005539504 nova_compute[187152]: 2025-11-29 08:04:44.548 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:46 np0005539504 podman[254442]: 2025-11-29 08:04:46.558595126 +0000 UTC m=+0.053024452 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 03:04:47 np0005539504 nova_compute[187152]: 2025-11-29 08:04:47.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:04:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539504 nova_compute[187152]: 2025-11-29 08:04:48.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.085 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.086 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.086 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.086 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.249 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.250 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5748MB free_disk=72.9404411315918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.250 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.250 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.396 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.549 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.758 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.758 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.776 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.798 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.799 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:04:49 np0005539504 nova_compute[187152]: 2025-11-29 08:04:49.800 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.550s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:04:50 np0005539504 nova_compute[187152]: 2025-11-29 08:04:50.800 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:53 np0005539504 nova_compute[187152]: 2025-11-29 08:04:53.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:53 np0005539504 nova_compute[187152]: 2025-11-29 08:04:53.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:04:53 np0005539504 nova_compute[187152]: 2025-11-29 08:04:53.938 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:04:54 np0005539504 nova_compute[187152]: 2025-11-29 08:04:54.161 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:04:54 np0005539504 nova_compute[187152]: 2025-11-29 08:04:54.400 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:54 np0005539504 nova_compute[187152]: 2025-11-29 08:04:54.551 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:54 np0005539504 nova_compute[187152]: 2025-11-29 08:04:54.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:58 np0005539504 podman[254465]: 2025-11-29 08:04:58.721329726 +0000 UTC m=+0.068411108 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 03:04:58 np0005539504 podman[254466]: 2025-11-29 08:04:58.727589314 +0000 UTC m=+0.065016635 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 29 03:04:58 np0005539504 podman[254467]: 2025-11-29 08:04:58.72779122 +0000 UTC m=+0.062186920 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 03:04:59 np0005539504 nova_compute[187152]: 2025-11-29 08:04:59.401 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:59 np0005539504 nova_compute[187152]: 2025-11-29 08:04:59.552 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:02 np0005539504 podman[254528]: 2025-11-29 08:05:02.709435087 +0000 UTC m=+0.050277239 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:05:02 np0005539504 podman[254529]: 2025-11-29 08:05:02.737669749 +0000 UTC m=+0.076028294 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 03:05:04 np0005539504 nova_compute[187152]: 2025-11-29 08:05:04.403 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:04 np0005539504 nova_compute[187152]: 2025-11-29 08:05:04.576 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:05 np0005539504 nova_compute[187152]: 2025-11-29 08:05:05.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:09 np0005539504 nova_compute[187152]: 2025-11-29 08:05:09.404 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:09 np0005539504 nova_compute[187152]: 2025-11-29 08:05:09.577 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:11 np0005539504 podman[254577]: 2025-11-29 08:05:11.700495478 +0000 UTC m=+0.048939012 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 03:05:14 np0005539504 nova_compute[187152]: 2025-11-29 08:05:14.406 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:14 np0005539504 nova_compute[187152]: 2025-11-29 08:05:14.606 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:16 np0005539504 podman[254597]: 2025-11-29 08:05:16.705293268 +0000 UTC m=+0.054619556 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 03:05:16 np0005539504 nova_compute[187152]: 2025-11-29 08:05:16.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:19 np0005539504 nova_compute[187152]: 2025-11-29 08:05:19.417 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:19 np0005539504 nova_compute[187152]: 2025-11-29 08:05:19.607 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:05:23.513 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:05:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:05:23.513 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:05:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:05:23.513 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:05:24 np0005539504 nova_compute[187152]: 2025-11-29 08:05:24.419 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:24 np0005539504 nova_compute[187152]: 2025-11-29 08:05:24.609 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:29 np0005539504 nova_compute[187152]: 2025-11-29 08:05:29.422 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:29 np0005539504 nova_compute[187152]: 2025-11-29 08:05:29.611 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:29 np0005539504 podman[254621]: 2025-11-29 08:05:29.711786226 +0000 UTC m=+0.055248023 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=)
Nov 29 03:05:29 np0005539504 podman[254620]: 2025-11-29 08:05:29.734357425 +0000 UTC m=+0.081062349 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:05:29 np0005539504 podman[254622]: 2025-11-29 08:05:29.735278699 +0000 UTC m=+0.076088994 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 03:05:32 np0005539504 nova_compute[187152]: 2025-11-29 08:05:32.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:33 np0005539504 podman[254678]: 2025-11-29 08:05:33.702222519 +0000 UTC m=+0.050266098 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 03:05:33 np0005539504 podman[254679]: 2025-11-29 08:05:33.735293421 +0000 UTC m=+0.077629356 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 03:05:34 np0005539504 nova_compute[187152]: 2025-11-29 08:05:34.425 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:34 np0005539504 nova_compute[187152]: 2025-11-29 08:05:34.612 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:37 np0005539504 nova_compute[187152]: 2025-11-29 08:05:37.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:39 np0005539504 nova_compute[187152]: 2025-11-29 08:05:39.427 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:39 np0005539504 nova_compute[187152]: 2025-11-29 08:05:39.615 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:42 np0005539504 podman[254725]: 2025-11-29 08:05:42.705200861 +0000 UTC m=+0.054688996 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:05:44 np0005539504 nova_compute[187152]: 2025-11-29 08:05:44.428 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:44 np0005539504 nova_compute[187152]: 2025-11-29 08:05:44.616 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:45 np0005539504 nova_compute[187152]: 2025-11-29 08:05:45.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:45 np0005539504 nova_compute[187152]: 2025-11-29 08:05:45.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:05:47 np0005539504 podman[254745]: 2025-11-29 08:05:47.733269209 +0000 UTC m=+0.077125033 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:05:47 np0005539504 nova_compute[187152]: 2025-11-29 08:05:47.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:49 np0005539504 nova_compute[187152]: 2025-11-29 08:05:49.429 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:49 np0005539504 nova_compute[187152]: 2025-11-29 08:05:49.617 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:49 np0005539504 nova_compute[187152]: 2025-11-29 08:05:49.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:50 np0005539504 nova_compute[187152]: 2025-11-29 08:05:50.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:50 np0005539504 nova_compute[187152]: 2025-11-29 08:05:50.987 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:05:50 np0005539504 nova_compute[187152]: 2025-11-29 08:05:50.987 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:05:50 np0005539504 nova_compute[187152]: 2025-11-29 08:05:50.987 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:05:50 np0005539504 nova_compute[187152]: 2025-11-29 08:05:50.988 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.118 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.119 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5754MB free_disk=72.9404411315918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.119 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.120 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.375 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.376 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.405 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.455 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.457 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:05:51 np0005539504 nova_compute[187152]: 2025-11-29 08:05:51.457 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:05:54 np0005539504 nova_compute[187152]: 2025-11-29 08:05:54.431 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:54 np0005539504 nova_compute[187152]: 2025-11-29 08:05:54.619 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:55 np0005539504 nova_compute[187152]: 2025-11-29 08:05:55.457 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:55 np0005539504 nova_compute[187152]: 2025-11-29 08:05:55.457 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:05:55 np0005539504 nova_compute[187152]: 2025-11-29 08:05:55.458 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:05:55 np0005539504 nova_compute[187152]: 2025-11-29 08:05:55.473 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:05:56 np0005539504 nova_compute[187152]: 2025-11-29 08:05:56.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:59 np0005539504 nova_compute[187152]: 2025-11-29 08:05:59.433 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:59 np0005539504 nova_compute[187152]: 2025-11-29 08:05:59.619 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:00 np0005539504 podman[254768]: 2025-11-29 08:06:00.709261304 +0000 UTC m=+0.052232911 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 03:06:00 np0005539504 podman[254770]: 2025-11-29 08:06:00.741954347 +0000 UTC m=+0.077285618 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:06:00 np0005539504 podman[254769]: 2025-11-29 08:06:00.742426949 +0000 UTC m=+0.080993327 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 03:06:04 np0005539504 nova_compute[187152]: 2025-11-29 08:06:04.434 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:04 np0005539504 nova_compute[187152]: 2025-11-29 08:06:04.622 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:04 np0005539504 podman[254830]: 2025-11-29 08:06:04.709246037 +0000 UTC m=+0.057244327 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:06:04 np0005539504 podman[254831]: 2025-11-29 08:06:04.742650459 +0000 UTC m=+0.085516810 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 29 03:06:05 np0005539504 nova_compute[187152]: 2025-11-29 08:06:05.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:09 np0005539504 nova_compute[187152]: 2025-11-29 08:06:09.436 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:09 np0005539504 nova_compute[187152]: 2025-11-29 08:06:09.623 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:13 np0005539504 podman[254879]: 2025-11-29 08:06:13.739610342 +0000 UTC m=+0.078460419 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:06:14 np0005539504 nova_compute[187152]: 2025-11-29 08:06:14.437 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:14 np0005539504 nova_compute[187152]: 2025-11-29 08:06:14.625 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:18 np0005539504 podman[254900]: 2025-11-29 08:06:18.748080961 +0000 UTC m=+0.069274711 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 29 03:06:19 np0005539504 nova_compute[187152]: 2025-11-29 08:06:19.439 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:19 np0005539504 nova_compute[187152]: 2025-11-29 08:06:19.626 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:06:23.515 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:06:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:06:23.515 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:06:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:06:23.515 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:06:24 np0005539504 nova_compute[187152]: 2025-11-29 08:06:24.443 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:24 np0005539504 nova_compute[187152]: 2025-11-29 08:06:24.627 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:29 np0005539504 nova_compute[187152]: 2025-11-29 08:06:29.448 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:29 np0005539504 nova_compute[187152]: 2025-11-29 08:06:29.627 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:31 np0005539504 podman[254920]: 2025-11-29 08:06:31.705881997 +0000 UTC m=+0.051103948 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:06:31 np0005539504 podman[254922]: 2025-11-29 08:06:31.706067111 +0000 UTC m=+0.044659172 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 03:06:31 np0005539504 podman[254921]: 2025-11-29 08:06:31.708672052 +0000 UTC m=+0.049972426 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, 
io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 03:06:34 np0005539504 nova_compute[187152]: 2025-11-29 08:06:34.454 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:34 np0005539504 nova_compute[187152]: 2025-11-29 08:06:34.629 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:34 np0005539504 nova_compute[187152]: 2025-11-29 08:06:34.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:35 np0005539504 podman[254979]: 2025-11-29 08:06:35.714091358 +0000 UTC m=+0.058918829 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:06:35 np0005539504 podman[254980]: 2025-11-29 08:06:35.767308082 +0000 UTC m=+0.096817498 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:06:37 np0005539504 nova_compute[187152]: 2025-11-29 08:06:37.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:39 np0005539504 nova_compute[187152]: 2025-11-29 08:06:39.456 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:39 np0005539504 nova_compute[187152]: 2025-11-29 08:06:39.631 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:44 np0005539504 nova_compute[187152]: 2025-11-29 08:06:44.459 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:44 np0005539504 nova_compute[187152]: 2025-11-29 08:06:44.632 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:44 np0005539504 podman[255029]: 2025-11-29 08:06:44.735999142 +0000 UTC m=+0.082314294 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 03:06:46 np0005539504 nova_compute[187152]: 2025-11-29 08:06:46.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:46 np0005539504 nova_compute[187152]: 2025-11-29 08:06:46.937 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:06:47 np0005539504 nova_compute[187152]: 2025-11-29 08:06:47.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:47 np0005539504 ceilometer_agent_compute[197907]: 2025-11-29 08:06:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:49 np0005539504 nova_compute[187152]: 2025-11-29 08:06:49.488 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:49 np0005539504 nova_compute[187152]: 2025-11-29 08:06:49.634 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:49 np0005539504 podman[255049]: 2025-11-29 08:06:49.74703194 +0000 UTC m=+0.082330024 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 03:06:50 np0005539504 nova_compute[187152]: 2025-11-29 08:06:50.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:52 np0005539504 nova_compute[187152]: 2025-11-29 08:06:52.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:52 np0005539504 nova_compute[187152]: 2025-11-29 08:06:52.964 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:06:52 np0005539504 nova_compute[187152]: 2025-11-29 08:06:52.964 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:06:52 np0005539504 nova_compute[187152]: 2025-11-29 08:06:52.965 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:06:52 np0005539504 nova_compute[187152]: 2025-11-29 08:06:52.965 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.101 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.102 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5759MB free_disk=72.94033813476562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.102 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.102 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.262 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.263 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.418 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing inventories for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.569 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating ProviderTree inventory for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.569 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Updating inventory in ProviderTree for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.597 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing aggregate associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.637 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Refreshing trait associations for resource provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.659 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.678 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.679 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:06:53 np0005539504 nova_compute[187152]: 2025-11-29 08:06:53.679 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:06:54 np0005539504 nova_compute[187152]: 2025-11-29 08:06:54.490 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:54 np0005539504 nova_compute[187152]: 2025-11-29 08:06:54.636 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:57 np0005539504 nova_compute[187152]: 2025-11-29 08:06:57.680 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:57 np0005539504 nova_compute[187152]: 2025-11-29 08:06:57.680 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:06:57 np0005539504 nova_compute[187152]: 2025-11-29 08:06:57.681 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:06:57 np0005539504 nova_compute[187152]: 2025-11-29 08:06:57.988 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:06:58 np0005539504 nova_compute[187152]: 2025-11-29 08:06:58.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:59 np0005539504 nova_compute[187152]: 2025-11-29 08:06:59.491 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:59 np0005539504 nova_compute[187152]: 2025-11-29 08:06:59.636 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:02 np0005539504 podman[255070]: 2025-11-29 08:07:02.702539012 +0000 UTC m=+0.051256931 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:07:02 np0005539504 podman[255071]: 2025-11-29 08:07:02.720254292 +0000 UTC m=+0.061123318 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 29 03:07:02 np0005539504 podman[255072]: 2025-11-29 08:07:02.741388985 +0000 UTC m=+0.082092146 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 03:07:04 np0005539504 nova_compute[187152]: 2025-11-29 08:07:04.494 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:04 np0005539504 nova_compute[187152]: 2025-11-29 08:07:04.637 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:06 np0005539504 podman[255133]: 2025-11-29 08:07:06.736358 +0000 UTC m=+0.079018745 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:07:06 np0005539504 podman[255134]: 2025-11-29 08:07:06.746226207 +0000 UTC m=+0.087686989 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:07:06 np0005539504 nova_compute[187152]: 2025-11-29 08:07:06.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:09 np0005539504 nova_compute[187152]: 2025-11-29 08:07:09.496 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:09 np0005539504 nova_compute[187152]: 2025-11-29 08:07:09.639 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:14 np0005539504 nova_compute[187152]: 2025-11-29 08:07:14.499 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:14 np0005539504 nova_compute[187152]: 2025-11-29 08:07:14.642 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:15 np0005539504 podman[255181]: 2025-11-29 08:07:15.700114946 +0000 UTC m=+0.048660171 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 03:07:19 np0005539504 nova_compute[187152]: 2025-11-29 08:07:19.500 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:19 np0005539504 nova_compute[187152]: 2025-11-29 08:07:19.643 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:20 np0005539504 podman[255203]: 2025-11-29 08:07:20.707582318 +0000 UTC m=+0.055091315 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 03:07:21 np0005539504 nova_compute[187152]: 2025-11-29 08:07:21.932 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:07:23.516 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:07:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:07:23.517 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:07:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:07:23.517 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:07:24 np0005539504 nova_compute[187152]: 2025-11-29 08:07:24.502 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:24 np0005539504 nova_compute[187152]: 2025-11-29 08:07:24.644 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:29 np0005539504 nova_compute[187152]: 2025-11-29 08:07:29.505 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:29 np0005539504 nova_compute[187152]: 2025-11-29 08:07:29.646 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:33 np0005539504 podman[255225]: 2025-11-29 08:07:33.700121016 +0000 UTC m=+0.046829802 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:07:33 np0005539504 podman[255226]: 2025-11-29 08:07:33.705706047 +0000 UTC m=+0.049725700 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 03:07:33 np0005539504 podman[255227]: 2025-11-29 08:07:33.705739248 +0000 UTC m=+0.045761882 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 03:07:34 np0005539504 nova_compute[187152]: 2025-11-29 08:07:34.507 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:34 np0005539504 nova_compute[187152]: 2025-11-29 08:07:34.647 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:36 np0005539504 nova_compute[187152]: 2025-11-29 08:07:36.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:37 np0005539504 podman[255287]: 2025-11-29 08:07:37.697176684 +0000 UTC m=+0.045336710 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 03:07:37 np0005539504 podman[255288]: 2025-11-29 08:07:37.731320261 +0000 UTC m=+0.077205925 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 03:07:37 np0005539504 nova_compute[187152]: 2025-11-29 08:07:37.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:39 np0005539504 nova_compute[187152]: 2025-11-29 08:07:39.510 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:39 np0005539504 nova_compute[187152]: 2025-11-29 08:07:39.648 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:44 np0005539504 nova_compute[187152]: 2025-11-29 08:07:44.511 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:44 np0005539504 nova_compute[187152]: 2025-11-29 08:07:44.648 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:46 np0005539504 podman[255337]: 2025-11-29 08:07:46.70215583 +0000 UTC m=+0.050673115 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:07:47 np0005539504 nova_compute[187152]: 2025-11-29 08:07:47.931 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:47 np0005539504 nova_compute[187152]: 2025-11-29 08:07:47.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:47 np0005539504 nova_compute[187152]: 2025-11-29 08:07:47.936 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:07:49 np0005539504 nova_compute[187152]: 2025-11-29 08:07:49.515 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:49 np0005539504 nova_compute[187152]: 2025-11-29 08:07:49.650 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:51 np0005539504 podman[255359]: 2025-11-29 08:07:51.708159793 +0000 UTC m=+0.055272429 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 03:07:52 np0005539504 nova_compute[187152]: 2025-11-29 08:07:52.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:52 np0005539504 nova_compute[187152]: 2025-11-29 08:07:52.937 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:54 np0005539504 nova_compute[187152]: 2025-11-29 08:07:54.515 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:54 np0005539504 nova_compute[187152]: 2025-11-29 08:07:54.652 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.210 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.211 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.211 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.211 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.372 187156 WARNING nova.virt.libvirt.driver [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.373 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5752MB free_disk=72.94124221801758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.373 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.373 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.454 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.454 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.486 187156 DEBUG nova.compute.provider_tree [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed in ProviderTree for provider: 1c526389-06f6-4ffd-8e90-a84c6c39f0bc update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.523 187156 DEBUG nova.scheduler.client.report [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Inventory has not changed for provider 1c526389-06f6-4ffd-8e90-a84c6c39f0bc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.525 187156 DEBUG nova.compute.resource_tracker [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:07:57 np0005539504 nova_compute[187152]: 2025-11-29 08:07:57.525 187156 DEBUG oslo_concurrency.lockutils [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:07:59 np0005539504 nova_compute[187152]: 2025-11-29 08:07:59.550 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:59 np0005539504 nova_compute[187152]: 2025-11-29 08:07:59.654 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:01 np0005539504 nova_compute[187152]: 2025-11-29 08:08:01.526 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:01 np0005539504 nova_compute[187152]: 2025-11-29 08:08:01.527 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:08:01 np0005539504 nova_compute[187152]: 2025-11-29 08:08:01.527 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:08:03 np0005539504 nova_compute[187152]: 2025-11-29 08:08:03.642 187156 DEBUG nova.compute.manager [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:08:03 np0005539504 nova_compute[187152]: 2025-11-29 08:08:03.643 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:04 np0005539504 nova_compute[187152]: 2025-11-29 08:08:04.554 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:04 np0005539504 nova_compute[187152]: 2025-11-29 08:08:04.655 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:04 np0005539504 podman[255386]: 2025-11-29 08:08:04.702274383 +0000 UTC m=+0.042007480 container health_status d089ee2ee80ce2fb4c7686cbaad36589937b8d0d970051149bcb12ba2dbc35e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 03:08:04 np0005539504 podman[255385]: 2025-11-29 08:08:04.710258109 +0000 UTC m=+0.053929203 container health_status 892c8ccc6f359e9c9a57c1c4ce360d048f47fa0c96e796a78d419e213bb3cac4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64)
Nov 29 03:08:04 np0005539504 podman[255384]: 2025-11-29 08:08:04.721808033 +0000 UTC m=+0.068945892 container health_status 060f92082290e5d16250aa2f6a8bd760c0c788e568f72f989ce00a3c6cb41e2c (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:08:08 np0005539504 podman[255446]: 2025-11-29 08:08:08.716795596 +0000 UTC m=+0.060558003 container health_status 1bd24b9836b3e7dc8d2d1b07288620a4312044221eca7cd04fc489190cac516f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 03:08:08 np0005539504 podman[255447]: 2025-11-29 08:08:08.736543332 +0000 UTC m=+0.081492801 container health_status 35a64c7d22a49cb8460e3c6714b237b55945d0fa0caa2b1d2f5d72c9aa64d927 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 03:08:08 np0005539504 nova_compute[187152]: 2025-11-29 08:08:08.936 187156 DEBUG oslo_service.periodic_task [None req-0815d9f7-4708-4f89-8ac8-2bd65df2c04a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:09 np0005539504 nova_compute[187152]: 2025-11-29 08:08:09.554 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:09 np0005539504 nova_compute[187152]: 2025-11-29 08:08:09.655 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:12 np0005539504 systemd-logind[783]: New session 66 of user zuul.
Nov 29 03:08:12 np0005539504 systemd[1]: Started Session 66 of User zuul.
Nov 29 03:08:14 np0005539504 nova_compute[187152]: 2025-11-29 08:08:14.586 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:14 np0005539504 nova_compute[187152]: 2025-11-29 08:08:14.656 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:17 np0005539504 ovs-vsctl[255671]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 29 03:08:17 np0005539504 podman[255668]: 2025-11-29 08:08:17.721189285 +0000 UTC m=+0.058702273 container health_status 7ce28180c360137fda4c8671c2adb7b9eb20d3d1c29f75d26ab49886f82458b5 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 03:08:18 np0005539504 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 255524 (sos)
Nov 29 03:08:18 np0005539504 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 29 03:08:18 np0005539504 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 29 03:08:18 np0005539504 virtqemud[186569]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 29 03:08:18 np0005539504 virtqemud[186569]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 29 03:08:18 np0005539504 virtqemud[186569]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 29 03:08:19 np0005539504 nova_compute[187152]: 2025-11-29 08:08:19.617 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:19 np0005539504 nova_compute[187152]: 2025-11-29 08:08:19.659 187156 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:22 np0005539504 systemd[1]: Starting Hostname Service...
Nov 29 03:08:22 np0005539504 podman[256217]: 2025-11-29 08:08:22.080060367 +0000 UTC m=+0.074130261 container health_status 494cdfb758f900694726f1fb7402d093fd2bcee71d4fc432925fb1b63e0a1f7d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:08:22 np0005539504 systemd[1]: Started Hostname Service.
Nov 29 03:08:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:08:23.518 104164 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:08:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:08:23.519 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:08:23 np0005539504 ovn_metadata_agent[104159]: 2025-11-29 08:08:23.519 104164 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
